[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
24134 1727096395.18558: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-And
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
24134 1727096395.19032: Added group all to inventory
24134 1727096395.19034: Added group ungrouped to inventory
24134 1727096395.19039: Group all now contains ungrouped
24134 1727096395.19042: Examining possible inventory source: /tmp/network-EuO/inventory.yml
24134 1727096395.36944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
24134 1727096395.37002: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
24134 1727096395.37024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
24134 1727096395.37085: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
24134 1727096395.37158: Loaded config def from plugin (inventory/script)
24134 1727096395.37160: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
24134 1727096395.37199: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
24134 1727096395.37285: Loaded config def from plugin (inventory/yaml)
24134 1727096395.37288: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
24134 1727096395.37370: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
24134 1727096395.37827: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
24134 1727096395.37831: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
24134 1727096395.37834: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
24134 1727096395.37839: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
24134 1727096395.37845: Loading data from /tmp/network-EuO/inventory.yml
24134 1727096395.37921: /tmp/network-EuO/inventory.yml was not parsable by auto
24134 1727096395.37986: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
24134 1727096395.38031: Loading data from /tmp/network-EuO/inventory.yml
24134 1727096395.38135: group all already in inventory
24134 1727096395.38143: set inventory_file for managed_node1
24134 1727096395.38146: set inventory_dir for managed_node1
24134 1727096395.38147: Added host managed_node1 to inventory
24134 1727096395.38150: Added host managed_node1 to group all
24134 1727096395.38151: set ansible_host for managed_node1
24134 1727096395.38152: set ansible_ssh_extra_args for managed_node1
24134 1727096395.38155: set inventory_file for managed_node2
24134 1727096395.38158: set inventory_dir for managed_node2
24134 1727096395.38158: Added host managed_node2 to inventory
24134 1727096395.38160: Added host managed_node2 to group all
24134 1727096395.38161: set ansible_host for managed_node2
24134 1727096395.38162: set ansible_ssh_extra_args for managed_node2
24134 1727096395.38164: set inventory_file for managed_node3
24134 1727096395.38169: set inventory_dir for managed_node3
24134 1727096395.38170: Added host managed_node3 to inventory
24134 1727096395.38171: Added host managed_node3 to group all
24134 1727096395.38172: set ansible_host for managed_node3
24134 1727096395.38173: set ansible_ssh_extra_args for managed_node3
24134 1727096395.38175: Reconcile groups and hosts in inventory.
24134 1727096395.38179: Group ungrouped now contains managed_node1
24134 1727096395.38181: Group ungrouped now contains managed_node2
24134 1727096395.38182: Group ungrouped now contains managed_node3
24134 1727096395.38264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
24134 1727096395.38488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
24134 1727096395.38536: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
24134 1727096395.38562: Loaded config def from plugin (vars/host_group_vars)
24134 1727096395.38674: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
24134 1727096395.38682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
24134 1727096395.38690: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
24134 1727096395.38732: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
24134 1727096395.39724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096395.39932: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
24134 1727096395.39976: Loaded config def from plugin (connection/local)
24134 1727096395.39980: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
24134 1727096395.41791: Loaded config def from plugin (connection/paramiko_ssh)
24134 1727096395.41795: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
24134 1727096395.44106: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
24134 1727096395.44147: Loaded config def from plugin (connection/psrp)
24134 1727096395.44150: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
24134 1727096395.46105: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
24134 1727096395.46146: Loaded config def from plugin (connection/ssh)
24134 1727096395.46149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
24134 1727096395.50128: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
24134 1727096395.50166: Loaded config def from plugin (connection/winrm)
24134 1727096395.50373: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
24134 1727096395.50405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
24134 1727096395.50471: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
24134 1727096395.50537: Loaded config def from plugin (shell/cmd)
24134 1727096395.50539: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
24134 1727096395.50565: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
24134 1727096395.50838: Loaded config def from plugin (shell/powershell)
24134 1727096395.50840: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
24134 1727096395.50899: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
24134 1727096395.51286: Loaded config def from plugin (shell/sh)
24134 1727096395.51288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
24134 1727096395.51324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
24134 1727096395.51649: Loaded config def from plugin (become/runas)
24134 1727096395.51652: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
24134 1727096395.52046: Loaded config def from plugin (become/su)
24134 1727096395.52048: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
24134 1727096395.52248: Loaded config def from plugin (become/sudo)
24134 1727096395.52250: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
24134 1727096395.52288: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml
24134 1727096395.52671: in VariableManager get_vars()
24134 1727096395.52694: done with get_vars()
24134 1727096395.52830: trying /usr/local/lib/python3.12/site-packages/ansible/modules
24134 1727096395.55846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
24134 1727096395.55963: in VariableManager get_vars()
24134 1727096395.55970: done with get_vars()
24134 1727096395.55973: variable 'playbook_dir' from source: magic vars
24134 1727096395.55974: variable 'ansible_playbook_python' from source: magic vars
24134 1727096395.55975: variable 'ansible_config_file' from source: magic vars
24134 1727096395.55976: variable 'groups' from source: magic vars
24134 1727096395.55977: variable 'omit' from source: magic vars
24134 1727096395.55977: variable 'ansible_version' from source: magic vars
24134 1727096395.55978: variable 'ansible_check_mode' from source: magic vars
24134 1727096395.55979: variable 'ansible_diff_mode' from source: magic vars
24134 1727096395.55979: variable 'ansible_forks' from source: magic vars
24134 1727096395.55980: variable 'ansible_inventory_sources' from source: magic vars
24134 1727096395.55981: variable 'ansible_skip_tags' from source: magic vars
24134 1727096395.55981: variable 'ansible_limit' from source: magic vars
24134 1727096395.55982: variable 'ansible_run_tags' from source: magic vars
24134 1727096395.55983: variable 'ansible_verbosity' from source: magic vars
24134 1727096395.56022: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml
24134 1727096395.56525: in VariableManager get_vars()
24134 1727096395.56541: done with get_vars()
24134 1727096395.56587: in VariableManager get_vars()
24134 1727096395.56608: done with get_vars()
24134 1727096395.56644: in VariableManager get_vars()
24134 1727096395.56661: done with get_vars()
24134 1727096395.56786: in VariableManager get_vars()
24134 1727096395.56800: done with get_vars()
24134 1727096395.56804: variable 'omit' from source: magic vars
24134 1727096395.56822: variable 'omit' from source: magic vars
24134 1727096395.56855: in VariableManager get_vars()
24134 1727096395.56866: done with get_vars()
24134 1727096395.56919: in VariableManager get_vars()
24134 1727096395.56932: done with get_vars()
24134 1727096395.56969: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
24134 1727096395.57190: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
24134 1727096395.57323: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
24134 1727096395.57958: in VariableManager get_vars()
24134 1727096395.57979: done with get_vars()
24134 1727096395.58400: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
24134 1727096395.58529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24134 1727096395.61235: in VariableManager get_vars()
24134 1727096395.61239: done with get_vars()
24134 1727096395.61242: variable 'playbook_dir' from source: magic vars
24134 1727096395.61243: variable 'ansible_playbook_python' from source: magic vars
24134 1727096395.61244: variable 'ansible_config_file' from source: magic vars
24134 1727096395.61244: variable 'groups' from source: magic vars
24134 1727096395.61245: variable 'omit' from source: magic vars
24134 1727096395.61248: variable 'ansible_version' from source: magic vars
24134 1727096395.61249: variable 'ansible_check_mode' from source: magic vars
24134 1727096395.61250: variable 'ansible_diff_mode' from source: magic vars
24134 1727096395.61250: variable 'ansible_forks' from source: magic vars
24134 1727096395.61251: variable 'ansible_inventory_sources' from source: magic vars
24134 1727096395.61252: variable 'ansible_skip_tags' from source: magic vars
24134 1727096395.61252: variable 'ansible_limit' from source: magic vars
24134 1727096395.61253: variable 'ansible_run_tags' from source: magic vars
24134 1727096395.61254: variable 'ansible_verbosity' from source: magic vars
24134 1727096395.61309: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
24134 1727096395.61614: in VariableManager get_vars()
24134 1727096395.61627: done with get_vars()
24134 1727096395.61883: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
24134 1727096395.62015: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
24134 1727096395.62114: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
24134 1727096395.63610: in VariableManager get_vars()
24134 1727096395.63652: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24134 1727096395.67431: in VariableManager get_vars()
24134 1727096395.67434: done with get_vars()
24134 1727096395.67437: variable 'playbook_dir' from source: magic vars
24134 1727096395.67438: variable 'ansible_playbook_python' from source: magic vars
24134 1727096395.67439: variable 'ansible_config_file' from source: magic vars
24134 1727096395.67439: variable 'groups' from source: magic vars
24134 1727096395.67440: variable 'omit' from source: magic vars
24134 1727096395.67441: variable 'ansible_version' from source: magic vars
24134 1727096395.67442: variable 'ansible_check_mode' from source: magic vars
24134 1727096395.67442: variable 'ansible_diff_mode' from source: magic vars
24134 1727096395.67444: variable 'ansible_forks' from source: magic vars
24134 1727096395.67445: variable 'ansible_inventory_sources' from source: magic vars
24134 1727096395.67445: variable 'ansible_skip_tags' from source: magic vars
24134 1727096395.67446: variable 'ansible_limit' from source: magic vars
24134 1727096395.67447: variable 'ansible_run_tags' from source: magic vars
24134 1727096395.67447: variable 'ansible_verbosity' from source: magic vars
24134 1727096395.67488: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
24134 1727096395.67564: in VariableManager get_vars()
24134 1727096395.67579: done with get_vars()
24134 1727096395.67625: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
24134 1727096395.67739: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
24134 1727096395.69680: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
24134 1727096395.70322: in VariableManager get_vars()
24134 1727096395.70342: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24134 1727096395.73411: in VariableManager get_vars()
24134 1727096395.73427: done with get_vars()
24134 1727096395.73464: in VariableManager get_vars()
24134 1727096395.73483: done with get_vars()
24134 1727096395.73519: in VariableManager get_vars()
24134 1727096395.73546: done with get_vars()
24134 1727096395.73788: in VariableManager get_vars()
24134 1727096395.73801: done with get_vars()
24134 1727096395.73864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
24134 1727096395.73883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
24134 1727096395.74317: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
24134 1727096395.74697: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
24134 1727096395.74700: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
24134 1727096395.74733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
24134 1727096395.74758: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
24134 1727096395.75203: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
24134 1727096395.75262: Loaded config def from plugin (callback/default)
24134 1727096395.75265: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
24134 1727096395.78091: Loaded config def from plugin (callback/junit)
24134 1727096395.78094: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
24134 1727096395.78140: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
24134 1727096395.78375: Loaded config def from plugin (callback/minimal)
24134 1727096395.78377: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
24134 1727096395.78417: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
24134 1727096395.78607: Loaded config def from plugin (callback/tree)
24134 1727096395.78610: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
24134 1727096395.78816: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
24134 1727096395.78819: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ipv6_disabled_nm.yml *******************************************
5 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml
24134 1727096395.78846: in VariableManager get_vars()
24134 1727096395.78861: done with get_vars()
24134 1727096395.78871: in VariableManager get_vars()
24134 1727096395.78881: done with get_vars()
24134 1727096395.78885: variable 'omit' from source: magic vars
24134 1727096395.78922: in VariableManager get_vars()
24134 1727096395.78935: done with get_vars()
24134 1727096395.78956: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ipv6_disabled.yml' with nm as provider] ****
24134 1727096395.80261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
24134 1727096395.80399: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
24134 1727096395.80429: getting the remaining hosts for this loop
24134 1727096395.80431: done getting the remaining hosts for this loop
24134 1727096395.80434: getting the next task for host managed_node1
24134 1727096395.80438: done getting next task for host managed_node1
24134 1727096395.80439: ^ task is: TASK: Gathering Facts
24134 1727096395.80441: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096395.80443: getting variables
24134 1727096395.80444: in VariableManager get_vars()
24134 1727096395.80456: Calling all_inventory to load vars for managed_node1
24134 1727096395.80458: Calling groups_inventory to load vars for managed_node1
24134 1727096395.80461: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096395.80478: Calling all_plugins_play to load vars for managed_node1
24134 1727096395.80490: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096395.80493: Calling groups_plugins_play to load vars for managed_node1
24134 1727096395.80527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096395.80585: done with get_vars()
24134 1727096395.80592: done getting variables
24134 1727096395.80660: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6
Monday 23 September 2024 08:59:55 -0400 (0:00:00.020) 0:00:00.020 ******
24134 1727096395.80687: entering _queue_task() for managed_node1/gather_facts
24134 1727096395.80689: Creating lock for gather_facts
24134 1727096395.81048: worker is 1 (out of 1 available)
24134 1727096395.81056: exiting _queue_task() for managed_node1/gather_facts
24134 1727096395.81275: done queuing things up, now waiting for results queue to drain
24134 1727096395.81277: waiting for pending results...
24134 1727096395.81410: running TaskExecutor() for managed_node1/TASK: Gathering Facts
24134 1727096395.81416: in run() - task 0afff68d-5257-1673-d3fc-0000000000a3
24134 1727096395.81419: variable 'ansible_search_path' from source: unknown
24134 1727096395.81458: calling self._execute()
24134 1727096395.81529: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096395.81541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096395.81553: variable 'omit' from source: magic vars
24134 1727096395.81656: variable 'omit' from source: magic vars
24134 1727096395.81693: variable 'omit' from source: magic vars
24134 1727096395.81740: variable 'omit' from source: magic vars
24134 1727096395.81788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
24134 1727096395.81830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24134 1727096395.81856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
24134 1727096395.81881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24134 1727096395.81897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24134 1727096395.81926: variable 'inventory_hostname' from source: host vars for 'managed_node1'
24134 1727096395.81933: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096395.81945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096395.82040: Set connection var ansible_shell_executable to /bin/sh
24134 1727096395.82073: Set connection var ansible_pipelining to False
24134 1727096395.82076: Set connection var ansible_module_compression to ZIP_DEFLATED
24134 1727096395.82078: Set connection var ansible_timeout to 10
24134 1727096395.82080: Set connection var ansible_connection to ssh
24134 1727096395.82082: Set connection var ansible_shell_type to sh
24134 1727096395.82106: variable 'ansible_shell_executable' from source: unknown
24134 1727096395.82164: variable 'ansible_connection' from source: unknown
24134 1727096395.82172: variable 'ansible_module_compression' from source: unknown
24134 1727096395.82175: variable 'ansible_shell_type' from source: unknown
24134 1727096395.82177: variable 'ansible_shell_executable' from source: unknown
24134 1727096395.82180: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096395.82182: variable 'ansible_pipelining' from source: unknown
24134 1727096395.82185: variable 'ansible_timeout' from source: unknown
24134 1727096395.82187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096395.82334: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
24134 1727096395.82676: variable 'omit' from source: magic vars
24134 1727096395.82679: starting attempt loop
24134 1727096395.82681: running the handler
24134 1727096395.82684: variable 'ansible_facts' from source: unknown
24134 1727096395.82686: _low_level_execute_command(): starting
24134 1727096395.82688: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
24134 1727096395.83994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24134 1727096395.84108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<<
24134 1727096395.84366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24134 1727096395.84440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24134 1727096395.86163: stdout chunk (state=3): >>>/root <<<
24134 1727096395.86303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24134 1727096395.86575: stdout chunk (state=3): >>><<<
24134 1727096395.86578: stderr chunk (state=3): >>><<<
24134 1727096395.86581: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
24134 1727096395.86584: _low_level_execute_command(): starting
24134 1727096395.86587: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325 `" && echo ansible-tmp-1727096395.864405-24173-97990861984325="` echo /root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325 `" ) && sleep 0'
24134 1727096395.87989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24134 1727096395.88088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24134 1727096395.88193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<<
24134 1727096395.88249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24134 1727096395.88359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24134 1727096395.90360: stdout chunk (state=3): >>>ansible-tmp-1727096395.864405-24173-97990861984325=/root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325 <<<
24134 1727096395.90785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24134 1727096395.90789: stdout chunk (state=3): >>><<<
24134 1727096395.90795: stderr chunk (state=3): >>><<<
24134 1727096395.90815: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096395.864405-24173-97990861984325=/root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version
4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24134 1727096395.90852: variable 'ansible_module_compression' from source: unknown
24134 1727096395.91106: ANSIBALLZ: Using generic lock for ansible.legacy.setup
24134 1727096395.91110: ANSIBALLZ: Acquiring lock
24134 1727096395.91113: ANSIBALLZ: Lock acquired: 140085163806880
24134 1727096395.91115: ANSIBALLZ: Creating module
24134 1727096396.47764: ANSIBALLZ: Writing module into payload
24134 1727096396.47772: ANSIBALLZ: Writing module
24134 1727096396.47775: ANSIBALLZ: Renaming module
24134 1727096396.47777: ANSIBALLZ: Done creating module
24134 1727096396.47779: variable 'ansible_facts' from source: unknown
24134 1727096396.47781: variable 'inventory_hostname' from source: host vars for 'managed_node1'
24134 1727096396.47783: _low_level_execute_command(): starting
24134 1727096396.47785: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
24134 1727096396.49291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24134 1727096396.49296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK <<<
24134 1727096396.49327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24134 1727096396.49406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24134 1727096396.51133: stdout chunk (state=3): >>>PLATFORM <<<
24134 1727096396.51213: stdout chunk (state=3): >>>Linux <<<
24134 1727096396.51217: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<<
24134 1727096396.51219: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
24134 1727096396.51376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24134 1727096396.51409: stderr chunk (state=3): >>><<<
24134 1727096396.51481: stdout chunk (state=3): >>><<<
24134 1727096396.51560: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24134 1727096396.51566 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']
24134 1727096396.51571: _low_level_execute_command(): starting
24134 1727096396.51573: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
24134 1727096396.51808: Sending initial data
24134 1727096396.51812: Sent initial data (1181 bytes)
24134 1727096396.53078: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config <<<
24134 1727096396.53082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24134 1727096396.53107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at
'/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK <<<
24134 1727096396.53128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24134 1727096396.53256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24134 1727096396.56910: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<<
24134 1727096396.57274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24134 1727096396.57316: stderr chunk (state=3): >>><<<
24134 1727096396.57324: stdout chunk (state=3): >>><<<
24134 1727096396.57340: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"}
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24134 1727096396.57437: variable 'ansible_facts' from source: unknown
24134 1727096396.57440: variable 'ansible_facts' from source: unknown
24134 1727096396.57442: variable 'ansible_module_compression' from source: unknown
24134 1727096396.57765: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
24134 1727096396.57770: variable 'ansible_facts' from source: unknown
24134 1727096396.57970: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/AnsiballZ_setup.py
24134 1727096396.58354: Sending initial data
24134 1727096396.58357: Sent initial data (152 bytes)
24134 1727096396.59432: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2:
match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24134 1727096396.59536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24134 1727096396.59579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<<
24134 1727096396.59680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
24134 1727096396.59685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24134 1727096396.59828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24134 1727096396.61596: stderr chunk (state=3): >>>debug2: Remote version: 3
debug2: Server supports extension "posix-rename@openssh.com" revision 1
debug2: Server supports extension "statvfs@openssh.com" revision 2
debug2: Server supports extension "fstatvfs@openssh.com" revision 2
debug2: Server supports extension "hardlink@openssh.com" revision 1
debug2: Server supports extension "fsync@openssh.com" revision 1
debug2: Server supports extension "lsetstat@openssh.com" revision 1
debug2: Server supports extension "limits@openssh.com" revision 1
debug2: Server supports extension "expand-path@openssh.com" revision 1
debug2: Server supports extension "copy-data" revision 1
debug2: Unrecognised server extension "home-directory"
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
24134 1727096396.61601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
24134 1727096396.61730: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpa9kjr2t_ /root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/AnsiballZ_setup.py <<<
24134 1727096396.61735: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/AnsiballZ_setup.py" <<<
24134 1727096396.61863: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory
debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpa9kjr2t_" to remote "/root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/AnsiballZ_setup.py"
debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/AnsiballZ_setup.py" <<<
24134 1727096396.64129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24134 1727096396.64181: stderr chunk (state=3): >>><<<
24134 1727096396.64185: stdout chunk (state=3): >>><<<
24134 1727096396.64208: done transferring module to remote
24134 1727096396.64223: _low_level_execute_command(): starting
24134 1727096396.64343: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/ /root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/AnsiballZ_setup.py && sleep 0'
24134 1727096396.65685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24134 1727096396.65695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<<
24134 1727096396.65698: stderr chunk (state=3): >>>debug2: match found <<<
24134 1727096396.65707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24134 1727096396.65895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK <<<
24134 1727096396.65918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24134 1727096396.66055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24134 1727096396.68075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24134 1727096396.68079: stdout chunk (state=3): >>><<<
24134 1727096396.68086: stderr chunk (state=3): >>><<<
24134 1727096396.68103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24134 1727096396.68111: _low_level_execute_command(): starting
24134 1727096396.68274: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/AnsiballZ_setup.py && sleep 0'
24134 1727096396.69306: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24134 1727096396.69309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24134 1727096396.69323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24134 1727096396.69336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24134 1727096396.69349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<<
24134 1727096396.69484: stderr chunk (state=3): >>>debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24134 1727096396.69550: stderr chunk (state=3): >>>debug1: auto-mux: Trying
existing master at '/root/.ansible/cp/8e98a30b23' <<<
24134 1727096396.69560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
24134 1727096396.69587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24134 1727096396.69741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24134 1727096396.72042: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<<
24134 1727096396.72062: stdout chunk (state=3): >>>import _imp # builtin <<<
24134 1727096396.72151: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<<
24134 1727096396.72195: stdout chunk (state=3): >>>import 'posix' # <<<
24134 1727096396.72223: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<<
24134 1727096396.72297: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<<
24134 1727096396.72301: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<<
24134 1727096396.72313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<<
24134 1727096396.72429: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<<
24134 1727096396.72432: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<<
24134 1727096396.72455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82579104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82578dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257912a50> <<<
24134 1727096396.72583: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<<
24134 1727096396.72644: stdout chunk (state=3): >>>import '_collections_abc' # <<<
24134 1727096396.72783: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<<
24134 1727096396.72795: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<<
24134 1727096396.72811: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576c1130> <<<
24134 1727096396.72875: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<<
24134 1727096396.72914: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576c1fa0> import 'site' # <<<
24134 1727096396.72941: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<<
24134 1727096396.73349: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<<
24134 1727096396.73357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<<
24134 1727096396.73375: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<<
24134 1727096396.73461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<<
24134 1727096396.73605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576ffdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fffe0> <<<
24134 1727096396.73622: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<<
24134 1727096396.73650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<<
24134 1727096396.73678: stdout chunk (state=3): >>>import 'itertools' # <<<
24134 1727096396.73698: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<<
24134 1727096396.73731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257737800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<<
24134 1727096396.73760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257737e90> import '_collections' # <<<
24134 1727096396.73971: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257717aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577151c0> <<<
24134 1727096396.73986: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fcf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<<
24134 1727096396.73992: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<<
24134 1727096396.74027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<<
24134 1727096396.74040: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<<
24134 1727096396.74046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<<
24134 1727096396.74081: stdout chunk (state=3): >>>import 're._constants' #
<_frozen_importlib_external.SourceFileLoader object at 0x7f82577576e0> <<<
24134 1727096396.74089: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257756300> <<<
24134 1727096396.74491: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257716060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fee70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778c7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fc200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825778cc50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778cb00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825778cef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fad20> <<<
24134 1727096396.74495: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778d5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778d280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778e4b0> import 'importlib.util' # import 'runpy' # <<<
24134 1727096396.74509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<<
24134 1727096396.74545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<<
24134 1727096396.74556: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<<
24134 1727096396.74631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577a4680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82577a5d30> <<<
24134 1727096396.74634: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<<
24134 1727096396.74743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577a6bd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82577a7230> <<<
24134 1727096396.74750: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577a6120> <<<
24134 1727096396.74763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<<
24134 1727096396.74810: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82577a7cb0> <<<
24134 1727096396.74837: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577a73e0> <<<
24134 1727096396.74885: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778e450> <<<
24134 1727096396.74899: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<<
24134 1727096396.74910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<<
24134 1727096396.74948: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<<
24134 1727096396.74994: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574a3bc0> <<<
24134 1727096396.75061: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<<
24134 1727096396.75174: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574cc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574cc6b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<<
24134 1727096396.75437: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574ccfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<<
24134 1727096396.75441: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574cd910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cc890> <<<
24134 1727096396.75857: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574a1d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cecc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cd790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778eba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' #
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574fb020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 24134 1727096396.75860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 24134 1727096396.75898: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825751b3e0> <<< 24134 1727096396.75948: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 24134 1727096396.76161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825757c200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 24134 1727096396.76686: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f825757e960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825757c320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82575491f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573812e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825751a1e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cfbf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f825751a300> <<< 24134 1727096396.76859: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_vuoglklp/ansible_ansible.legacy.setup_payload.zip' <<< 24134 1727096396.76862: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.76981: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.77006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 24134 1727096396.77010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 24134 1727096396.77051: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 24134 1727096396.77185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573e6f30> import '_typing' # <<< 24134 1727096396.77474: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573c5e50> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573c5010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 24134 1727096396.78872: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.80137: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573e4e00> <<< 24134 1727096396.80154: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825741e8d0> <<< 
24134 1727096396.80181: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825741e660> <<< 24134 1727096396.80395: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825741df70> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825741e9c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573e7bc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825741f5f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825741f770> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 24134 1727096396.80433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 24134 1727096396.80519: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825741fcb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 24134 1727096396.80522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 24134 1727096396.81074: 
stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d29a90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d2b6b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2bf80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2d220> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2fd10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574faf90> <<< 24134 1727096396.81091: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2dfd0> # 
/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 24134 1727096396.81184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 24134 1727096396.81403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d37cb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d36780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d364e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d36a50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2e4e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096396.81411: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d7bf20> <<< 24134 1727096396.81539: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d7c080> <<< 24134 1727096396.81764: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d7db50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d7d910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d7ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d7e1e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from 
'/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 24134 1727096396.81790: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d83860> <<< 24134 1727096396.81909: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d80230> <<< 24134 1727096396.82195: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d84920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d84740> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d84a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d7c290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256c10110> <<< 24134 1727096396.82340: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096396.82348: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256c11280> <<< 24134 1727096396.82351: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d868d0> <<< 24134 1727096396.82482: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d87c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d864e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 24134 1727096396.82519: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.82738: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 24134 1727096396.82955: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.83452: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.84276: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256c15580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c16420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c114c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 24134 1727096396.84298: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 24134 1727096396.84315: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.84642: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # 
code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c16510> # zipimport: zlib available <<< 24134 1727096396.85092: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.85627: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.85742: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.85755: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 24134 1727096396.85894: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.85955: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 24134 1727096396.86388: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.86685: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c175f0> # zipimport: zlib available <<< 24134 1727096396.86726: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.86988: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # 
zipimport: zlib available <<< 24134 1727096396.87014: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.87073: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.87203: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096396.87485: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256c21ee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c1f290> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.87561: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.87570: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 24134 1727096396.87573: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096396.87592: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 24134 1727096396.87608: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 24134 1727096396.87619: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 24134 1727096396.87931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d0a9c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256dfe690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c220c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c21ca0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 24134 1727096396.87935: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.88036: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 24134 1727096396.88050: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 24134 1727096396.88083: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.88134: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.88353: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.88391: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 24134 1727096396.88394: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.88576: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available <<< 24134 1727096396.88683: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 24134 1727096396.88886: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.89195: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb60f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 24134 1727096396.89199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 24134 1727096396.89214: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 24134 1727096396.89337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 24134 1727096396.89391: stdout chunk (state=3): >>>import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82568e4110> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82568e4380> <<< 24134 1727096396.89679: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c9c0e0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb6c60> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb4770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb43e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82568e7500> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82568e6db0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096396.89683: 
stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82568e6f90> <<< 24134 1727096396.89714: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82568e61e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 24134 1727096396.90008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82568e76e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82569321e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256930200> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb5880> <<< 24134 1727096396.90012: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 24134 1727096396.90027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 24134 1727096396.90089: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 24134 1727096396.90137: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.90212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 24134 1727096396.90433: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 24134 1727096396.90493: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.90538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 24134 1727096396.90582: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.90652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 24134 1727096396.90870: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 24134 1727096396.91346: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.92094: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.92148: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 24134 1727096396.92184: stdout chunk (state=3): >>># 
zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.92297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 24134 1727096396.92372: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.92459: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 24134 1727096396.92510: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256932510> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 24134 1727096396.92540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 24134 1727096396.92703: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256933110> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 24134 1727096396.92744: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.92799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 24134 1727096396.92813: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.92897: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.92984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 24134 1727096396.93011: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.93180: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 24134 1727096396.93183: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 24134 1727096396.93228: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 24134 1727096396.93277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 24134 1727096396.93602: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82569726f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82569634a0> <<< 24134 1727096396.93606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 24134 1727096396.93657: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.93738: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 24134 1727096396.93807: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.93884: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.94064: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.94138: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 24134 1727096396.94151: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 24134 1727096396.94195: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.94287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.94328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches 
/usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 24134 1727096396.94371: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096396.94657: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256985eb0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256963620> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 24134 1727096396.94886: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 24134 1727096396.94949: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.95285: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.95327: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.95589: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 24134 1727096396.95604: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.95786: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 
24134 1727096396.95799: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.96577: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.97100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 24134 1727096396.97187: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.97285: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 24134 1727096396.97444: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.97689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.97718: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 24134 1727096396.97730: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.97825: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.97985: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.98119: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.98489: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 24134 1727096396.98537: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.98602: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 24134 1727096396.98617: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.98751: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.98793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 24134 1727096396.98796: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.98855: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.98972: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 24134 1727096396.99178: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.99491: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096396.99553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 24134 1727096396.99565: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.99628: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 24134 1727096396.99642: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.99673: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.99735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 24134 1727096396.99752: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.99785: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 24134 1727096396.99866: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096396.99960: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 24134 1727096396.99979: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 24134 1727096396.99995: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.00049: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.00184: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096397.00187: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.00228: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.00393: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 24134 1727096397.00431: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.00490: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 24134 1727096397.00861: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.00886: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 24134 1727096397.00889: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.00932: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.00976: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 24134 1727096397.01289: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 
'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 24134 1727096397.01374: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.01511: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 24134 1727096397.01514: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.02657: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 24134 1727096397.02666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 24134 1727096397.02703: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825671e6c0> <<< 24134 1727096397.02718: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825671f380> <<< 24134 1727096397.02803: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256715550> <<< 24134 1727096397.16202: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 24134 1727096397.16242: stdout chunk (state=3): >>>import 
'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256766720> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256765220> <<< 24134 1727096397.16305: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 24134 1727096397.16308: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096397.16310: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 24134 1727096397.16503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 24134 1727096397.16506: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256766a80> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256766120> <<< 24134 1727096397.16611: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 24134 1727096397.40647: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, 
"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "59", "second": "57", "epoch": "1727096397", "epoch_int": "1727096397", "date": "2024-09-23", "time": "08:59:57", "iso8601_micro": "2024-09-23T12:59:57.029623Z", "iso8601": "2024-09-23T12:59:57Z", "iso8601_basic": "20240923T085957029623", "iso8601_basic_short": "20240923T085957", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.64990234375, "5m": 0.4482421875, "15m": 0.2275390625}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3302, "used": 229}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", 
"ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 550, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795254272, "block_size": 4096, "block_total": 65519099, "block_available": 63914857, "block_used": 1604242, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": 
"off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, 
"invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24134 1727096397.41277: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 24134 1727096397.41300: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 24134 1727096397.41331: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib <<< 24134 
1727096397.41360: stdout chunk (state=3): >>># cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse <<< 24134 1727096397.41447: stdout chunk (state=3): >>># destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing 
collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes <<< 24134 1727096397.41452: stdout chunk (state=3): >>># cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils <<< 24134 1727096397.41520: stdout chunk (state=3): >>># destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing 
ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python <<< 24134 1727096397.41557: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly <<< 24134 1727096397.41626: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user <<< 24134 1727096397.41658: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base <<< 24134 1727096397.41693: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # 
cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize <<< 24134 1727096397.41707: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 24134 1727096397.42086: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery <<< 24134 1727096397.42104: stdout chunk (state=3): >>># destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 24134 1727096397.42163: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 24134 1727096397.42171: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 24134 1727096397.42241: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 24134 1727096397.42264: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale <<< 24134 1727096397.42268: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 24134 1727096397.42319: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 24134 1727096397.42322: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 24134 1727096397.42499: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 24134 1727096397.42506: stdout chunk (state=3): >>># 
destroy pickle # destroy _compat_pickle <<< 24134 1727096397.42558: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob <<< 24134 1727096397.42659: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 24134 1727096397.42736: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] 
wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 24134 1727096397.42743: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools <<< 24134 1727096397.42785: stdout chunk (state=3): >>># cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 24134 1727096397.42808: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 24134 1727096397.43059: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy 
tokenize <<< 24134 1727096397.43066: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 24134 1727096397.43072: stdout chunk (state=3): >>># destroy _typing <<< 24134 1727096397.43196: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 24134 1727096397.43251: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 24134 1727096397.43340: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 24134 1727096397.43343: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 24134 1727096397.43355: stdout chunk (state=3): >>># clear sys.audit hooks <<< 24134 1727096397.43739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096397.43757: stdout chunk (state=3): >>><<< 24134 1727096397.43775: stderr chunk (state=3): >>><<< 24134 1727096397.44035: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82579104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82578dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257912a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576c1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576ffdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257737800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257737e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257717aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577151c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fcf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577576e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257756300> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8257716060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fee70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778c7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fc200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825778cc50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778cb00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825778cef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82576fad20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778d5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778d280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778e4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577a4680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82577a5d30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577a6bd0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82577a7230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577a6120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82577a7cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82577a73e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778e450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574a3bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574cc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574cc6b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574ccfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574cd910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cc890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574a1d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cecc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cd790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825778eba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574fb020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825751b3e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825757c200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825757e960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825757c320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82575491f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573812e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825751a1e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82574cfbf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f825751a300> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_vuoglklp/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f82573e6f30> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573c5e50> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573c5010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573e4e00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825741e8d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825741e660> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825741df70> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825741e9c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82573e7bc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825741f5f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825741f770> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825741fcb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d29a90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d2b6b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2bf80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2d220> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2fd10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82574faf90> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2dfd0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d37cb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d36780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d364e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d36a50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d2e4e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d7bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d7c080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d7db50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d7d910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d7ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d7e1e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d83860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d80230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d84920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d84740> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d84a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d7c290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256c10110> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256c11280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d868d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256d87c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d864e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256c15580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c16420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c114c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c16510> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c175f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256c21ee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c1f290> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256d0a9c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256dfe690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c220c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c21ca0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb60f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82568e4110> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82568e4380> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256c9c0e0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb6c60> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb4770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb43e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82568e7500> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82568e6db0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82568e6f90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82568e61e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82568e76e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82569321e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256930200> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256cb5880> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256932510> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256933110> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82569726f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82569634a0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8256985eb0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256963620> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f825671e6c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f825671f380> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256715550> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256766720> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256765220> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256766a80> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8256766120> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": 
"ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", 
"weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "59", "second": "57", "epoch": "1727096397", "epoch_int": "1727096397", "date": "2024-09-23", "time": "08:59:57", "iso8601_micro": "2024-09-23T12:59:57.029623Z", "iso8601": "2024-09-23T12:59:57Z", "iso8601_basic": "20240923T085957029623", "iso8601_basic_short": "20240923T085957", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": 
"||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.64990234375, "5m": 0.4482421875, "15m": 0.2275390625}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3302, "used": 229}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 550, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795254272, "block_size": 4096, "block_total": 65519099, "block_available": 63914857, "block_used": 1604242, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": 
"on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, 
"active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # 
cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # 
cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.11.125 closed. [WARNING]: Module invocation had junk after the JSON data:
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
24134 1727096397.46087: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096397.46108: _low_level_execute_command(): starting 24134 1727096397.46116: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096395.864405-24173-97990861984325/ > /dev/null 2>&1 && sleep 0' 24134 1727096397.46788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096397.46874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096397.46888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096397.46926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096397.46943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096397.46955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096397.47061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096397.49078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096397.49087: stdout chunk (state=3): >>><<< 24134 1727096397.49094: stderr chunk (state=3): >>><<< 24134 1727096397.49111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 
1727096397.49121: handler run complete 24134 1727096397.49201: variable 'ansible_facts' from source: unknown 24134 1727096397.49263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096397.49451: variable 'ansible_facts' from source: unknown 24134 1727096397.49504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096397.49577: attempt loop complete, returning result 24134 1727096397.49581: _execute() done 24134 1727096397.49583: dumping result to json 24134 1727096397.49601: done dumping result, returning 24134 1727096397.49608: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-1673-d3fc-0000000000a3] 24134 1727096397.49611: sending task result for task 0afff68d-5257-1673-d3fc-0000000000a3 24134 1727096397.50159: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000a3 ok: [managed_node1] 24134 1727096397.50170: WORKER PROCESS EXITING 24134 1727096397.50231: no more pending results, returning what we have 24134 1727096397.50236: results queue empty 24134 1727096397.50237: checking for any_errors_fatal 24134 1727096397.50238: done checking for any_errors_fatal 24134 1727096397.50238: checking for max_fail_percentage 24134 1727096397.50239: done checking for max_fail_percentage 24134 1727096397.50239: checking to see if all hosts have failed and the running result is not ok 24134 1727096397.50240: done checking to see if all hosts have failed 24134 1727096397.50241: getting the remaining hosts for this loop 24134 1727096397.50242: done getting the remaining hosts for this loop 24134 1727096397.50245: getting the next task for host managed_node1 24134 1727096397.50249: done getting next task for host managed_node1 24134 1727096397.50250: ^ task is: TASK: meta (flush_handlers) 24134 1727096397.50252: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096397.50255: getting variables 24134 1727096397.50256: in VariableManager get_vars() 24134 1727096397.50276: Calling all_inventory to load vars for managed_node1 24134 1727096397.50278: Calling groups_inventory to load vars for managed_node1 24134 1727096397.50280: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096397.50287: Calling all_plugins_play to load vars for managed_node1 24134 1727096397.50288: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096397.50290: Calling groups_plugins_play to load vars for managed_node1 24134 1727096397.50404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096397.50519: done with get_vars() 24134 1727096397.50529: done getting variables 24134 1727096397.50578: in VariableManager get_vars() 24134 1727096397.50592: Calling all_inventory to load vars for managed_node1 24134 1727096397.50595: Calling groups_inventory to load vars for managed_node1 24134 1727096397.50597: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096397.50602: Calling all_plugins_play to load vars for managed_node1 24134 1727096397.50605: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096397.50607: Calling groups_plugins_play to load vars for managed_node1 24134 1727096397.50738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096397.50918: done with get_vars() 24134 1727096397.50930: done queuing things up, now waiting for results queue to drain 24134 1727096397.50933: results queue empty 24134 1727096397.50933: checking for any_errors_fatal 24134 1727096397.50935: done checking for any_errors_fatal 24134 
1727096397.50936: checking for max_fail_percentage 24134 1727096397.50937: done checking for max_fail_percentage 24134 1727096397.50942: checking to see if all hosts have failed and the running result is not ok 24134 1727096397.50943: done checking to see if all hosts have failed 24134 1727096397.50944: getting the remaining hosts for this loop 24134 1727096397.50945: done getting the remaining hosts for this loop 24134 1727096397.50947: getting the next task for host managed_node1 24134 1727096397.50951: done getting next task for host managed_node1 24134 1727096397.50953: ^ task is: TASK: Include the task 'el_repo_setup.yml' 24134 1727096397.50955: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096397.50957: getting variables 24134 1727096397.50957: in VariableManager get_vars() 24134 1727096397.50964: Calling all_inventory to load vars for managed_node1 24134 1727096397.50966: Calling groups_inventory to load vars for managed_node1 24134 1727096397.50972: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096397.50977: Calling all_plugins_play to load vars for managed_node1 24134 1727096397.50979: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096397.50981: Calling groups_plugins_play to load vars for managed_node1 24134 1727096397.51126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096397.51307: done with get_vars() 24134 1727096397.51315: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:11 Monday 23 September 2024 
08:59:57 -0400 (0:00:01.707) 0:00:01.727 ****** 24134 1727096397.51393: entering _queue_task() for managed_node1/include_tasks 24134 1727096397.51394: Creating lock for include_tasks 24134 1727096397.51671: worker is 1 (out of 1 available) 24134 1727096397.51683: exiting _queue_task() for managed_node1/include_tasks 24134 1727096397.51693: done queuing things up, now waiting for results queue to drain 24134 1727096397.51694: waiting for pending results... 24134 1727096397.51846: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 24134 1727096397.51912: in run() - task 0afff68d-5257-1673-d3fc-000000000006 24134 1727096397.51923: variable 'ansible_search_path' from source: unknown 24134 1727096397.51950: calling self._execute() 24134 1727096397.52012: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096397.52015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096397.52024: variable 'omit' from source: magic vars 24134 1727096397.52110: _execute() done 24134 1727096397.52114: dumping result to json 24134 1727096397.52116: done dumping result, returning 24134 1727096397.52123: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-1673-d3fc-000000000006] 24134 1727096397.52134: sending task result for task 0afff68d-5257-1673-d3fc-000000000006 24134 1727096397.52237: done sending task result for task 0afff68d-5257-1673-d3fc-000000000006 24134 1727096397.52240: WORKER PROCESS EXITING 24134 1727096397.52291: no more pending results, returning what we have 24134 1727096397.52295: in VariableManager get_vars() 24134 1727096397.52324: Calling all_inventory to load vars for managed_node1 24134 1727096397.52327: Calling groups_inventory to load vars for managed_node1 24134 1727096397.52330: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096397.52340: Calling all_plugins_play to load vars for managed_node1 
24134 1727096397.52342: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096397.52347: Calling groups_plugins_play to load vars for managed_node1 24134 1727096397.52491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096397.52614: done with get_vars() 24134 1727096397.52620: variable 'ansible_search_path' from source: unknown 24134 1727096397.52629: we have included files to process 24134 1727096397.52630: generating all_blocks data 24134 1727096397.52631: done generating all_blocks data 24134 1727096397.52631: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24134 1727096397.52632: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24134 1727096397.52633: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24134 1727096397.53090: in VariableManager get_vars() 24134 1727096397.53100: done with get_vars() 24134 1727096397.53107: done processing included file 24134 1727096397.53109: iterating over new_blocks loaded from include file 24134 1727096397.53109: in VariableManager get_vars() 24134 1727096397.53115: done with get_vars() 24134 1727096397.53116: filtering new block on tags 24134 1727096397.53125: done filtering new block on tags 24134 1727096397.53133: in VariableManager get_vars() 24134 1727096397.53140: done with get_vars() 24134 1727096397.53141: filtering new block on tags 24134 1727096397.53151: done filtering new block on tags 24134 1727096397.53153: in VariableManager get_vars() 24134 1727096397.53158: done with get_vars() 24134 1727096397.53159: filtering new block on tags 24134 1727096397.53166: done filtering new block on tags 24134 1727096397.53171: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 24134 1727096397.53175: extending task lists for all hosts with included blocks 24134 1727096397.53203: done extending task lists 24134 1727096397.53203: done processing included files 24134 1727096397.53204: results queue empty 24134 1727096397.53204: checking for any_errors_fatal 24134 1727096397.53205: done checking for any_errors_fatal 24134 1727096397.53205: checking for max_fail_percentage 24134 1727096397.53206: done checking for max_fail_percentage 24134 1727096397.53206: checking to see if all hosts have failed and the running result is not ok 24134 1727096397.53207: done checking to see if all hosts have failed 24134 1727096397.53207: getting the remaining hosts for this loop 24134 1727096397.53208: done getting the remaining hosts for this loop 24134 1727096397.53210: getting the next task for host managed_node1 24134 1727096397.53212: done getting next task for host managed_node1 24134 1727096397.53214: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 24134 1727096397.53215: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096397.53217: getting variables 24134 1727096397.53217: in VariableManager get_vars() 24134 1727096397.53222: Calling all_inventory to load vars for managed_node1 24134 1727096397.53223: Calling groups_inventory to load vars for managed_node1 24134 1727096397.53225: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096397.53228: Calling all_plugins_play to load vars for managed_node1 24134 1727096397.53230: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096397.53231: Calling groups_plugins_play to load vars for managed_node1 24134 1727096397.53331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096397.53438: done with get_vars() 24134 1727096397.53446: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 08:59:57 -0400 (0:00:00.021) 0:00:01.748 ****** 24134 1727096397.53524: entering _queue_task() for managed_node1/setup 24134 1727096397.54095: worker is 1 (out of 1 available) 24134 1727096397.54107: exiting _queue_task() for managed_node1/setup 24134 1727096397.54118: done queuing things up, now waiting for results queue to drain 24134 1727096397.54119: waiting for pending results... 
24134 1727096397.54794: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 24134 1727096397.54800: in run() - task 0afff68d-5257-1673-d3fc-0000000000b4 24134 1727096397.54802: variable 'ansible_search_path' from source: unknown 24134 1727096397.54804: variable 'ansible_search_path' from source: unknown 24134 1727096397.54807: calling self._execute() 24134 1727096397.54917: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096397.54971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096397.54984: variable 'omit' from source: magic vars 24134 1727096397.55491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096397.58149: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096397.58223: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096397.58272: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096397.58324: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096397.58352: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096397.58432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096397.58462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096397.58500: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096397.58546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096397.58575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096397.58743: variable 'ansible_facts' from source: unknown 24134 1727096397.58817: variable 'network_test_required_facts' from source: task vars 24134 1727096397.58858: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 24134 1727096397.58872: variable 'omit' from source: magic vars 24134 1727096397.58974: variable 'omit' from source: magic vars 24134 1727096397.58977: variable 'omit' from source: magic vars 24134 1727096397.58980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096397.59006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096397.59028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096397.59049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096397.59064: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096397.59098: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096397.59106: variable 'ansible_host' from source: host vars for 
'managed_node1' 24134 1727096397.59113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096397.59213: Set connection var ansible_shell_executable to /bin/sh 24134 1727096397.59223: Set connection var ansible_pipelining to False 24134 1727096397.59232: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096397.59244: Set connection var ansible_timeout to 10 24134 1727096397.59251: Set connection var ansible_connection to ssh 24134 1727096397.59257: Set connection var ansible_shell_type to sh 24134 1727096397.59374: variable 'ansible_shell_executable' from source: unknown 24134 1727096397.59377: variable 'ansible_connection' from source: unknown 24134 1727096397.59383: variable 'ansible_module_compression' from source: unknown 24134 1727096397.59385: variable 'ansible_shell_type' from source: unknown 24134 1727096397.59387: variable 'ansible_shell_executable' from source: unknown 24134 1727096397.59389: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096397.59391: variable 'ansible_pipelining' from source: unknown 24134 1727096397.59393: variable 'ansible_timeout' from source: unknown 24134 1727096397.59395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096397.59471: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096397.59491: variable 'omit' from source: magic vars 24134 1727096397.59501: starting attempt loop 24134 1727096397.59508: running the handler 24134 1727096397.59524: _low_level_execute_command(): starting 24134 1727096397.59534: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096397.60288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096397.60364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096397.60485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096397.60573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096397.62304: stdout chunk (state=3): >>>/root <<< 24134 1727096397.62396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096397.62441: stderr chunk (state=3): >>><<< 24134 1727096397.62452: stdout chunk (state=3): >>><<< 24134 1727096397.62675: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096397.62686: _low_level_execute_command(): starting 24134 1727096397.62689: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117 `" && echo ansible-tmp-1727096397.6254888-24254-93210279844117="` echo /root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117 `" ) && sleep 0' 24134 1727096397.63742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096397.63754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096397.63766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096397.63846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096397.63994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096397.64083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096397.66292: stdout chunk (state=3): >>>ansible-tmp-1727096397.6254888-24254-93210279844117=/root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117 <<< 24134 1727096397.66304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096397.66354: stderr chunk (state=3): >>><<< 24134 1727096397.66357: stdout chunk (state=3): >>><<< 24134 1727096397.66399: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096397.6254888-24254-93210279844117=/root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096397.66440: variable 'ansible_module_compression' from source: unknown 24134 1727096397.66775: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24134 1727096397.66778: variable 'ansible_facts' from source: unknown 24134 1727096397.67105: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/AnsiballZ_setup.py 24134 1727096397.67408: Sending initial data 24134 1727096397.67436: Sent initial data (153 bytes) 24134 1727096397.68800: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096397.68983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096397.69157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096397.69200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096397.69270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096397.70948: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24134 1727096397.70962: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096397.71017: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096397.71102: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpu34gg7nh /root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/AnsiballZ_setup.py <<< 24134 1727096397.71208: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/AnsiballZ_setup.py" <<< 24134 1727096397.71293: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpu34gg7nh" to remote "/root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/AnsiballZ_setup.py" <<< 24134 1727096397.73816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096397.73879: stderr chunk (state=3): >>><<< 24134 1727096397.73890: stdout chunk (state=3): >>><<< 24134 1727096397.73916: done transferring module to remote 24134 1727096397.73935: _low_level_execute_command(): starting 24134 1727096397.73944: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/ /root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/AnsiballZ_setup.py && sleep 0' 24134 1727096397.74625: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096397.74642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096397.74663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096397.74751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096397.74784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096397.74808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096397.74871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096397.75030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096397.77237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096397.77263: stderr chunk (state=3): >>><<< 24134 1727096397.77299: stdout chunk (state=3): >>><<< 24134 1727096397.77407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096397.77411: _low_level_execute_command(): starting 24134 1727096397.77413: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/AnsiballZ_setup.py && sleep 0' 24134 1727096397.78029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096397.78043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096397.78060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096397.78085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096397.78180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096397.78201: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096397.78218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096397.78238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096397.78340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096397.81518: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 24134 1727096397.81717: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 24134 1727096397.81812: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 24134 1727096397.81878: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 24134 1727096397.81892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096397.81922: stdout chunk (state=3): >>>import '_codecs' # <<< 24134 1727096397.81962: stdout chunk (state=3): >>>import 'codecs' # <<< 24134 1727096397.82115: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfbe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 
'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfc1aa50> import '_signal' # <<< 24134 1727096397.82137: stdout chunk (state=3): >>>import '_abc' # <<< 24134 1727096397.82157: stdout chunk (state=3): >>>import 'abc' # <<< 24134 1727096397.82179: stdout chunk (state=3): >>>import 'io' # <<< 24134 1727096397.82331: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 24134 1727096397.82366: stdout chunk (state=3): >>>import '_collections_abc' # <<< 24134 1727096397.82413: stdout chunk (state=3): >>>import 'genericpath' # <<< 24134 1727096397.82434: stdout chunk (state=3): >>>import 'posixpath' # <<< 24134 1727096397.82459: stdout chunk (state=3): >>>import 'os' # <<< 24134 1727096397.82495: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 24134 1727096397.82543: stdout chunk (state=3): >>>Processing user site-packages <<< 24134 1727096397.82550: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 24134 1727096397.82710: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa2dfa0> <<< 24134 1727096397.82738: stdout chunk (state=3): >>>import 'site' # <<< 24134 1727096397.82791: 
stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 24134 1727096397.83477: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa6bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 24134 1727096397.83481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa6bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 24134 1727096397.83528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 24134 1727096397.83531: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 24134 1727096397.83640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096397.83644: 
stdout chunk (state=3): >>>import 'itertools' # <<< 24134 1727096397.83789: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaa3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaa3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa812b0> <<< 24134 1727096397.83899: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa69070> <<< 24134 1727096397.83904: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 24134 1727096397.83910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 24134 1727096397.83928: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 24134 1727096397.83974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 24134 1727096397.83987: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 24134 1727096397.84047: stdout chunk (state=3): >>>import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f91cfac37d0> <<< 24134 1727096397.84082: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfac23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfac0bc0> <<< 24134 1727096397.84184: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaf8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa682f0> <<< 24134 1727096397.84196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfaf8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaf8bf0> <<< 24134 1727096397.84247: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096397.84254: stdout chunk (state=3): >>># extension module 'binascii' executed from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfaf8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa66e10> <<< 24134 1727096397.84449: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaf9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaf9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfafa540> import 'importlib.util' # import 'runpy' # <<< 24134 1727096397.84504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 24134 1727096397.84703: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfb10740> import 'errno' # <<< 24134 1727096397.84706: stdout chunk (state=3): >>># extension module 'zlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfb11e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfb12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfb132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfb12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 24134 1727096397.84730: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096397.84756: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfb13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfb134a0> <<< 24134 1727096397.84791: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfafa4b0> <<< 24134 1727096397.84823: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 24134 1727096397.84842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 24134 1727096397.84866: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 24134 1727096397.85059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf847c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf8707a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf870500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf8707d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 24134 1727096397.85104: stdout 
chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096397.85229: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf871100> <<< 24134 1727096397.85370: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf871af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8709b0> <<< 24134 1727096397.85389: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf845df0> <<< 24134 1727096397.85425: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 24134 1727096397.85496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf872f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf871c40> <<< 24134 1727096397.85682: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfafac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 24134 1727096397.85713: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf89b230> <<< 24134 1727096397.85756: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 24134 1727096397.85780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 24134 1727096397.85892: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8bf5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 24134 1727096397.86233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf920380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 24134 
1727096397.86252: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf922ae0> <<< 24134 1727096397.86348: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf9204a0> <<< 24134 1727096397.86395: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8e1370> <<< 24134 1727096397.86422: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 24134 1727096397.86451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf721430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8be3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf873e00> <<< 24134 1727096397.86750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f91cf8be750> <<< 24134 1727096397.87123: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_5t5fml59/ansible_setup_payload.zip'<<< 24134 1727096397.87143: stdout chunk (state=3): >>> # zipimport: zlib available<<< 24134 1727096397.87293: stdout chunk (state=3): >>> <<< 24134 1727096397.87464: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.87471: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 24134 1727096397.87474: stdout chunk (state=3): >>> <<< 24134 1727096397.87800: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 24134 1727096397.87804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf78b170> import '_typing' # <<< 24134 1727096397.87904: stdout chunk (state=3): >>> <<< 24134 1727096397.88096: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf76a060><<< 24134 1727096397.88117: stdout chunk (state=3): >>> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7691c0><<< 24134 1727096397.88591: stdout chunk (state=3): >>> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 24134 1727096397.90003: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.91200: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 24134 1727096397.91234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf789040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096397.91265: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 24134 1727096397.91286: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 24134 1727096397.91317: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf7bab10> <<< 24134 1727096397.91352: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7ba8a0> <<< 24134 1727096397.91445: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7ba1b0> <<< 24134 1727096397.91504: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7baba0> <<< 24134 1727096397.91680: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf78be00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf7bb860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf7bb9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 24134 1727096397.91684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 24134 1727096397.91706: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7bbef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 24134 1727096397.91740: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf129c10> <<< 24134 1727096397.91826: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf12b830> <<< 24134 1727096397.91830: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 24134 1727096397.91832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 24134 1727096397.91948: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 24134 1727096397.92196: stdout chunk (state=3): >>>import 'shlex' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12d100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 24134 1727096397.92236: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 24134 1727096397.92239: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12fe60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf872e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12e120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 24134 1727096397.92323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 24134 1727096397.92442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f91cf137d10> import '_tokenize' # <<< 24134 1727096397.92877: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf1367e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf136540> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf136ab0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12e630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf17bf50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf17c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf17db50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf17d910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf180110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf17e240> <<< 24134 1727096397.92903: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 24134 1727096397.93012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096397.93016: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 24134 1727096397.93293: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf1838c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf180290> <<< 24134 1727096397.93408: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf184980> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf184920> <<< 24134 1727096397.93601: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf184bf0> <<< 24134 1727096397.93615: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf17c230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf0102f0> <<< 24134 1727096397.93818: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf011880> <<< 24134 1727096397.93856: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf186a80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096397.93876: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf187e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf1866c0> <<< 24134 1727096397.93901: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 24134 1727096397.94170: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096397.94291: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 24134 1727096397.94586: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.94787: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.95605: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.96187: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 24134 1727096397.96214: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 24134 1727096397.96273: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 24134 1727096397.96305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf015a90> <<< 24134 1727096397.96503: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 24134 1727096397.96507: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0167e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8719d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 24134 1727096397.96523: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 24134 1727096397.96665: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.97278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf016810> # zipimport: zlib available <<< 24134 1727096397.97686: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.98318: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.98420: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 
1727096397.98544: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 24134 1727096397.98560: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.98592: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.98633: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 24134 1727096397.98663: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.98745: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.98877: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 24134 1727096397.98903: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096397.98926: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 24134 1727096397.98972: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096397.99023: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 24134 1727096397.99395: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096397.99806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 24134 1727096397.99875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 24134 1727096397.99885: stdout chunk (state=3): >>>import '_ast' # <<< 24134 1727096397.99988: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf017b00> # zipimport: zlib available <<< 24134 1727096398.00213: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 24134 1727096398.00238: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.arg_spec' # <<< 24134 1727096398.00250: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.00288: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.00345: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 24134 1727096398.00363: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.00412: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.00463: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.00547: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.00639: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 24134 1727096398.00715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096398.00884: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf022420> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf01dd30> <<< 24134 1727096398.00931: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 24134 1727096398.00934: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.01019: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.01109: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.01216: stdout chunk (state=3): >>># zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096398.01220: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 24134 1727096398.01263: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 24134 1727096398.01266: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 24134 1727096398.01385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 24134 1727096398.01389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 24134 1727096398.01535: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf10ac60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7e6930> <<< 24134 1727096398.01648: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0224b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf185b80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 24134 1727096398.01682: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.01696: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.01728: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 24134 1727096398.01738: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.sys_info' # <<< 24134 1727096398.01802: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 24134 1727096398.01914: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 24134 1727096398.01926: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02007: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02032: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02048: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02108: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02154: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02204: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 24134 1727096398.02264: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02366: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02473: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02546: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # <<< 24134 1727096398.02561: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.02821: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.03085: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.03210: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096398.03255: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 24134 1727096398.03258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 24134 1727096398.03276: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 24134 1727096398.03303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 24134 1727096398.03345: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0b2990> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 24134 1727096398.03386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 24134 1727096398.03438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 24134 1727096398.03483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cec803e0> <<< 24134 1727096398.03538: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cec807d0> <<< 24134 1727096398.03624: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf09c500> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0b3530> <<< 24134 1727096398.03656: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0b1070> <<< 24134 1727096398.03682: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0b0cb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 24134 1727096398.03758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 24134 1727096398.03782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 24134 1727096398.03815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 24134 1727096398.03853: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cec83680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cec82f30> <<< 24134 1727096398.03896: stdout chunk (state=3): >>># extension module '_queue' loaded from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cec83110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cec82360> <<< 24134 1727096398.04006: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 24134 1727096398.04097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cec83770> <<< 24134 1727096398.04124: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 24134 1727096398.04161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 24134 1727096398.04210: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096398.04216: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cecce270> <<< 24134 1727096398.04272: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91ceccc290> <<< 24134 1727096398.04280: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0b0da0> import 
'ansible.module_utils.facts.timeout' # <<< 24134 1727096398.04306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 24134 1727096398.04326: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.04353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 24134 1727096398.04481: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 24134 1727096398.04514: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.04573: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.04649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 24134 1727096398.04683: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 24134 1727096398.04696: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.04721: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.04749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 24134 1727096398.04824: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.04895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 24134 1727096398.04898: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.04943: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.05005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 24134 1727096398.05022: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.05089: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.05158: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.05322: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 24134 1727096398.06112: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.06789: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 24134 1727096398.06793: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.06854: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.07122: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 24134 1727096398.07207: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.07257: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 24134 1727096398.07260: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.07295: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.07339: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 24134 1727096398.07416: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 24134 1727096398.07420: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.07526: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.07650: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 24134 1727096398.07711: stdout chunk (state=3): >>>import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f91ceccf980> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 24134 1727096398.07738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 24134 1727096398.07928: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91ceccf020> import 'ansible.module_utils.facts.system.local' # <<< 24134 1727096398.08019: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.08032: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.08120: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 24134 1727096398.08137: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.08256: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.08398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 24134 1727096398.08591: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # <<< 24134 1727096398.08608: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.08653: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.08714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 24134 1727096398.08783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 24134 1727096398.08874: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096398.08964: stdout chunk (state=3): >>># extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91ced0e660> <<< 24134 1727096398.09279: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cecff260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 24134 1727096398.09359: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.09440: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 24134 1727096398.09678: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.09837: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.10063: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 24134 1727096398.10079: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.10111: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.10175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 24134 1727096398.10229: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.10297: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 24134 1727096398.10316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 24134 1727096398.10358: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096398.10378: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91ced22270> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cecff650> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 24134 1727096398.10396: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 24134 1727096398.10456: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.10522: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 24134 1727096398.10529: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.10746: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.10978: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 24134 1727096398.11003: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.11129: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.11317: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.11335: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.11387: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 24134 1727096398.11398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 24134 1727096398.11448: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.11652: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.11851: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 24134 1727096398.12007: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.12031: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.12222: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 24134 1727096398.12265: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.12313: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.13170: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.13975: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 24134 1727096398.13994: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.14130: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.14295: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 24134 1727096398.14298: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.14433: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.14582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 24134 1727096398.14595: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.14815: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.15053: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 24134 1727096398.15080: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 24134 1727096398.15134: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.15196: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 24134 1727096398.15335: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.15478: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.15791: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 
1727096398.16107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 24134 1727096398.16160: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.16217: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 24134 1727096398.16223: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.16237: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.16271: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 24134 1727096398.16373: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.16477: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 24134 1727096398.16490: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.16525: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.16546: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 24134 1727096398.16629: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.16703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 24134 1727096398.16786: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.16872: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 24134 1727096398.17326: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.17687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 24134 1727096398.17806: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.17848: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.iscsi' # <<< 24134 1727096398.17930: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.17961: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 24134 1727096398.18084: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.18107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.18163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 24134 1727096398.18306: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.18343: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 24134 1727096398.18409: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 24134 1727096398.18454: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.18520: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 24134 1727096398.18532: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.18566: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096398.18633: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.18687: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.18893: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 24134 1727096398.18904: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 24134 1727096398.18974: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 24134 1727096398.19047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 24134 1727096398.19353: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.19657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 24134 1727096398.19667: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.19724: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.19793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 24134 1727096398.19816: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.19857: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.19933: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 24134 1727096398.20775: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 24134 1727096398.21455: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 24134 1727096398.21459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 24134 1727096398.21479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 24134 1727096398.21504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 24134 1727096398.21539: stdout chunk (state=3): >>># 
extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91ceb23500> <<< 24134 1727096398.21633: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91ceb22060> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91ceb20950> <<< 24134 1727096398.22744: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, 
"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "59", "second": "58", "epoch": "1727096398", "epoch_int": "1727096398", "date": "2024-09-23", "time": "08:59:58", "iso8601_micro": "2024-09-23T12:59:58.219628Z", "iso8601": "2024-09-23T12:59:58Z", "iso8601_basic": "20240923T085958219628", "iso8601_basic_short": "20240923T085958", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr<<< 24134 1727096398.22774: stdout chunk (state=3): >>>/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24134 1727096398.23585: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear 
sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type <<< 24134 1727096398.23644: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 24134 1727096398.23758: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] 
removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder 
# cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache <<< 24134 1727096398.24014: stdout chunk (state=3): >>># cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] 
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # <<< 24134 1727096398.24042: stdout chunk (state=3): >>>destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing 
ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing 
ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy 
ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 24134 1727096398.24325: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 24134 1727096398.24371: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 24134 1727096398.24375: stdout chunk (state=3): >>># destroy _bz2 <<< 24134 1727096398.24498: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 <<< 24134 1727096398.24514: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 24134 1727096398.24673: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy 
zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 24134 1727096398.24679: stdout chunk (state=3): >>># destroy _pickle <<< 24134 1727096398.24684: stdout chunk (state=3): >>># destroy queue <<< 24134 1727096398.24687: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 24134 1727096398.24692: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 24134 1727096398.24706: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 24134 1727096398.24860: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 24134 1727096398.24866: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 24134 
1727096398.24887: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 24134 1727096398.25099: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # 
cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 24134 1727096398.25114: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 24134 1727096398.25224: stdout chunk (state=3): >>># destroy sys.monitoring <<< 24134 1727096398.25284: stdout chunk (state=3): >>># destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 24134 1727096398.25295: stdout chunk (state=3): >>># destroy tokenize <<< 24134 1727096398.25321: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 24134 1727096398.25387: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 24134 1727096398.25409: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 24134 1727096398.25615: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # 
destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib <<< 24134 1727096398.25640: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc <<< 24134 1727096398.25774: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 24134 1727096398.26216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096398.26219: stdout chunk (state=3): >>><<< 24134 1727096398.26228: stderr chunk (state=3): >>><<< 24134 1727096398.26602: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfbe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfc1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa6bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa6bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaa3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaa3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa69070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfac37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfac23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfac0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaf8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfaf8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaf8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfaf8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfa66e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaf9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfaf9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfafa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfb10740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfb11e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfb12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfb132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfb12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cfb13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfb134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfafa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf847c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf8707a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf870500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf8707d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf871100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf871af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8709b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf845df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf872f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf871c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cfafac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f91cf89b230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8bf5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf920380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf922ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf9204a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8e1370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f91cf721430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8be3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf873e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f91cf8be750> # zipimport: found 103 names in '/tmp/ansible_setup_payload_5t5fml59/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf78b170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf76a060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7691c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf789040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf7bab10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7ba8a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7ba1b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7baba0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf78be00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf7bb860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf7bb9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7bbef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf129c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf12b830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12d100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12fe60> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf872e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12e120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf137d10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf1367e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf136540> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf136ab0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf12e630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf17bf50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf17c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf17db50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf17d910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf180110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf17e240> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf1838c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf180290> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf184980> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf184920> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf184bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf17c230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf0102f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf011880> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf186a80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf187e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf1866c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf015a90> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0167e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf8719d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf016810> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf017b00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cf022420> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf01dd30> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf10ac60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf7e6930> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0224b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf185b80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0b2990> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cec803e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cec807d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf09c500> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f91cf0b3530> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0b1070> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0b0cb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cec83680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cec82f30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cec83110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cec82360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cec83770> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91cecce270> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91ceccc290> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cf0b0da0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91ceccf980> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91ceccf020> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91ced0e660> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cecff260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91ced22270> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91cecff650> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f91ceb23500> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91ceb22060> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f91ceb20950>
{"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public":
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "59", "second": 
"58", "epoch": "1727096398", "epoch_int": "1727096398", "date": "2024-09-23", "time": "08:59:58", "iso8601_micro": "2024-09-23T12:59:58.219628Z", "iso8601": "2024-09-23T12:59:58Z", "iso8601_basic": "20240923T085958219628", "iso8601_basic_short": "20240923T085958", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", 
"gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools 
# cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy 
ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] 
removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other 
# destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy 
_signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] 
wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # 
destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # 
cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # 
cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text 
# destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing 
ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing 
ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy 
ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy 
zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] 
wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # 
cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 24134 1727096398.28722: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', 
'9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096398.28726: _low_level_execute_command(): starting 24134 1727096398.28728: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096397.6254888-24254-93210279844117/ > /dev/null 2>&1 && sleep 0' 24134 1727096398.28730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096398.28732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096398.28734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096398.28737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096398.28739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096398.28741: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096398.28743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096398.28745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096398.28747: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096398.28749: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24134 1727096398.28751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096398.28753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096398.28755: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096398.28757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096398.28759: stderr chunk (state=3): >>>debug2: match found <<< 24134 1727096398.28761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096398.28762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096398.28764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096398.28766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096398.28773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096398.31106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096398.31110: stderr chunk (state=3): >>><<< 24134 1727096398.31112: stdout chunk (state=3): >>><<< 24134 1727096398.31114: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096398.31117: handler run complete 24134 1727096398.31119: variable 'ansible_facts' from source: unknown 24134 1727096398.31121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096398.31397: variable 'ansible_facts' from source: unknown 24134 1727096398.31457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096398.31507: attempt loop complete, returning result 24134 1727096398.31510: _execute() done 24134 1727096398.31513: dumping result to json 24134 1727096398.31525: done dumping result, returning 24134 1727096398.31539: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-1673-d3fc-0000000000b4] 24134 1727096398.31541: sending task result for task 0afff68d-5257-1673-d3fc-0000000000b4 ok: [managed_node1] 24134 1727096398.32137: no more pending results, returning what we have 24134 1727096398.32139: results queue empty 24134 1727096398.32140: checking for any_errors_fatal 24134 1727096398.32142: done checking for any_errors_fatal 24134 1727096398.32142: checking for max_fail_percentage 24134 1727096398.32144: done checking for max_fail_percentage 24134 1727096398.32145: checking to see if all hosts have failed and the running result is not ok 24134 1727096398.32145: done checking to see if all hosts have failed 24134 1727096398.32146: getting the remaining hosts for this loop 24134 1727096398.32147: done getting the remaining hosts for this loop 24134 1727096398.32151: getting the next task for host 
managed_node1 24134 1727096398.32161: done getting next task for host managed_node1 24134 1727096398.32163: ^ task is: TASK: Check if system is ostree 24134 1727096398.32166: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096398.32173: getting variables 24134 1727096398.32175: in VariableManager get_vars() 24134 1727096398.32202: Calling all_inventory to load vars for managed_node1 24134 1727096398.32204: Calling groups_inventory to load vars for managed_node1 24134 1727096398.32207: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096398.32217: Calling all_plugins_play to load vars for managed_node1 24134 1727096398.32219: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096398.32221: Calling groups_plugins_play to load vars for managed_node1 24134 1727096398.32693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096398.33105: done with get_vars() 24134 1727096398.33116: done getting variables 24134 1727096398.33221: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000b4 24134 1727096398.33224: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 08:59:58 -0400 (0:00:00.797) 0:00:02.546 ****** 24134 
1727096398.33309: entering _queue_task() for managed_node1/stat 24134 1727096398.33951: worker is 1 (out of 1 available) 24134 1727096398.33964: exiting _queue_task() for managed_node1/stat 24134 1727096398.33978: done queuing things up, now waiting for results queue to drain 24134 1727096398.33980: waiting for pending results... 24134 1727096398.34345: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 24134 1727096398.34674: in run() - task 0afff68d-5257-1673-d3fc-0000000000b6 24134 1727096398.34679: variable 'ansible_search_path' from source: unknown 24134 1727096398.34681: variable 'ansible_search_path' from source: unknown 24134 1727096398.34688: calling self._execute() 24134 1727096398.34743: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096398.35175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096398.35179: variable 'omit' from source: magic vars 24134 1727096398.35973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096398.36303: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096398.36357: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096398.36773: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096398.36776: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096398.36778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096398.37173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 24134 1727096398.37178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096398.37180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096398.37383: Evaluated conditional (not __network_is_ostree is defined): True 24134 1727096398.37394: variable 'omit' from source: magic vars 24134 1727096398.37434: variable 'omit' from source: magic vars 24134 1727096398.37479: variable 'omit' from source: magic vars 24134 1727096398.37510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096398.37543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096398.37566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096398.37588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096398.37784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096398.37823: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096398.37864: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096398.37875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096398.37976: Set connection var ansible_shell_executable to /bin/sh 24134 1727096398.37987: Set connection var ansible_pipelining to False 24134 1727096398.37996: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096398.38009: Set 
connection var ansible_timeout to 10 24134 1727096398.38373: Set connection var ansible_connection to ssh 24134 1727096398.38376: Set connection var ansible_shell_type to sh 24134 1727096398.38378: variable 'ansible_shell_executable' from source: unknown 24134 1727096398.38381: variable 'ansible_connection' from source: unknown 24134 1727096398.38383: variable 'ansible_module_compression' from source: unknown 24134 1727096398.38385: variable 'ansible_shell_type' from source: unknown 24134 1727096398.38387: variable 'ansible_shell_executable' from source: unknown 24134 1727096398.38389: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096398.38390: variable 'ansible_pipelining' from source: unknown 24134 1727096398.38392: variable 'ansible_timeout' from source: unknown 24134 1727096398.38394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096398.38396: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096398.38399: variable 'omit' from source: magic vars 24134 1727096398.38580: starting attempt loop 24134 1727096398.38590: running the handler 24134 1727096398.38607: _low_level_execute_command(): starting 24134 1727096398.38618: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096398.39814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096398.39830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096398.39846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096398.40174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096398.40196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096398.40295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096398.41983: stdout chunk (state=3): >>>/root <<< 24134 1727096398.42228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096398.42239: stdout chunk (state=3): >>><<< 24134 1727096398.42254: stderr chunk (state=3): >>><<< 24134 1727096398.42283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096398.42310: _low_level_execute_command(): starting 24134 1727096398.42321: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699 `" && echo ansible-tmp-1727096398.422966-24297-24379617020699="` echo /root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699 `" ) && sleep 0' 24134 1727096398.43494: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096398.43771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096398.43791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096398.43808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096398.43912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096398.45919: stdout chunk (state=3): >>>ansible-tmp-1727096398.422966-24297-24379617020699=/root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699 <<< 24134 1727096398.46027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096398.46080: stderr chunk (state=3): >>><<< 24134 1727096398.46089: stdout chunk (state=3): >>><<< 24134 1727096398.46111: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096398.422966-24297-24379617020699=/root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096398.46322: variable 'ansible_module_compression' from source: unknown 24134 1727096398.46384: ANSIBALLZ: Using lock for stat 24134 1727096398.46391: ANSIBALLZ: Acquiring lock 24134 1727096398.46398: ANSIBALLZ: Lock acquired: 140085163808080 24134 1727096398.46405: ANSIBALLZ: Creating module 24134 1727096398.73170: ANSIBALLZ: Writing module into payload 24134 1727096398.73282: ANSIBALLZ: Writing module 24134 1727096398.73300: ANSIBALLZ: Renaming module 24134 1727096398.73306: ANSIBALLZ: Done creating module 24134 1727096398.73323: variable 'ansible_facts' from source: unknown 24134 1727096398.73419: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/AnsiballZ_stat.py 24134 1727096398.73598: Sending initial data 24134 1727096398.73607: Sent initial data (151 bytes) 24134 1727096398.74224: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096398.74283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096398.74354: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096398.74381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096398.74399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096398.74501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096398.76358: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096398.76454: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096398.76494: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpa6crch2i /root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/AnsiballZ_stat.py <<< 24134 1727096398.76503: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/AnsiballZ_stat.py" <<< 24134 1727096398.76827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpa6crch2i" to remote "/root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/AnsiballZ_stat.py" <<< 24134 1727096398.77677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096398.77750: stderr chunk (state=3): >>><<< 24134 1727096398.77762: stdout chunk (state=3): >>><<< 24134 1727096398.77800: done transferring module to remote 24134 1727096398.77831: _low_level_execute_command(): starting 24134 1727096398.77840: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/ /root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/AnsiballZ_stat.py && sleep 0' 24134 1727096398.78518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096398.78596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096398.78646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096398.78665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096398.78723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096398.78888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096398.81737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096398.81741: stdout chunk (state=3): >>><<< 24134 1727096398.81743: stderr chunk (state=3): >>><<< 24134 1727096398.81745: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096398.81747: _low_level_execute_command(): starting 24134 1727096398.81749: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/AnsiballZ_stat.py && sleep 0' 24134 1727096398.82881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096398.82901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096398.82956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096398.82979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096398.83001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096398.83180: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096398.86623: stdout chunk (state=3): >>>import _frozen_importlib # frozen<<< 24134 1727096398.86657: stdout chunk (state=3): >>> import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 24134 1727096398.86700: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 24134 1727096398.86736: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 24134 1727096398.86761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 24134 1727096398.86805: stdout chunk (state=3): >>>import 'codecs' # <<< 24134 1727096398.86842: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 24134 1727096398.86883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 24134 1727096398.86888: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b86184d0> <<< 24134 1727096398.86909: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b85e7b30> <<< 24134 1727096398.87024: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b861aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 24134 
1727096398.87047: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 24134 1727096398.87158: stdout chunk (state=3): >>>import '_collections_abc' # <<< 24134 1727096398.87192: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 24134 1727096398.87257: stdout chunk (state=3): >>>import 'os' # <<< 24134 1727096398.87263: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 24134 1727096398.87290: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 24134 1727096398.87417: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b842d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 24134 1727096398.87442: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b842dfa0> <<< 24134 1727096398.87471: stdout chunk (state=3): >>>import 'site' # <<< 24134 1727096398.87509: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 24134 1727096398.87853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 24134 1727096398.87863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 24134 1727096398.87897: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 24134 1727096398.87910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096398.87928: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 24134 1727096398.87982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 24134 1727096398.88127: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b846bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b846bf80> <<< 24134 1727096398.88147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 24134 1727096398.88172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 24134 1727096398.88194: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 24134 1727096398.88284: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096398.88287: stdout chunk (state=3): >>>import 'itertools' # <<< 24134 1727096398.88339: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84a3830> <<< 24134 1727096398.88352: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84a3ec0> <<< 24134 1727096398.88609: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8483b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8469070> <<< 24134 1727096398.88629: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 24134 1727096398.88671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 24134 1727096398.88700: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 24134 1727096398.88744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 24134 1727096398.88756: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py 
# code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 24134 1727096398.88810: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84c37d0> <<< 24134 1727096398.88831: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84c23f0> <<< 24134 1727096398.88856: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84c0bc0> <<< 24134 1727096398.88941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 24134 1727096398.88964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 24134 1727096398.89009: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b84f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84f8bf0> <<< 24134 1727096398.89061: stdout chunk (state=3): >>># 
extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b84f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8466e10> <<< 24134 1727096398.89126: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096398.89223: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84fa540> <<< 24134 1727096398.89262: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 24134 1727096398.89282: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 24134 1727096398.89346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 24134 1727096398.89373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8510740> <<< 24134 1727096398.89388: stdout chunk (state=3): >>>import 'errno' # <<< 24134 1727096398.89429: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096398.89440: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b8511e20> <<< 24134 1727096398.89503: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8512cc0> <<< 24134 1727096398.89557: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b85132f0> <<< 24134 1727096398.89601: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8512210> <<< 24134 1727096398.89604: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 24134 1727096398.89822: stdout chunk (state=3): >>># extension module '_lzma' loaded from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b8513d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b85134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 24134 1727096398.89836: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b8297c50> <<< 24134 1727096398.89860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 24134 1727096398.89888: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c0500> <<< 24134 1727096398.89931: stdout chunk (state=3): >>># extension module '_random' loaded from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096398.89933: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c07d0> <<< 24134 1727096398.89964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 24134 1727096398.89966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 24134 1727096398.90056: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096398.90237: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c1100> <<< 24134 1727096398.90421: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c1af0> <<< 24134 1727096398.90432: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c09b0> <<< 24134 1727096398.90448: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8295df0> <<< 24134 1727096398.90480: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 24134 1727096398.90500: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 24134 1727096398.90706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c2f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c1c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 24134 1727096398.90757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 24134 1727096398.90795: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82e7230> <<< 24134 1727096398.90856: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 24134 1727096398.90875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096398.90909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 24134 1727096398.90922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 24134 1727096398.90979: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b830f620> <<< 24134 1727096398.91001: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 24134 1727096398.91060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 24134 1727096398.91133: stdout chunk (state=3): >>>import 'ntpath' # <<< 24134 1727096398.91159: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8370380> <<< 24134 1727096398.91185: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 24134 1727096398.91221: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 24134 1727096398.91253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 24134 1727096398.91314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 24134 1727096398.91623: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8372ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b83704a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b83313a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8175430> <<< 24134 1727096398.91653: stdout chunk (state=3): >>>import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f56b830e420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c3e00> <<< 24134 1727096398.91826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 24134 1727096398.91841: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f56b830e780> <<< 24134 1727096398.92060: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_ov915s7r/ansible_stat_payload.zip' # zipimport: zlib available <<< 24134 1727096398.92274: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.92296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 24134 1727096398.92316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 24134 1727096398.92365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 24134 1727096398.92471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 24134 1727096398.92507: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 24134 1727096398.92517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81cb1d0> <<< 24134 1727096398.92525: stdout chunk (state=3): >>>import '_typing' # <<< 24134 1727096398.92798: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81aa0c0> <<< 24134 1727096398.92803: stdout chunk (state=3): >>>import 'pkgutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f56b81a9220> <<< 24134 1727096398.93002: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 24134 1727096398.95095: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096398.97122: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81c8f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b81f29f0> <<< 24134 1727096398.97127: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81f27b0> <<< 24134 1727096398.97186: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81f20f0> <<< 24134 1727096398.97200: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 24134 1727096398.97204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 24134 1727096398.97234: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81f2540> <<< 24134 1727096398.97259: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81cbe60> import 'atexit' # <<< 24134 1727096398.97416: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b81f3740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b81f38c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 24134 1727096398.97474: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81f3cb0> import 'pwd' # <<< 24134 1727096398.97710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b15af0> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b17710> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b180e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 24134 1727096398.97752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 24134 1727096398.97770: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b19250> <<< 24134 1727096398.97794: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 24134 1727096398.98000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b1bd40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c2e70> <<< 24134 1727096398.98018: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f56b7b1a030> <<< 24134 1727096398.98043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 24134 1727096398.98072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 24134 1727096398.98100: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 24134 1727096398.98123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 24134 1727096398.98159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 24134 1727096398.98193: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 24134 1727096398.98311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b23b30> import '_tokenize' # <<< 24134 1727096398.98333: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b22600> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b22360> <<< 24134 1727096398.98403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 24134 1727096398.98555: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b228d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b1a540> # extension module 'syslog' loaded 
from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b6bd70> <<< 24134 1727096398.98578: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6bdd0> <<< 24134 1727096398.98593: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 24134 1727096398.98710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b6d910> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6d6d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 24134 1727096398.98864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 24134 1727096398.98933: stdout chunk (state=3): >>># extension module '_uuid' loaded from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b6fe60> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6e000> <<< 24134 1727096398.98966: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 24134 1727096398.99017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096398.99070: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 24134 1727096398.99085: stdout chunk (state=3): >>>import '_string' # <<< 24134 1727096398.99293: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b73620> <<< 24134 1727096398.99323: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6ffb0> <<< 24134 1727096398.99407: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b74680> <<< 24134 1727096398.99516: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b74620> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b74950> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6bef0> <<< 24134 1727096398.99711: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b77fe0> <<< 24134 1727096398.99861: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096398.99888: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7a015b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b76810> <<< 24134 1727096398.99917: 
stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 24134 1727096399.00061: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b77b90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b76450> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 24134 1727096399.00111: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.00241: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.00382: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 24134 1727096399.00386: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 24134 1727096399.00408: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.00708: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24134 1727096399.01565: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.02452: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 24134 1727096399.02487: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 24134 1727096399.02511: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 24134 1727096399.02524: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096399.02591: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7a05730> <<< 24134 1727096399.02705: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 24134 1727096399.02729: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a06420> <<< 24134 1727096399.02753: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a019a0> <<< 24134 1727096399.02804: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 24134 1727096399.02845: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.02848: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.02883: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 24134 1727096399.03103: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.03449: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 24134 1727096399.03463: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a06330> <<< 24134 1727096399.03713: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.04141: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 24134 1727096399.04855: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.04962: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.05061: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 24134 1727096399.05084: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.05123: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.05165: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 24134 1727096399.05201: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.05279: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.05391: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 24134 1727096399.05608: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 24134 1727096399.05886: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.06317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 24134 1727096399.06602: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a07590> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 24134 1727096399.06605: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.06672: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 
1727096399.06676: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 24134 1727096399.06797: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.06800: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.06870: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.06924: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 24134 1727096399.07002: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7a120f0> <<< 24134 1727096399.07439: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a0cdd0> <<< 24134 1727096399.07442: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 24134 1727096399.07445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 24134 1727096399.07447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 24134 1727096399.07490: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b02900> <<< 24134 1727096399.07550: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b822e5d0> <<< 24134 1727096399.07870: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a11eb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a087d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 24134 1727096399.08015: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.08345: stdout chunk (state=3): >>># zipimport: zlib available <<< 24134 1727096399.08486: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 24134 1727096399.08507: stdout chunk (state=3): >>># destroy __main__ <<< 24134 1727096399.09227: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear 
sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg 
# cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc <<< 24134 1727096399.09240: stdout chunk (state=3): >>># cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] 
removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves <<< 24134 1727096399.09374: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # 
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters <<< 24134 1727096399.09498: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro 
# cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 24134 1727096399.09639: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 24134 1727096399.09664: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 24134 1727096399.09699: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 24134 1727096399.09725: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 24134 1727096399.09814: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 24134 1727096399.10060: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping 
ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader <<< 24134 1727096399.10074: stdout chunk (state=3): >>># cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping 
_weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 24134 1727096399.10221: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 24134 1727096399.10245: stdout chunk (state=3): >>># destroy _collections <<< 24134 1727096399.10365: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 24134 1727096399.10574: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 24134 1727096399.10578: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 24134 1727096399.10591: stdout chunk (state=3): >>># destroy itertools <<< 24134 1727096399.10679: stdout chunk 
(state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 24134 1727096399.11121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096399.11125: stdout chunk (state=3): >>><<< 24134 1727096399.11127: stderr chunk (state=3): >>><<< 24134 1727096399.11681: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b86184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b85e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b861aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding 
directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b842d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b842dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b846bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b846bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f56b84a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8483b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8469070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84c0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b84f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b84f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8466e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84fa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8510740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b8511e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8512cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b85132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8512210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b8513d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b85134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b8297c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c07d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c1100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c1af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c09b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8295df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c2f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c1c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b84fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f56b82e7230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b830f620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8370380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b8372ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b83704a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b83313a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f56b8175430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b830e420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b82c3e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f56b830e780> # zipimport: found 30 names in '/tmp/ansible_stat_payload_ov915s7r/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81cb1d0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81aa0c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81a9220> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81c8f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b81f29f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81f27b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81f20f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81f2540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81cbe60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b81f3740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b81f38c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b81f3cb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b15af0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b17710> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b180e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b19250> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b1bd40> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b82c2e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b1a030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b23b30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b22600> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b22360> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b228d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b1a540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b6bd70> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6bdd0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b6d910> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6d6d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b6fe60> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6e000> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b73620> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6ffb0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b74680> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b74620> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b74950> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b6bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b77fe0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7a015b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b76810> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7b77b90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b76450> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7a05730> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a06420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a019a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a06330> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a07590> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56b7a120f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a0cdd0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7b02900> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b822e5d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a11eb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56b7a087d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ … # clear sys.audit hooks 24134 1727096399.13500: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], 
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096399.13688: _low_level_execute_command(): starting 24134 1727096399.13691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096398.422966-24297-24379617020699/ > /dev/null 2>&1 && sleep 0' 24134 1727096399.13693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096399.13696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096399.13698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096399.13700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096399.15075: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 24134 1727096399.15111: stderr chunk (state=3): >>><<< 24134 1727096399.15114: stdout chunk (state=3): >>><<< 24134 1727096399.15358: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096399.15362: handler run complete 24134 1727096399.15364: attempt loop complete, returning result 24134 1727096399.15366: _execute() done 24134 1727096399.15371: dumping result to json 24134 1727096399.15374: done dumping result, returning 24134 1727096399.15376: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0afff68d-5257-1673-d3fc-0000000000b6] 24134 1727096399.15379: sending task result for task 0afff68d-5257-1673-d3fc-0000000000b6 24134 1727096399.15445: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000b6 24134 
1727096399.15448: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 24134 1727096399.15520: no more pending results, returning what we have 24134 1727096399.15523: results queue empty 24134 1727096399.15525: checking for any_errors_fatal 24134 1727096399.15533: done checking for any_errors_fatal 24134 1727096399.15534: checking for max_fail_percentage 24134 1727096399.15535: done checking for max_fail_percentage 24134 1727096399.15536: checking to see if all hosts have failed and the running result is not ok 24134 1727096399.15537: done checking to see if all hosts have failed 24134 1727096399.15538: getting the remaining hosts for this loop 24134 1727096399.15540: done getting the remaining hosts for this loop 24134 1727096399.15544: getting the next task for host managed_node1 24134 1727096399.15550: done getting next task for host managed_node1 24134 1727096399.15553: ^ task is: TASK: Set flag to indicate system is ostree 24134 1727096399.15556: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.15560: getting variables 24134 1727096399.15562: in VariableManager get_vars() 24134 1727096399.15597: Calling all_inventory to load vars for managed_node1 24134 1727096399.15600: Calling groups_inventory to load vars for managed_node1 24134 1727096399.15603: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.15613: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.15616: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.15618: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.16212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.16797: done with get_vars() 24134 1727096399.16809: done getting variables 24134 1727096399.17105: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 08:59:59 -0400 (0:00:00.838) 0:00:03.384 ****** 24134 1727096399.17134: entering _queue_task() for managed_node1/set_fact 24134 1727096399.17135: Creating lock for set_fact 24134 1727096399.17541: worker is 1 (out of 1 available) 24134 1727096399.17553: exiting _queue_task() for managed_node1/set_fact 24134 1727096399.17566: done queuing things up, now waiting for results queue to drain 24134 1727096399.17771: waiting for pending results... 
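
Annotation: the trace above (a `stat` on `/run/ostree-booted` returning `exists: false`, followed by a `set_fact` guarded by `not __network_is_ostree is defined`) implies an ostree-detection task pair roughly like the following sketch. The task names, the `/run/ostree-booted` path, the `__ostree_booted_stat` variable, and the evaluated conditional all appear in the log; the exact contents of `el_repo_setup.yml` are assumed.

```yaml
# Hedged reconstruction from the trace; not the verbatim file contents.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

This matches the result shown in the log: with `stat.exists` false, the fact `__network_is_ostree` is set to `false`.
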
24134 1727096399.17897: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 24134 1727096399.18207: in run() - task 0afff68d-5257-1673-d3fc-0000000000b7 24134 1727096399.18225: variable 'ansible_search_path' from source: unknown 24134 1727096399.18289: variable 'ansible_search_path' from source: unknown 24134 1727096399.18332: calling self._execute() 24134 1727096399.18473: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.18535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.18548: variable 'omit' from source: magic vars 24134 1727096399.19726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096399.20344: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096399.20423: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096399.20675: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096399.20679: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096399.20752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096399.20840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096399.20957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096399.20996: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096399.21250: Evaluated conditional (not __network_is_ostree is defined): True 24134 1727096399.21263: variable 'omit' from source: magic vars 24134 1727096399.21338: variable 'omit' from source: magic vars 24134 1727096399.21488: variable '__ostree_booted_stat' from source: set_fact 24134 1727096399.21544: variable 'omit' from source: magic vars 24134 1727096399.21586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096399.21619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096399.21643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096399.21665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096399.21691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096399.21726: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096399.21735: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.21790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.21862: Set connection var ansible_shell_executable to /bin/sh 24134 1727096399.21877: Set connection var ansible_pipelining to False 24134 1727096399.21888: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096399.21908: Set connection var ansible_timeout to 10 24134 1727096399.21917: Set connection var ansible_connection to ssh 24134 1727096399.21924: Set connection var ansible_shell_type to sh 24134 1727096399.21950: variable 'ansible_shell_executable' 
from source: unknown 24134 1727096399.21961: variable 'ansible_connection' from source: unknown 24134 1727096399.22275: variable 'ansible_module_compression' from source: unknown 24134 1727096399.22279: variable 'ansible_shell_type' from source: unknown 24134 1727096399.22281: variable 'ansible_shell_executable' from source: unknown 24134 1727096399.22283: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.22285: variable 'ansible_pipelining' from source: unknown 24134 1727096399.22287: variable 'ansible_timeout' from source: unknown 24134 1727096399.22289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.22299: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096399.22313: variable 'omit' from source: magic vars 24134 1727096399.22322: starting attempt loop 24134 1727096399.22328: running the handler 24134 1727096399.22373: handler run complete 24134 1727096399.22376: attempt loop complete, returning result 24134 1727096399.22378: _execute() done 24134 1727096399.22381: dumping result to json 24134 1727096399.22384: done dumping result, returning 24134 1727096399.22387: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0afff68d-5257-1673-d3fc-0000000000b7] 24134 1727096399.22390: sending task result for task 0afff68d-5257-1673-d3fc-0000000000b7 ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 24134 1727096399.22554: no more pending results, returning what we have 24134 1727096399.22558: results queue empty 24134 1727096399.22559: checking for any_errors_fatal 24134 1727096399.22564: done checking for any_errors_fatal 24134 
1727096399.22565: checking for max_fail_percentage 24134 1727096399.22566: done checking for max_fail_percentage 24134 1727096399.22571: checking to see if all hosts have failed and the running result is not ok 24134 1727096399.22572: done checking to see if all hosts have failed 24134 1727096399.22573: getting the remaining hosts for this loop 24134 1727096399.22574: done getting the remaining hosts for this loop 24134 1727096399.22579: getting the next task for host managed_node1 24134 1727096399.22587: done getting next task for host managed_node1 24134 1727096399.22591: ^ task is: TASK: Fix CentOS6 Base repo 24134 1727096399.22594: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.22597: getting variables 24134 1727096399.22599: in VariableManager get_vars() 24134 1727096399.22629: Calling all_inventory to load vars for managed_node1 24134 1727096399.22632: Calling groups_inventory to load vars for managed_node1 24134 1727096399.22636: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.22647: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.22649: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.22652: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.23284: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000b7 24134 1727096399.23294: WORKER PROCESS EXITING 24134 1727096399.23316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.23731: done with get_vars() 24134 1727096399.23742: done getting variables 24134 1727096399.23966: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 08:59:59 -0400 (0:00:00.069) 0:00:03.454 ****** 24134 1727096399.24100: entering _queue_task() for managed_node1/copy 24134 1727096399.24537: worker is 1 (out of 1 available) 24134 1727096399.24549: exiting _queue_task() for managed_node1/copy 24134 1727096399.24561: done queuing things up, now waiting for results queue to drain 24134 1727096399.24562: waiting for pending results... 
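
Annotation: the skip that follows reports `false_condition: "ansible_distribution_major_version == '6'"` after `ansible_distribution == 'CentOS'` evaluated True, which implies a guard like the sketch below. The module arguments for the `copy` action are not visible in this trace and are deliberately left out.

```yaml
# Hedged sketch of the conditional guard shown in the trace.
- name: Fix CentOS6 Base repo
  copy: # repo file content/dest not visible in this trace
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

On this CentOS host the major version is not 6, so the `when` chain short-circuits and the task is skipped with "Conditional result was False".
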
24134 1727096399.24825: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 24134 1727096399.24937: in run() - task 0afff68d-5257-1673-d3fc-0000000000b9 24134 1727096399.24961: variable 'ansible_search_path' from source: unknown 24134 1727096399.24970: variable 'ansible_search_path' from source: unknown 24134 1727096399.25011: calling self._execute() 24134 1727096399.25096: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.25107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.25123: variable 'omit' from source: magic vars 24134 1727096399.25601: variable 'ansible_distribution' from source: facts 24134 1727096399.25626: Evaluated conditional (ansible_distribution == 'CentOS'): True 24134 1727096399.25751: variable 'ansible_distribution_major_version' from source: facts 24134 1727096399.25768: Evaluated conditional (ansible_distribution_major_version == '6'): False 24134 1727096399.25776: when evaluation is False, skipping this task 24134 1727096399.25783: _execute() done 24134 1727096399.25791: dumping result to json 24134 1727096399.25797: done dumping result, returning 24134 1727096399.25806: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0afff68d-5257-1673-d3fc-0000000000b9] 24134 1727096399.25820: sending task result for task 0afff68d-5257-1673-d3fc-0000000000b9 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 24134 1727096399.26094: no more pending results, returning what we have 24134 1727096399.26097: results queue empty 24134 1727096399.26098: checking for any_errors_fatal 24134 1727096399.26102: done checking for any_errors_fatal 24134 1727096399.26103: checking for max_fail_percentage 24134 1727096399.26104: done checking for max_fail_percentage 24134 1727096399.26105: checking to see if all hosts have failed and the 
running result is not ok 24134 1727096399.26105: done checking to see if all hosts have failed 24134 1727096399.26106: getting the remaining hosts for this loop 24134 1727096399.26107: done getting the remaining hosts for this loop 24134 1727096399.26110: getting the next task for host managed_node1 24134 1727096399.26116: done getting next task for host managed_node1 24134 1727096399.26118: ^ task is: TASK: Include the task 'enable_epel.yml' 24134 1727096399.26120: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.26123: getting variables 24134 1727096399.26124: in VariableManager get_vars() 24134 1727096399.26146: Calling all_inventory to load vars for managed_node1 24134 1727096399.26149: Calling groups_inventory to load vars for managed_node1 24134 1727096399.26152: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.26160: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.26163: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.26165: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.26336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.26548: done with get_vars() 24134 1727096399.26557: done getting variables 24134 1727096399.26589: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000b9 24134 1727096399.26593: WORKER PROCESS EXITING TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 08:59:59 -0400 (0:00:00.025) 0:00:03.480 ****** 24134 1727096399.26691: entering _queue_task() for managed_node1/include_tasks 24134 1727096399.27514: worker is 1 (out of 1 available) 24134 1727096399.27523: exiting _queue_task() for managed_node1/include_tasks 24134 1727096399.27533: done queuing things up, now waiting for results queue to drain 24134 1727096399.27534: waiting for pending results... 
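
Annotation: the include that runs next evaluates the conditional `not __network_is_ostree | d(false)` to True before loading `enable_epel.yml`, which suggests a task along these lines (the relative include path is assumed; the log shows only the resolved absolute path under the collection tree).

```yaml
# Hedged sketch reconstructed from the evaluated conditional in the trace.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Because the fact was set to `false` earlier, `d(false)` never applies and the negation yields True, so the included file's blocks are appended to the host's task list, as the "extending task lists for all hosts with included blocks" records show.
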
24134 1727096399.27952: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 24134 1727096399.28017: in run() - task 0afff68d-5257-1673-d3fc-0000000000ba 24134 1727096399.28065: variable 'ansible_search_path' from source: unknown 24134 1727096399.28273: variable 'ansible_search_path' from source: unknown 24134 1727096399.28276: calling self._execute() 24134 1727096399.28393: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.28404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.28416: variable 'omit' from source: magic vars 24134 1727096399.29499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096399.34271: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096399.34475: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096399.34479: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096399.34560: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096399.34664: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096399.34861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096399.34901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096399.34930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096399.35006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096399.35092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096399.35319: variable '__network_is_ostree' from source: set_fact 24134 1727096399.35509: Evaluated conditional (not __network_is_ostree | d(false)): True 24134 1727096399.35512: _execute() done 24134 1727096399.35514: dumping result to json 24134 1727096399.35517: done dumping result, returning 24134 1727096399.35519: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-1673-d3fc-0000000000ba] 24134 1727096399.35521: sending task result for task 0afff68d-5257-1673-d3fc-0000000000ba 24134 1727096399.35594: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000ba 24134 1727096399.35597: WORKER PROCESS EXITING 24134 1727096399.35640: no more pending results, returning what we have 24134 1727096399.35645: in VariableManager get_vars() 24134 1727096399.35683: Calling all_inventory to load vars for managed_node1 24134 1727096399.35686: Calling groups_inventory to load vars for managed_node1 24134 1727096399.35690: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.35700: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.35703: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.35706: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.36095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 24134 1727096399.36658: done with get_vars() 24134 1727096399.36671: variable 'ansible_search_path' from source: unknown 24134 1727096399.36672: variable 'ansible_search_path' from source: unknown 24134 1727096399.36708: we have included files to process 24134 1727096399.36710: generating all_blocks data 24134 1727096399.36711: done generating all_blocks data 24134 1727096399.36719: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24134 1727096399.36721: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24134 1727096399.36724: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24134 1727096399.38240: done processing included file 24134 1727096399.38244: iterating over new_blocks loaded from include file 24134 1727096399.38245: in VariableManager get_vars() 24134 1727096399.38257: done with get_vars() 24134 1727096399.38259: filtering new block on tags 24134 1727096399.38286: done filtering new block on tags 24134 1727096399.38289: in VariableManager get_vars() 24134 1727096399.38300: done with get_vars() 24134 1727096399.38301: filtering new block on tags 24134 1727096399.38312: done filtering new block on tags 24134 1727096399.38314: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 24134 1727096399.38319: extending task lists for all hosts with included blocks 24134 1727096399.38623: done extending task lists 24134 1727096399.38624: done processing included files 24134 1727096399.38625: results queue empty 24134 1727096399.38626: checking for any_errors_fatal 24134 1727096399.38630: done checking for any_errors_fatal 24134 1727096399.38630: checking for max_fail_percentage 24134 1727096399.38631: done 
checking for max_fail_percentage 24134 1727096399.38632: checking to see if all hosts have failed and the running result is not ok 24134 1727096399.38633: done checking to see if all hosts have failed 24134 1727096399.38633: getting the remaining hosts for this loop 24134 1727096399.38634: done getting the remaining hosts for this loop 24134 1727096399.38637: getting the next task for host managed_node1 24134 1727096399.38641: done getting next task for host managed_node1 24134 1727096399.38644: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 24134 1727096399.38646: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.38648: getting variables 24134 1727096399.38649: in VariableManager get_vars() 24134 1727096399.38657: Calling all_inventory to load vars for managed_node1 24134 1727096399.38659: Calling groups_inventory to load vars for managed_node1 24134 1727096399.38662: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.38666: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.38678: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.38681: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.39040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.39437: done with get_vars() 24134 1727096399.39447: done getting variables 24134 1727096399.39515: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 24134 1727096399.39916: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 08:59:59 -0400 (0:00:00.132) 0:00:03.613 ****** 24134 1727096399.39961: entering _queue_task() for managed_node1/command 24134 1727096399.39963: Creating lock for command 24134 1727096399.40706: worker is 1 (out of 1 available) 24134 1727096399.40720: exiting _queue_task() for managed_node1/command 24134 1727096399.40734: done queuing things up, now waiting for results queue to drain 24134 1727096399.40735: waiting for pending results... 
24134 1727096399.41482: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 24134 1727096399.41528: in run() - task 0afff68d-5257-1673-d3fc-0000000000d4 24134 1727096399.41875: variable 'ansible_search_path' from source: unknown 24134 1727096399.41879: variable 'ansible_search_path' from source: unknown 24134 1727096399.41882: calling self._execute() 24134 1727096399.41885: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.41888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.41891: variable 'omit' from source: magic vars 24134 1727096399.42926: variable 'ansible_distribution' from source: facts 24134 1727096399.43015: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24134 1727096399.43257: variable 'ansible_distribution_major_version' from source: facts 24134 1727096399.43267: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24134 1727096399.43278: when evaluation is False, skipping this task 24134 1727096399.43284: _execute() done 24134 1727096399.43435: dumping result to json 24134 1727096399.43438: done dumping result, returning 24134 1727096399.43441: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [0afff68d-5257-1673-d3fc-0000000000d4] 24134 1727096399.43444: sending task result for task 0afff68d-5257-1673-d3fc-0000000000d4 24134 1727096399.43519: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000d4 24134 1727096399.43522: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24134 1727096399.43598: no more pending results, returning what we have 24134 1727096399.43601: results queue empty 24134 1727096399.43602: checking for any_errors_fatal 24134 1727096399.43604: done checking for any_errors_fatal 24134 1727096399.43604: checking for 
max_fail_percentage 24134 1727096399.43606: done checking for max_fail_percentage 24134 1727096399.43606: checking to see if all hosts have failed and the running result is not ok 24134 1727096399.43607: done checking to see if all hosts have failed 24134 1727096399.43608: getting the remaining hosts for this loop 24134 1727096399.43609: done getting the remaining hosts for this loop 24134 1727096399.43613: getting the next task for host managed_node1 24134 1727096399.43619: done getting next task for host managed_node1 24134 1727096399.43621: ^ task is: TASK: Install yum-utils package 24134 1727096399.43625: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.43629: getting variables 24134 1727096399.43631: in VariableManager get_vars() 24134 1727096399.43663: Calling all_inventory to load vars for managed_node1 24134 1727096399.43666: Calling groups_inventory to load vars for managed_node1 24134 1727096399.43673: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.43686: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.43689: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.43692: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.44202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.44704: done with get_vars() 24134 1727096399.44715: done getting variables 24134 1727096399.45017: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 08:59:59 -0400 (0:00:00.050) 0:00:03.663 ****** 24134 1727096399.45047: entering _queue_task() for managed_node1/package 24134 1727096399.45049: Creating lock for package 24134 1727096399.45694: worker is 1 (out of 1 available) 24134 1727096399.45708: exiting _queue_task() for managed_node1/package 24134 1727096399.45720: done queuing things up, now waiting for results queue to drain 24134 1727096399.45721: waiting for pending results... 
24134 1727096399.46185: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 24134 1727096399.46343: in run() - task 0afff68d-5257-1673-d3fc-0000000000d5 24134 1727096399.46374: variable 'ansible_search_path' from source: unknown 24134 1727096399.46377: variable 'ansible_search_path' from source: unknown 24134 1727096399.46508: calling self._execute() 24134 1727096399.46849: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.46852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.46855: variable 'omit' from source: magic vars 24134 1727096399.47776: variable 'ansible_distribution' from source: facts 24134 1727096399.47780: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24134 1727096399.47848: variable 'ansible_distribution_major_version' from source: facts 24134 1727096399.47981: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24134 1727096399.47990: when evaluation is False, skipping this task 24134 1727096399.47998: _execute() done 24134 1727096399.48015: dumping result to json 24134 1727096399.48022: done dumping result, returning 24134 1727096399.48031: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0afff68d-5257-1673-d3fc-0000000000d5] 24134 1727096399.48040: sending task result for task 0afff68d-5257-1673-d3fc-0000000000d5 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24134 1727096399.48249: no more pending results, returning what we have 24134 1727096399.48252: results queue empty 24134 1727096399.48253: checking for any_errors_fatal 24134 1727096399.48256: done checking for any_errors_fatal 24134 1727096399.48257: checking for max_fail_percentage 24134 1727096399.48259: done checking for max_fail_percentage 24134 1727096399.48259: checking to see if 
all hosts have failed and the running result is not ok 24134 1727096399.48260: done checking to see if all hosts have failed 24134 1727096399.48260: getting the remaining hosts for this loop 24134 1727096399.48262: done getting the remaining hosts for this loop 24134 1727096399.48265: getting the next task for host managed_node1 24134 1727096399.48274: done getting next task for host managed_node1 24134 1727096399.48277: ^ task is: TASK: Enable EPEL 7 24134 1727096399.48280: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.48283: getting variables 24134 1727096399.48285: in VariableManager get_vars() 24134 1727096399.48307: Calling all_inventory to load vars for managed_node1 24134 1727096399.48310: Calling groups_inventory to load vars for managed_node1 24134 1727096399.48313: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.48324: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.48326: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.48329: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.48787: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000d5 24134 1727096399.48791: WORKER PROCESS EXITING 24134 1727096399.48803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.49230: done with get_vars() 24134 1727096399.49244: done getting variables 24134 1727096399.49360: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 08:59:59 -0400 (0:00:00.043) 0:00:03.707 ****** 24134 1727096399.49392: entering _queue_task() for managed_node1/command 24134 1727096399.50004: worker is 1 (out of 1 available) 24134 1727096399.50129: exiting _queue_task() for managed_node1/command 24134 1727096399.50141: done queuing things up, now waiting for results queue to drain 24134 1727096399.50143: waiting for pending results... 
24134 1727096399.50324: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 24134 1727096399.50795: in run() - task 0afff68d-5257-1673-d3fc-0000000000d6 24134 1727096399.50799: variable 'ansible_search_path' from source: unknown 24134 1727096399.50802: variable 'ansible_search_path' from source: unknown 24134 1727096399.50805: calling self._execute() 24134 1727096399.50875: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.51261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.51265: variable 'omit' from source: magic vars 24134 1727096399.52265: variable 'ansible_distribution' from source: facts 24134 1727096399.52289: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24134 1727096399.52974: variable 'ansible_distribution_major_version' from source: facts 24134 1727096399.52978: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24134 1727096399.52981: when evaluation is False, skipping this task 24134 1727096399.52983: _execute() done 24134 1727096399.52985: dumping result to json 24134 1727096399.52987: done dumping result, returning 24134 1727096399.52990: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0afff68d-5257-1673-d3fc-0000000000d6] 24134 1727096399.52992: sending task result for task 0afff68d-5257-1673-d3fc-0000000000d6 24134 1727096399.53059: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000d6 24134 1727096399.53062: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24134 1727096399.53336: no more pending results, returning what we have 24134 1727096399.53340: results queue empty 24134 1727096399.53341: checking for any_errors_fatal 24134 1727096399.53347: done checking for any_errors_fatal 24134 1727096399.53348: checking for 
max_fail_percentage 24134 1727096399.53350: done checking for max_fail_percentage 24134 1727096399.53351: checking to see if all hosts have failed and the running result is not ok 24134 1727096399.53352: done checking to see if all hosts have failed 24134 1727096399.53352: getting the remaining hosts for this loop 24134 1727096399.53354: done getting the remaining hosts for this loop 24134 1727096399.53359: getting the next task for host managed_node1 24134 1727096399.53366: done getting next task for host managed_node1 24134 1727096399.53373: ^ task is: TASK: Enable EPEL 8 24134 1727096399.53378: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.53383: getting variables 24134 1727096399.53385: in VariableManager get_vars() 24134 1727096399.53419: Calling all_inventory to load vars for managed_node1 24134 1727096399.53422: Calling groups_inventory to load vars for managed_node1 24134 1727096399.53426: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.53438: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.53441: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.53446: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.53920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.54340: done with get_vars() 24134 1727096399.54349: done getting variables 24134 1727096399.54612: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Monday 23 September 2024 08:59:59 -0400 (0:00:00.052) 0:00:03.759 ****** 24134 1727096399.54641: entering _queue_task() for managed_node1/command 24134 1727096399.55174: worker is 1 (out of 1 available) 24134 1727096399.55185: exiting _queue_task() for managed_node1/command 24134 1727096399.55196: done queuing things up, now waiting for results queue to drain 24134 1727096399.55197: waiting for pending results... 
24134 1727096399.55886: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 24134 1727096399.55978: in run() - task 0afff68d-5257-1673-d3fc-0000000000d7 24134 1727096399.56050: variable 'ansible_search_path' from source: unknown 24134 1727096399.56059: variable 'ansible_search_path' from source: unknown 24134 1727096399.56063: calling self._execute() 24134 1727096399.56131: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.56142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.56168: variable 'omit' from source: magic vars 24134 1727096399.56557: variable 'ansible_distribution' from source: facts 24134 1727096399.56578: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24134 1727096399.56729: variable 'ansible_distribution_major_version' from source: facts 24134 1727096399.56773: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24134 1727096399.56776: when evaluation is False, skipping this task 24134 1727096399.56779: _execute() done 24134 1727096399.56781: dumping result to json 24134 1727096399.56783: done dumping result, returning 24134 1727096399.56790: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0afff68d-5257-1673-d3fc-0000000000d7] 24134 1727096399.56797: sending task result for task 0afff68d-5257-1673-d3fc-0000000000d7 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24134 1727096399.57008: no more pending results, returning what we have 24134 1727096399.57011: results queue empty 24134 1727096399.57012: checking for any_errors_fatal 24134 1727096399.57018: done checking for any_errors_fatal 24134 1727096399.57019: checking for max_fail_percentage 24134 1727096399.57021: done checking for max_fail_percentage 24134 1727096399.57021: checking to see if all hosts have failed and 
the running result is not ok 24134 1727096399.57022: done checking to see if all hosts have failed 24134 1727096399.57023: getting the remaining hosts for this loop 24134 1727096399.57024: done getting the remaining hosts for this loop 24134 1727096399.57027: getting the next task for host managed_node1 24134 1727096399.57036: done getting next task for host managed_node1 24134 1727096399.57038: ^ task is: TASK: Enable EPEL 6 24134 1727096399.57043: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.57048: getting variables 24134 1727096399.57049: in VariableManager get_vars() 24134 1727096399.57081: Calling all_inventory to load vars for managed_node1 24134 1727096399.57084: Calling groups_inventory to load vars for managed_node1 24134 1727096399.57087: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.57096: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.57098: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.57101: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.57244: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000d7 24134 1727096399.57247: WORKER PROCESS EXITING 24134 1727096399.57273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.57465: done with get_vars() 24134 1727096399.57478: done getting variables 24134 1727096399.57537: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Monday 23 September 2024 08:59:59 -0400 (0:00:00.029) 0:00:03.789 ****** 24134 1727096399.57573: entering _queue_task() for managed_node1/copy 24134 1727096399.58003: worker is 1 (out of 1 available) 24134 1727096399.58012: exiting _queue_task() for managed_node1/copy 24134 1727096399.58022: done queuing things up, now waiting for results queue to drain 24134 1727096399.58023: waiting for pending results... 
24134 1727096399.58089: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 24134 1727096399.58193: in run() - task 0afff68d-5257-1673-d3fc-0000000000d9 24134 1727096399.58212: variable 'ansible_search_path' from source: unknown 24134 1727096399.58226: variable 'ansible_search_path' from source: unknown 24134 1727096399.58275: calling self._execute() 24134 1727096399.58349: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.58374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.58388: variable 'omit' from source: magic vars 24134 1727096399.58798: variable 'ansible_distribution' from source: facts 24134 1727096399.59126: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24134 1727096399.59197: variable 'ansible_distribution_major_version' from source: facts 24134 1727096399.59273: Evaluated conditional (ansible_distribution_major_version == '6'): False 24134 1727096399.59285: when evaluation is False, skipping this task 24134 1727096399.59298: _execute() done 24134 1727096399.59308: dumping result to json 24134 1727096399.59347: done dumping result, returning 24134 1727096399.59363: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0afff68d-5257-1673-d3fc-0000000000d9] 24134 1727096399.59383: sending task result for task 0afff68d-5257-1673-d3fc-0000000000d9 24134 1727096399.59616: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000d9 24134 1727096399.59619: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 24134 1727096399.59704: no more pending results, returning what we have 24134 1727096399.59707: results queue empty 24134 1727096399.59708: checking for any_errors_fatal 24134 1727096399.59712: done checking for any_errors_fatal 24134 1727096399.59713: checking for max_fail_percentage 
24134 1727096399.59715: done checking for max_fail_percentage 24134 1727096399.59715: checking to see if all hosts have failed and the running result is not ok 24134 1727096399.59716: done checking to see if all hosts have failed 24134 1727096399.59717: getting the remaining hosts for this loop 24134 1727096399.59719: done getting the remaining hosts for this loop 24134 1727096399.59722: getting the next task for host managed_node1 24134 1727096399.59730: done getting next task for host managed_node1 24134 1727096399.59733: ^ task is: TASK: Set network provider to 'nm' 24134 1727096399.59736: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096399.59741: getting variables 24134 1727096399.59742: in VariableManager get_vars() 24134 1727096399.59775: Calling all_inventory to load vars for managed_node1 24134 1727096399.59779: Calling groups_inventory to load vars for managed_node1 24134 1727096399.59783: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.59794: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.59797: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.59800: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.60890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.61276: done with get_vars() 24134 1727096399.61285: done getting variables 24134 1727096399.61340: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:13 Monday 23 September 2024 08:59:59 -0400 (0:00:00.037) 0:00:03.827 ****** 24134 1727096399.61364: entering _queue_task() for managed_node1/set_fact 24134 1727096399.61773: worker is 1 (out of 1 available) 24134 1727096399.61786: exiting _queue_task() for managed_node1/set_fact 24134 1727096399.61797: done queuing things up, now waiting for results queue to drain 24134 1727096399.61798: waiting for pending results... 24134 1727096399.62188: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 24134 1727096399.62192: in run() - task 0afff68d-5257-1673-d3fc-000000000007 24134 1727096399.62195: variable 'ansible_search_path' from source: unknown 24134 1727096399.62198: calling self._execute() 24134 1727096399.62250: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.62262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.62282: variable 'omit' from source: magic vars 24134 1727096399.62401: variable 'omit' from source: magic vars 24134 1727096399.62438: variable 'omit' from source: magic vars 24134 1727096399.62485: variable 'omit' from source: magic vars 24134 1727096399.62535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096399.62582: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096399.62610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096399.62636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096399.62655: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096399.62692: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096399.62700: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.62708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.62820: Set connection var ansible_shell_executable to /bin/sh 24134 1727096399.62835: Set connection var ansible_pipelining to False 24134 1727096399.62943: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096399.62946: Set connection var ansible_timeout to 10 24134 1727096399.62949: Set connection var ansible_connection to ssh 24134 1727096399.62951: Set connection var ansible_shell_type to sh 24134 1727096399.62953: variable 'ansible_shell_executable' from source: unknown 24134 1727096399.62954: variable 'ansible_connection' from source: unknown 24134 1727096399.62956: variable 'ansible_module_compression' from source: unknown 24134 1727096399.62958: variable 'ansible_shell_type' from source: unknown 24134 1727096399.62960: variable 'ansible_shell_executable' from source: unknown 24134 1727096399.62963: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.62964: variable 'ansible_pipelining' from source: unknown 24134 1727096399.62966: variable 'ansible_timeout' from source: unknown 24134 1727096399.62972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.63114: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096399.63170: variable 'omit' from source: magic vars 24134 1727096399.63219: starting 
attempt loop 24134 1727096399.63244: running the handler 24134 1727096399.63265: handler run complete 24134 1727096399.63573: attempt loop complete, returning result 24134 1727096399.63576: _execute() done 24134 1727096399.63578: dumping result to json 24134 1727096399.63580: done dumping result, returning 24134 1727096399.63581: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0afff68d-5257-1673-d3fc-000000000007] 24134 1727096399.63583: sending task result for task 0afff68d-5257-1673-d3fc-000000000007 24134 1727096399.63643: done sending task result for task 0afff68d-5257-1673-d3fc-000000000007 24134 1727096399.63646: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 24134 1727096399.63707: no more pending results, returning what we have 24134 1727096399.63710: results queue empty 24134 1727096399.63711: checking for any_errors_fatal 24134 1727096399.63717: done checking for any_errors_fatal 24134 1727096399.63718: checking for max_fail_percentage 24134 1727096399.63719: done checking for max_fail_percentage 24134 1727096399.63720: checking to see if all hosts have failed and the running result is not ok 24134 1727096399.63721: done checking to see if all hosts have failed 24134 1727096399.63722: getting the remaining hosts for this loop 24134 1727096399.63723: done getting the remaining hosts for this loop 24134 1727096399.63727: getting the next task for host managed_node1 24134 1727096399.63732: done getting next task for host managed_node1 24134 1727096399.63734: ^ task is: TASK: meta (flush_handlers) 24134 1727096399.63736: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.63740: getting variables 24134 1727096399.63742: in VariableManager get_vars() 24134 1727096399.63775: Calling all_inventory to load vars for managed_node1 24134 1727096399.63778: Calling groups_inventory to load vars for managed_node1 24134 1727096399.63782: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.63792: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.63795: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.63798: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.64472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.64856: done with get_vars() 24134 1727096399.64866: done getting variables 24134 1727096399.64930: in VariableManager get_vars() 24134 1727096399.64939: Calling all_inventory to load vars for managed_node1 24134 1727096399.64942: Calling groups_inventory to load vars for managed_node1 24134 1727096399.64944: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.64948: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.64951: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.64953: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.65224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.65446: done with get_vars() 24134 1727096399.65460: done queuing things up, now waiting for results queue to drain 24134 1727096399.65462: results queue empty 24134 1727096399.65463: checking for any_errors_fatal 24134 1727096399.65465: done checking for any_errors_fatal 24134 1727096399.65465: checking for max_fail_percentage 24134 1727096399.65466: done checking for max_fail_percentage 24134 1727096399.65469: checking to see if all hosts have failed and the running result is not 
ok 24134 1727096399.65471: done checking to see if all hosts have failed 24134 1727096399.65471: getting the remaining hosts for this loop 24134 1727096399.65472: done getting the remaining hosts for this loop 24134 1727096399.65475: getting the next task for host managed_node1 24134 1727096399.65479: done getting next task for host managed_node1 24134 1727096399.65480: ^ task is: TASK: meta (flush_handlers) 24134 1727096399.65482: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096399.65489: getting variables 24134 1727096399.65491: in VariableManager get_vars() 24134 1727096399.65498: Calling all_inventory to load vars for managed_node1 24134 1727096399.65501: Calling groups_inventory to load vars for managed_node1 24134 1727096399.65503: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.65507: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.65509: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.65511: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.65637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.65814: done with get_vars() 24134 1727096399.65821: done getting variables 24134 1727096399.65864: in VariableManager get_vars() 24134 1727096399.65874: Calling all_inventory to load vars for managed_node1 24134 1727096399.65876: Calling groups_inventory to load vars for managed_node1 24134 1727096399.65878: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.65883: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.65885: Calling groups_plugins_inventory to load vars for 
managed_node1 24134 1727096399.65888: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.66007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.66197: done with get_vars() 24134 1727096399.66207: done queuing things up, now waiting for results queue to drain 24134 1727096399.66209: results queue empty 24134 1727096399.66210: checking for any_errors_fatal 24134 1727096399.66211: done checking for any_errors_fatal 24134 1727096399.66212: checking for max_fail_percentage 24134 1727096399.66212: done checking for max_fail_percentage 24134 1727096399.66213: checking to see if all hosts have failed and the running result is not ok 24134 1727096399.66214: done checking to see if all hosts have failed 24134 1727096399.66214: getting the remaining hosts for this loop 24134 1727096399.66215: done getting the remaining hosts for this loop 24134 1727096399.66217: getting the next task for host managed_node1 24134 1727096399.66220: done getting next task for host managed_node1 24134 1727096399.66221: ^ task is: None 24134 1727096399.66222: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.66224: done queuing things up, now waiting for results queue to drain 24134 1727096399.66224: results queue empty 24134 1727096399.66225: checking for any_errors_fatal 24134 1727096399.66226: done checking for any_errors_fatal 24134 1727096399.66226: checking for max_fail_percentage 24134 1727096399.66227: done checking for max_fail_percentage 24134 1727096399.66228: checking to see if all hosts have failed and the running result is not ok 24134 1727096399.66228: done checking to see if all hosts have failed 24134 1727096399.66230: getting the next task for host managed_node1 24134 1727096399.66232: done getting next task for host managed_node1 24134 1727096399.66233: ^ task is: None 24134 1727096399.66234: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.66280: in VariableManager get_vars() 24134 1727096399.66299: done with get_vars() 24134 1727096399.66304: in VariableManager get_vars() 24134 1727096399.66317: done with get_vars() 24134 1727096399.66321: variable 'omit' from source: magic vars 24134 1727096399.66351: in VariableManager get_vars() 24134 1727096399.66363: done with get_vars() 24134 1727096399.66388: variable 'omit' from source: magic vars PLAY [Play for testing ipv6 disabled] ****************************************** 24134 1727096399.66701: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24134 1727096399.66733: getting the remaining hosts for this loop 24134 1727096399.66734: done getting the remaining hosts for this loop 24134 1727096399.66737: getting the next task for host managed_node1 24134 1727096399.66739: done getting next task for host managed_node1 24134 1727096399.66741: ^ task is: TASK: Gathering Facts 24134 1727096399.66742: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096399.66744: getting variables 24134 1727096399.66745: in VariableManager get_vars() 24134 1727096399.66755: Calling all_inventory to load vars for managed_node1 24134 1727096399.66757: Calling groups_inventory to load vars for managed_node1 24134 1727096399.66759: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096399.66763: Calling all_plugins_play to load vars for managed_node1 24134 1727096399.66787: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096399.66791: Calling groups_plugins_play to load vars for managed_node1 24134 1727096399.66944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096399.67182: done with get_vars() 24134 1727096399.67190: done getting variables 24134 1727096399.67227: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3 Monday 23 September 2024 08:59:59 -0400 (0:00:00.058) 0:00:03.885 ****** 24134 1727096399.67249: entering _queue_task() for managed_node1/gather_facts 24134 1727096399.67524: worker is 1 (out of 1 available) 24134 1727096399.67537: exiting _queue_task() for managed_node1/gather_facts 24134 1727096399.67550: done queuing things up, now waiting for results queue to drain 24134 1727096399.67552: waiting for pending results... 
24134 1727096399.67773: running TaskExecutor() for managed_node1/TASK: Gathering Facts 24134 1727096399.67881: in run() - task 0afff68d-5257-1673-d3fc-0000000000ff 24134 1727096399.67901: variable 'ansible_search_path' from source: unknown 24134 1727096399.67945: calling self._execute() 24134 1727096399.68042: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.68073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.68088: variable 'omit' from source: magic vars 24134 1727096399.68574: variable 'ansible_distribution_major_version' from source: facts 24134 1727096399.68578: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096399.68580: variable 'omit' from source: magic vars 24134 1727096399.68582: variable 'omit' from source: magic vars 24134 1727096399.68593: variable 'omit' from source: magic vars 24134 1727096399.68648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096399.68693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096399.68731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096399.68754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096399.68775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096399.68809: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096399.68840: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.68907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.69184: Set connection var ansible_shell_executable to /bin/sh 24134 1727096399.69196: Set 
connection var ansible_pipelining to False 24134 1727096399.69207: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096399.69221: Set connection var ansible_timeout to 10 24134 1727096399.69228: Set connection var ansible_connection to ssh 24134 1727096399.69265: Set connection var ansible_shell_type to sh 24134 1727096399.69281: variable 'ansible_shell_executable' from source: unknown 24134 1727096399.69290: variable 'ansible_connection' from source: unknown 24134 1727096399.69377: variable 'ansible_module_compression' from source: unknown 24134 1727096399.69381: variable 'ansible_shell_type' from source: unknown 24134 1727096399.69383: variable 'ansible_shell_executable' from source: unknown 24134 1727096399.69386: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096399.69388: variable 'ansible_pipelining' from source: unknown 24134 1727096399.69391: variable 'ansible_timeout' from source: unknown 24134 1727096399.69393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096399.69533: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096399.69550: variable 'omit' from source: magic vars 24134 1727096399.69561: starting attempt loop 24134 1727096399.69573: running the handler 24134 1727096399.69608: variable 'ansible_facts' from source: unknown 24134 1727096399.69633: _low_level_execute_command(): starting 24134 1727096399.69646: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096399.70489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096399.70513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096399.70535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096399.70554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096399.70689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096399.73133: stdout chunk (state=3): >>>/root <<< 24134 1727096399.73333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096399.73336: stdout chunk (state=3): >>><<< 24134 1727096399.73339: stderr chunk (state=3): >>><<< 24134 1727096399.73374: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096399.73465: _low_level_execute_command(): starting 24134 1727096399.73474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626 `" && echo ansible-tmp-1727096399.7336655-24368-30284903207626="` echo /root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626 `" ) && sleep 0' 24134 1727096399.73937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096399.73953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096399.73984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096399.74027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096399.74033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096399.74036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096399.74105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096399.76857: stdout chunk (state=3): >>>ansible-tmp-1727096399.7336655-24368-30284903207626=/root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626 <<< 24134 1727096399.77022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096399.77046: stderr chunk (state=3): >>><<< 24134 1727096399.77049: stdout chunk (state=3): >>><<< 24134 1727096399.77070: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096399.7336655-24368-30284903207626=/root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096399.77103: variable 'ansible_module_compression' from source: unknown 24134 1727096399.77138: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24134 1727096399.77191: variable 'ansible_facts' from source: unknown 24134 1727096399.77326: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/AnsiballZ_setup.py 24134 1727096399.77428: Sending initial data 24134 1727096399.77431: Sent initial data (153 bytes) 24134 1727096399.77870: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096399.77875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096399.77877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24134 1727096399.77879: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 
1727096399.77881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096399.77932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096399.77939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096399.78012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096399.80380: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096399.80443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096399.80515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpdx9c1su8 /root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/AnsiballZ_setup.py <<< 24134 1727096399.80522: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/AnsiballZ_setup.py" <<< 24134 1727096399.80588: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpdx9c1su8" to remote "/root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/AnsiballZ_setup.py" <<< 24134 1727096399.80591: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/AnsiballZ_setup.py" <<< 24134 1727096399.81791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096399.81836: stderr chunk (state=3): >>><<< 24134 1727096399.81840: stdout chunk (state=3): >>><<< 24134 1727096399.81858: done transferring module to remote 24134 1727096399.81867: _low_level_execute_command(): starting 24134 1727096399.81873: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/ /root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/AnsiballZ_setup.py && sleep 0' 24134 1727096399.82316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096399.82319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096399.82322: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096399.82324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096399.82326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096399.82375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096399.82390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096399.82396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096399.82461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096399.85099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096399.85124: stderr chunk (state=3): >>><<< 24134 1727096399.85129: stdout chunk (state=3): >>><<< 24134 1727096399.85149: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096399.85152: _low_level_execute_command(): starting 24134 1727096399.85156: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/AnsiballZ_setup.py && sleep 0' 24134 1727096399.85617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096399.85620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096399.85623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096399.85625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096399.85627: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096399.85675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096399.85680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096399.85691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096399.85764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096400.68821: stdout chunk (state=3): >>> <<< 24134 1727096400.68826: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "00", "epoch": "1727096400", "epoch_int": "1727096400", "date": "2024-09-23", "time": "09:00:00", "iso8601_micro": "2024-09-23T13:00:00.282964Z", "iso8601": "2024-09-23T13:00:00Z", "iso8601_basic": "20240923T090000282964", "iso8601_basic_short": "20240923T090000", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/cento<<< 24134 1727096400.68829: stdout chunk (state=3): >>>s-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.64990234375, "5m": 0.4482421875, "15m": 0.2275390625}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELI<<< 24134 1727096400.68949: stdout chunk (state=3): >>>NUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_is_chroot": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": 
{"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vl<<< 24134 1727096400.68996: stdout chunk (state=3): >>>an_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off 
[fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 
2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2947, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 584, "free": 2947}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 553, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794922496, "block_size": 4096, "block_total": 65519099, "block_available": 63914776, "block_used": 1604323, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24134 1727096400.71839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096400.71955: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096400.71959: stdout chunk (state=3): >>><<< 24134 1727096400.71961: stderr chunk (state=3): >>><<< 24134 1727096400.71966: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "00", "epoch": "1727096400", "epoch_int": "1727096400", "date": "2024-09-23", "time": "09:00:00", "iso8601_micro": "2024-09-23T13:00:00.282964Z", "iso8601": "2024-09-23T13:00:00Z", "iso8601_basic": "20240923T090000282964", "iso8601_basic_short": "20240923T090000", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.64990234375, "5m": 0.4482421875, "15m": 0.2275390625}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": 
"tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_is_chroot": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": 
"off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2947, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 584, "free": 2947}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": 
"NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 553, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794922496, "block_size": 4096, "block_total": 65519099, "block_available": 63914776, "block_used": 1604323, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", 
"10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096400.72162: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096400.72183: _low_level_execute_command(): starting 24134 1727096400.72187: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096399.7336655-24368-30284903207626/ > /dev/null 2>&1 && sleep 0' 24134 1727096400.72620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096400.72624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096400.72626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24134 1727096400.72628: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096400.72633: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096400.72685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096400.72688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096400.72772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096400.75557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096400.75581: stderr chunk (state=3): >>><<< 24134 1727096400.75585: stdout chunk (state=3): >>><<< 24134 1727096400.75601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 4 debug2: Received exit status from master 0 24134 1727096400.75612: handler run complete 24134 1727096400.75687: variable 'ansible_facts' from source: unknown 24134 1727096400.75750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.75926: variable 'ansible_facts' from source: unknown 24134 1727096400.75984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.76072: attempt loop complete, returning result 24134 1727096400.76076: _execute() done 24134 1727096400.76078: dumping result to json 24134 1727096400.76096: done dumping result, returning 24134 1727096400.76103: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-1673-d3fc-0000000000ff] 24134 1727096400.76107: sending task result for task 0afff68d-5257-1673-d3fc-0000000000ff ok: [managed_node1] 24134 1727096400.76598: no more pending results, returning what we have 24134 1727096400.76600: results queue empty 24134 1727096400.76601: checking for any_errors_fatal 24134 1727096400.76601: done checking for any_errors_fatal 24134 1727096400.76602: checking for max_fail_percentage 24134 1727096400.76603: done checking for max_fail_percentage 24134 1727096400.76603: checking to see if all hosts have failed and the running result is not ok 24134 1727096400.76604: done checking to see if all hosts have failed 24134 1727096400.76604: getting the remaining hosts for this loop 24134 1727096400.76605: done getting the remaining hosts for this loop 24134 1727096400.76608: getting the next task for host managed_node1 24134 1727096400.76611: done getting next task for host managed_node1 24134 1727096400.76612: ^ task is: TASK: meta (flush_handlers) 24134 1727096400.76613: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096400.76615: getting variables 24134 1727096400.76616: in VariableManager get_vars() 24134 1727096400.76636: Calling all_inventory to load vars for managed_node1 24134 1727096400.76637: Calling groups_inventory to load vars for managed_node1 24134 1727096400.76639: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096400.76648: Calling all_plugins_play to load vars for managed_node1 24134 1727096400.76649: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096400.76652: Calling groups_plugins_play to load vars for managed_node1 24134 1727096400.76758: done sending task result for task 0afff68d-5257-1673-d3fc-0000000000ff 24134 1727096400.76761: WORKER PROCESS EXITING 24134 1727096400.76773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.76884: done with get_vars() 24134 1727096400.76891: done getting variables 24134 1727096400.76942: in VariableManager get_vars() 24134 1727096400.76950: Calling all_inventory to load vars for managed_node1 24134 1727096400.76951: Calling groups_inventory to load vars for managed_node1 24134 1727096400.76952: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096400.76955: Calling all_plugins_play to load vars for managed_node1 24134 1727096400.76957: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096400.76958: Calling groups_plugins_play to load vars for managed_node1 24134 1727096400.77040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.77148: done with get_vars() 24134 1727096400.77157: done queuing things up, now waiting for results queue to drain 24134 1727096400.77158: results queue empty 24134 1727096400.77158: checking for 
any_errors_fatal 24134 1727096400.77160: done checking for any_errors_fatal 24134 1727096400.77161: checking for max_fail_percentage 24134 1727096400.77165: done checking for max_fail_percentage 24134 1727096400.77165: checking to see if all hosts have failed and the running result is not ok 24134 1727096400.77166: done checking to see if all hosts have failed 24134 1727096400.77166: getting the remaining hosts for this loop 24134 1727096400.77169: done getting the remaining hosts for this loop 24134 1727096400.77171: getting the next task for host managed_node1 24134 1727096400.77174: done getting next task for host managed_node1 24134 1727096400.77175: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 24134 1727096400.77176: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096400.77177: getting variables 24134 1727096400.77178: in VariableManager get_vars() 24134 1727096400.77185: Calling all_inventory to load vars for managed_node1 24134 1727096400.77186: Calling groups_inventory to load vars for managed_node1 24134 1727096400.77187: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096400.77190: Calling all_plugins_play to load vars for managed_node1 24134 1727096400.77191: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096400.77193: Calling groups_plugins_play to load vars for managed_node1 24134 1727096400.77273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.77393: done with get_vars() 24134 1727096400.77399: done getting variables 24134 1727096400.77425: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24134 1727096400.77524: variable 'type' from source: play vars 24134 1727096400.77528: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:10 Monday 23 September 2024 09:00:00 -0400 (0:00:01.103) 0:00:04.989 ****** 24134 1727096400.77554: entering _queue_task() for managed_node1/set_fact 24134 1727096400.77765: worker is 1 (out of 1 available) 24134 1727096400.77780: exiting _queue_task() for managed_node1/set_fact 24134 1727096400.77790: done queuing things up, now waiting for results queue to drain 24134 1727096400.77792: waiting for pending results... 
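The banner above comes from line 10 of `tests_ipv6_disabled.yml`. The playbook source is not included in this log, but from the templated task name and the facts the task reports back a few entries later (`"interface": "ethtest0"`, `"type": "veth"`), it is presumably a `set_fact` task promoting the play vars to host facts — a hypothetical reconstruction:

```yaml
# Hypothetical sketch of the task at tests_ipv6_disabled.yml:10 (not shown
# in this log). The play vars `type` and `interface` are templated into the
# task name, which the log renders as "Set type=veth and interface=ethtest0".
- name: Set type={{ type }} and interface={{ interface }}
  set_fact:
    type: "{{ type }}"
    interface: "{{ interface }}"
```

Because `set_fact` runs entirely on the controller, the execution trace below shows no `_low_level_execute_command()` calls for this task — the handler completes immediately after the action plugin loads.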
24134 1727096400.77939: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=ethtest0 24134 1727096400.78001: in run() - task 0afff68d-5257-1673-d3fc-00000000000b 24134 1727096400.78014: variable 'ansible_search_path' from source: unknown 24134 1727096400.78043: calling self._execute() 24134 1727096400.78105: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096400.78109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096400.78117: variable 'omit' from source: magic vars 24134 1727096400.78385: variable 'ansible_distribution_major_version' from source: facts 24134 1727096400.78395: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096400.78401: variable 'omit' from source: magic vars 24134 1727096400.78415: variable 'omit' from source: magic vars 24134 1727096400.78435: variable 'type' from source: play vars 24134 1727096400.78491: variable 'type' from source: play vars 24134 1727096400.78499: variable 'interface' from source: play vars 24134 1727096400.78543: variable 'interface' from source: play vars 24134 1727096400.78554: variable 'omit' from source: magic vars 24134 1727096400.78591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096400.78618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096400.78633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096400.78648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096400.78657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096400.78691: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 
1727096400.78694: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096400.78697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096400.78760: Set connection var ansible_shell_executable to /bin/sh 24134 1727096400.78764: Set connection var ansible_pipelining to False 24134 1727096400.78770: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096400.78780: Set connection var ansible_timeout to 10 24134 1727096400.78794: Set connection var ansible_connection to ssh 24134 1727096400.78798: Set connection var ansible_shell_type to sh 24134 1727096400.78805: variable 'ansible_shell_executable' from source: unknown 24134 1727096400.78808: variable 'ansible_connection' from source: unknown 24134 1727096400.78810: variable 'ansible_module_compression' from source: unknown 24134 1727096400.78812: variable 'ansible_shell_type' from source: unknown 24134 1727096400.78815: variable 'ansible_shell_executable' from source: unknown 24134 1727096400.78817: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096400.78821: variable 'ansible_pipelining' from source: unknown 24134 1727096400.78824: variable 'ansible_timeout' from source: unknown 24134 1727096400.78827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096400.78930: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096400.78938: variable 'omit' from source: magic vars 24134 1727096400.78943: starting attempt loop 24134 1727096400.78946: running the handler 24134 1727096400.78957: handler run complete 24134 1727096400.78964: attempt loop complete, returning result 24134 1727096400.78966: _execute() done 24134 
1727096400.78971: dumping result to json 24134 1727096400.78976: done dumping result, returning 24134 1727096400.78982: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=ethtest0 [0afff68d-5257-1673-d3fc-00000000000b] 24134 1727096400.78986: sending task result for task 0afff68d-5257-1673-d3fc-00000000000b 24134 1727096400.79068: done sending task result for task 0afff68d-5257-1673-d3fc-00000000000b 24134 1727096400.79072: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 24134 1727096400.79120: no more pending results, returning what we have 24134 1727096400.79123: results queue empty 24134 1727096400.79124: checking for any_errors_fatal 24134 1727096400.79126: done checking for any_errors_fatal 24134 1727096400.79127: checking for max_fail_percentage 24134 1727096400.79128: done checking for max_fail_percentage 24134 1727096400.79129: checking to see if all hosts have failed and the running result is not ok 24134 1727096400.79130: done checking to see if all hosts have failed 24134 1727096400.79131: getting the remaining hosts for this loop 24134 1727096400.79132: done getting the remaining hosts for this loop 24134 1727096400.79135: getting the next task for host managed_node1 24134 1727096400.79140: done getting next task for host managed_node1 24134 1727096400.79142: ^ task is: TASK: Include the task 'show_interfaces.yml' 24134 1727096400.79144: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096400.79147: getting variables 24134 1727096400.79148: in VariableManager get_vars() 24134 1727096400.79191: Calling all_inventory to load vars for managed_node1 24134 1727096400.79194: Calling groups_inventory to load vars for managed_node1 24134 1727096400.79196: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096400.79204: Calling all_plugins_play to load vars for managed_node1 24134 1727096400.79206: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096400.79209: Calling groups_plugins_play to load vars for managed_node1 24134 1727096400.79333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.79451: done with get_vars() 24134 1727096400.79458: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:14 Monday 23 September 2024 09:00:00 -0400 (0:00:00.019) 0:00:05.008 ****** 24134 1727096400.79523: entering _queue_task() for managed_node1/include_tasks 24134 1727096400.79718: worker is 1 (out of 1 available) 24134 1727096400.79730: exiting _queue_task() for managed_node1/include_tasks 24134 1727096400.79740: done queuing things up, now waiting for results queue to drain 24134 1727096400.79741: waiting for pending results... 
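The include task above (`tests_ipv6_disabled.yml:14`) is what triggers the "we have included files to process" / "generating all_blocks data" sequence that follows, loading `tasks/show_interfaces.yml` from the collection. A plausible sketch of the calling task, assuming the conventional `include_tasks` form (the exact YAML is not shown in this log):

```yaml
# Hypothetical sketch of the task at tests_ipv6_disabled.yml:14.
# include_tasks is evaluated at runtime, which is why the log shows the
# conditional (ansible_distribution_major_version != '6') being checked
# before the included blocks are generated and task lists are extended.
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml
```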
24134 1727096400.79892: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 24134 1727096400.79945: in run() - task 0afff68d-5257-1673-d3fc-00000000000c 24134 1727096400.79955: variable 'ansible_search_path' from source: unknown 24134 1727096400.79989: calling self._execute() 24134 1727096400.80044: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096400.80047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096400.80055: variable 'omit' from source: magic vars 24134 1727096400.80316: variable 'ansible_distribution_major_version' from source: facts 24134 1727096400.80326: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096400.80331: _execute() done 24134 1727096400.80334: dumping result to json 24134 1727096400.80336: done dumping result, returning 24134 1727096400.80343: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-1673-d3fc-00000000000c] 24134 1727096400.80348: sending task result for task 0afff68d-5257-1673-d3fc-00000000000c 24134 1727096400.80429: done sending task result for task 0afff68d-5257-1673-d3fc-00000000000c 24134 1727096400.80432: WORKER PROCESS EXITING 24134 1727096400.80458: no more pending results, returning what we have 24134 1727096400.80462: in VariableManager get_vars() 24134 1727096400.80503: Calling all_inventory to load vars for managed_node1 24134 1727096400.80506: Calling groups_inventory to load vars for managed_node1 24134 1727096400.80508: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096400.80519: Calling all_plugins_play to load vars for managed_node1 24134 1727096400.80522: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096400.80524: Calling groups_plugins_play to load vars for managed_node1 24134 1727096400.80689: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.80801: done with get_vars() 24134 1727096400.80806: variable 'ansible_search_path' from source: unknown 24134 1727096400.80817: we have included files to process 24134 1727096400.80818: generating all_blocks data 24134 1727096400.80819: done generating all_blocks data 24134 1727096400.80819: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24134 1727096400.80820: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24134 1727096400.80821: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24134 1727096400.80925: in VariableManager get_vars() 24134 1727096400.80937: done with get_vars() 24134 1727096400.81009: done processing included file 24134 1727096400.81011: iterating over new_blocks loaded from include file 24134 1727096400.81012: in VariableManager get_vars() 24134 1727096400.81020: done with get_vars() 24134 1727096400.81021: filtering new block on tags 24134 1727096400.81035: done filtering new block on tags 24134 1727096400.81036: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 24134 1727096400.81040: extending task lists for all hosts with included blocks 24134 1727096400.81626: done extending task lists 24134 1727096400.81628: done processing included files 24134 1727096400.81628: results queue empty 24134 1727096400.81629: checking for any_errors_fatal 24134 1727096400.81631: done checking for any_errors_fatal 24134 1727096400.81631: checking for max_fail_percentage 24134 1727096400.81632: done checking for 
max_fail_percentage 24134 1727096400.81632: checking to see if all hosts have failed and the running result is not ok 24134 1727096400.81633: done checking to see if all hosts have failed 24134 1727096400.81633: getting the remaining hosts for this loop 24134 1727096400.81634: done getting the remaining hosts for this loop 24134 1727096400.81636: getting the next task for host managed_node1 24134 1727096400.81638: done getting next task for host managed_node1 24134 1727096400.81639: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 24134 1727096400.81641: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096400.81642: getting variables 24134 1727096400.81643: in VariableManager get_vars() 24134 1727096400.81650: Calling all_inventory to load vars for managed_node1 24134 1727096400.81651: Calling groups_inventory to load vars for managed_node1 24134 1727096400.81653: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096400.81656: Calling all_plugins_play to load vars for managed_node1 24134 1727096400.81658: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096400.81659: Calling groups_plugins_play to load vars for managed_node1 24134 1727096400.81762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.81874: done with get_vars() 24134 1727096400.81880: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:00:00 -0400 (0:00:00.024) 0:00:05.032 ****** 24134 1727096400.81929: entering _queue_task() for managed_node1/include_tasks 24134 1727096400.82129: worker is 1 (out of 1 available) 24134 1727096400.82141: exiting _queue_task() for managed_node1/include_tasks 24134 1727096400.82152: done queuing things up, now waiting for results queue to drain 24134 1727096400.82153: waiting for pending results... 
24134 1727096400.82299: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 24134 1727096400.82359: in run() - task 0afff68d-5257-1673-d3fc-000000000115 24134 1727096400.82373: variable 'ansible_search_path' from source: unknown 24134 1727096400.82378: variable 'ansible_search_path' from source: unknown 24134 1727096400.82406: calling self._execute() 24134 1727096400.82459: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096400.82462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096400.82473: variable 'omit' from source: magic vars 24134 1727096400.82735: variable 'ansible_distribution_major_version' from source: facts 24134 1727096400.82744: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096400.82750: _execute() done 24134 1727096400.82753: dumping result to json 24134 1727096400.82755: done dumping result, returning 24134 1727096400.82762: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-1673-d3fc-000000000115] 24134 1727096400.82766: sending task result for task 0afff68d-5257-1673-d3fc-000000000115 24134 1727096400.82847: done sending task result for task 0afff68d-5257-1673-d3fc-000000000115 24134 1727096400.82850: WORKER PROCESS EXITING 24134 1727096400.82881: no more pending results, returning what we have 24134 1727096400.82886: in VariableManager get_vars() 24134 1727096400.82925: Calling all_inventory to load vars for managed_node1 24134 1727096400.82928: Calling groups_inventory to load vars for managed_node1 24134 1727096400.82930: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096400.82940: Calling all_plugins_play to load vars for managed_node1 24134 1727096400.82943: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096400.82946: Calling groups_plugins_play to load vars for managed_node1 24134 
1727096400.83089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.83219: done with get_vars() 24134 1727096400.83224: variable 'ansible_search_path' from source: unknown 24134 1727096400.83225: variable 'ansible_search_path' from source: unknown 24134 1727096400.83248: we have included files to process 24134 1727096400.83249: generating all_blocks data 24134 1727096400.83250: done generating all_blocks data 24134 1727096400.83251: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24134 1727096400.83251: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24134 1727096400.83253: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24134 1727096400.83457: done processing included file 24134 1727096400.83459: iterating over new_blocks loaded from include file 24134 1727096400.83460: in VariableManager get_vars() 24134 1727096400.83472: done with get_vars() 24134 1727096400.83473: filtering new block on tags 24134 1727096400.83484: done filtering new block on tags 24134 1727096400.83485: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 24134 1727096400.83488: extending task lists for all hosts with included blocks 24134 1727096400.83547: done extending task lists 24134 1727096400.83547: done processing included files 24134 1727096400.83548: results queue empty 24134 1727096400.83548: checking for any_errors_fatal 24134 1727096400.83550: done checking for any_errors_fatal 24134 1727096400.83551: checking for max_fail_percentage 24134 1727096400.83551: done 
checking for max_fail_percentage 24134 1727096400.83552: checking to see if all hosts have failed and the running result is not ok 24134 1727096400.83552: done checking to see if all hosts have failed 24134 1727096400.83552: getting the remaining hosts for this loop 24134 1727096400.83553: done getting the remaining hosts for this loop 24134 1727096400.83555: getting the next task for host managed_node1 24134 1727096400.83557: done getting next task for host managed_node1 24134 1727096400.83559: ^ task is: TASK: Gather current interface info 24134 1727096400.83560: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096400.83562: getting variables 24134 1727096400.83563: in VariableManager get_vars() 24134 1727096400.83572: Calling all_inventory to load vars for managed_node1 24134 1727096400.83573: Calling groups_inventory to load vars for managed_node1 24134 1727096400.83575: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096400.83578: Calling all_plugins_play to load vars for managed_node1 24134 1727096400.83579: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096400.83581: Calling groups_plugins_play to load vars for managed_node1 24134 1727096400.83661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096400.83771: done with get_vars() 24134 1727096400.83776: done getting variables 24134 1727096400.83801: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:00:00 -0400 (0:00:00.018) 0:00:05.051 ****** 24134 1727096400.83820: entering _queue_task() for managed_node1/command 24134 1727096400.84017: worker is 1 (out of 1 available) 24134 1727096400.84029: exiting _queue_task() for managed_node1/command 24134 1727096400.84041: done queuing things up, now waiting for results queue to drain 24134 1727096400.84043: waiting for pending results... 
24134 1727096400.84185: running TaskExecutor() for managed_node1/TASK: Gather current interface info 24134 1727096400.84250: in run() - task 0afff68d-5257-1673-d3fc-000000000192 24134 1727096400.84259: variable 'ansible_search_path' from source: unknown 24134 1727096400.84264: variable 'ansible_search_path' from source: unknown 24134 1727096400.84297: calling self._execute() 24134 1727096400.84352: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096400.84356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096400.84363: variable 'omit' from source: magic vars 24134 1727096400.84679: variable 'ansible_distribution_major_version' from source: facts 24134 1727096400.84689: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096400.84694: variable 'omit' from source: magic vars 24134 1727096400.84723: variable 'omit' from source: magic vars 24134 1727096400.84750: variable 'omit' from source: magic vars 24134 1727096400.84784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096400.84811: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096400.84827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096400.84839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096400.84854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096400.84876: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096400.84880: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096400.84882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 
1727096400.84948: Set connection var ansible_shell_executable to /bin/sh 24134 1727096400.84953: Set connection var ansible_pipelining to False 24134 1727096400.84956: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096400.84969: Set connection var ansible_timeout to 10 24134 1727096400.84972: Set connection var ansible_connection to ssh 24134 1727096400.84975: Set connection var ansible_shell_type to sh 24134 1727096400.84993: variable 'ansible_shell_executable' from source: unknown 24134 1727096400.84996: variable 'ansible_connection' from source: unknown 24134 1727096400.84999: variable 'ansible_module_compression' from source: unknown 24134 1727096400.85001: variable 'ansible_shell_type' from source: unknown 24134 1727096400.85004: variable 'ansible_shell_executable' from source: unknown 24134 1727096400.85006: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096400.85008: variable 'ansible_pipelining' from source: unknown 24134 1727096400.85013: variable 'ansible_timeout' from source: unknown 24134 1727096400.85016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096400.85118: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096400.85126: variable 'omit' from source: magic vars 24134 1727096400.85133: starting attempt loop 24134 1727096400.85135: running the handler 24134 1727096400.85149: _low_level_execute_command(): starting 24134 1727096400.85156: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096400.85670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096400.85676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096400.85679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096400.85732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096400.85742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096400.85747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096400.85820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096400.88374: stdout chunk (state=3): >>>/root <<< 24134 1727096400.88515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096400.88550: stderr chunk (state=3): >>><<< 24134 1727096400.88553: stdout chunk (state=3): >>><<< 24134 1727096400.88580: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096400.88591: _low_level_execute_command(): starting 24134 1727096400.88597: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330 `" && echo ansible-tmp-1727096400.8858027-24423-98535011977330="` echo /root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330 `" ) && sleep 0' 24134 1727096400.89056: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096400.89059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096400.89062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24134 1727096400.89088: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096400.89091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096400.89126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096400.89129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096400.89134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096400.89209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096400.92059: stdout chunk (state=3): >>>ansible-tmp-1727096400.8858027-24423-98535011977330=/root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330 <<< 24134 1727096400.92391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096400.92395: stdout chunk (state=3): >>><<< 24134 1727096400.92397: stderr chunk (state=3): >>><<< 24134 1727096400.92401: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096400.8858027-24423-98535011977330=/root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096400.92403: variable 'ansible_module_compression' from source: unknown 24134 1727096400.92405: ANSIBALLZ: Using generic lock for ansible.legacy.command 24134 1727096400.92407: ANSIBALLZ: Acquiring lock 24134 1727096400.92409: ANSIBALLZ: Lock acquired: 140085163806880 24134 1727096400.92411: ANSIBALLZ: Creating module 24134 1727096401.03190: ANSIBALLZ: Writing module into payload 24134 1727096401.03289: ANSIBALLZ: Writing module 24134 1727096401.03315: ANSIBALLZ: Renaming module 24134 1727096401.03325: ANSIBALLZ: Done creating module 24134 1727096401.03348: variable 'ansible_facts' from source: unknown 24134 1727096401.03430: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/AnsiballZ_command.py 24134 1727096401.03660: Sending initial data 24134 1727096401.03676: Sent initial data (155 bytes) 24134 1727096401.04107: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096401.04130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096401.04145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.04193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096401.04196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096401.04199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096401.04282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096401.06795: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096401.06912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096401.06990: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpqc8zbqd1 /root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/AnsiballZ_command.py <<< 24134 1727096401.06993: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/AnsiballZ_command.py" <<< 24134 1727096401.07104: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpqc8zbqd1" to remote "/root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/AnsiballZ_command.py" <<< 24134 1727096401.08082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096401.08253: stderr chunk (state=3): >>><<< 24134 1727096401.08257: stdout chunk (state=3): >>><<< 24134 1727096401.08259: done transferring module to remote 24134 1727096401.08261: _low_level_execute_command(): starting 24134 1727096401.08263: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/ /root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/AnsiballZ_command.py && sleep 0' 24134 1727096401.08872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096401.08889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096401.08965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096401.08985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096401.09047: stderr chunk (state=3): >>>debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.09126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096401.09190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096401.09353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096401.11961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096401.11995: stderr chunk (state=3): >>><<< 24134 1727096401.12007: stdout chunk (state=3): >>><<< 24134 1727096401.12062: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096401.12070: _low_level_execute_command(): starting 24134 1727096401.12074: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/AnsiballZ_command.py && sleep 0' 24134 1727096401.12660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096401.12708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096401.12711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096401.12713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.12715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096401.12733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096401.12740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.12775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096401.12806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096401.12885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096401.38262: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:01.375594", "end": "2024-09-23 09:00:01.380906", "delta": "0:00:00.005312", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 24134 1727096401.38306: stdout chunk (state=3): >>> <<< 24134 1727096401.40704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096401.40739: stderr chunk (state=3): >>><<< 24134 1727096401.40743: stdout chunk (state=3): >>><<< 24134 1727096401.40759: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:01.375594", "end": "2024-09-23 09:00:01.380906", "delta": "0:00:00.005312", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
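The module result above ends with stdout `"bonding_masters\neth0\nlo"`, and the later "Set current_interfaces" task turns that into the fact `current_interfaces: ['bonding_masters', 'eth0', 'lo']`. As a minimal sketch of that transformation (an assumption for illustration only — the JSON payload is copied from the log chunk above, but the helper name is hypothetical and not part of Ansible's API):

```python
import json

# JSON result of the `ls -1` command module, abridged from the stdout chunk
# in the log above (only the fields used here are kept).
MODULE_RESULT = '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"]}'

def interfaces_from_result(raw: str) -> list[str]:
    """Split the command module's stdout into one interface name per line,
    mirroring what the set_fact task does with `ls -1 /sys/class/net` output."""
    result = json.loads(raw)
    if result.get("rc", 1) != 0:
        raise RuntimeError("ls -1 failed: " + result.get("stderr", ""))
    return result["stdout"].splitlines()

print(interfaces_from_result(MODULE_RESULT))
# → ['bonding_masters', 'eth0', 'lo']
```

This matches the fact value the log reports a few steps later under "Set current_interfaces".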
24134 1727096401.40792: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096401.40799: _low_level_execute_command(): starting 24134 1727096401.40804: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096400.8858027-24423-98535011977330/ > /dev/null 2>&1 && sleep 0' 24134 1727096401.41271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096401.41275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096401.41277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.41279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096401.41287: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.41332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096401.41336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096401.41417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096401.44080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096401.44109: stderr chunk (state=3): >>><<< 24134 1727096401.44112: stdout chunk (state=3): >>><<< 24134 1727096401.44123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096401.44129: handler run complete 24134 1727096401.44146: Evaluated conditional (False): False 24134 1727096401.44154: attempt loop complete, returning result 24134 1727096401.44156: _execute() done 24134 1727096401.44159: dumping result to json 24134 1727096401.44164: done dumping result, returning 24134 1727096401.44173: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0afff68d-5257-1673-d3fc-000000000192] 24134 1727096401.44194: sending task result for task 0afff68d-5257-1673-d3fc-000000000192 24134 1727096401.44279: done sending task result for task 0afff68d-5257-1673-d3fc-000000000192 24134 1727096401.44281: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.005312", "end": "2024-09-23 09:00:01.380906", "rc": 0, "start": "2024-09-23 09:00:01.375594" } STDOUT: bonding_masters eth0 lo 24134 1727096401.44363: no more pending results, returning what we have 24134 1727096401.44367: results queue empty 24134 1727096401.44371: checking for any_errors_fatal 24134 1727096401.44373: done checking for any_errors_fatal 24134 1727096401.44373: checking for max_fail_percentage 24134 1727096401.44375: done checking for max_fail_percentage 24134 1727096401.44376: checking to see if all hosts have failed and the running result is not ok 24134 1727096401.44377: done checking to see if all hosts have failed 24134 1727096401.44377: getting the remaining hosts for this loop 24134 1727096401.44379: done getting the remaining hosts for this loop 24134 1727096401.44382: getting the next task for host managed_node1 24134 1727096401.44387: done getting next task for host managed_node1 24134 1727096401.44389: ^ task is: TASK: Set current_interfaces 24134 1727096401.44393: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096401.44396: getting variables 24134 1727096401.44452: in VariableManager get_vars() 24134 1727096401.44488: Calling all_inventory to load vars for managed_node1 24134 1727096401.44491: Calling groups_inventory to load vars for managed_node1 24134 1727096401.44493: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.44501: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.44503: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.44506: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.44626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.44745: done with get_vars() 24134 1727096401.44753: done getting variables 24134 1727096401.44801: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:00:01 -0400 (0:00:00.610) 0:00:05.661 ****** 24134 1727096401.44834: entering _queue_task() for managed_node1/set_fact 24134 1727096401.45047: worker is 1 (out of 1 available) 24134 1727096401.45060: exiting _queue_task() for managed_node1/set_fact 24134 1727096401.45073: done queuing things up, now waiting for results queue to drain 24134 1727096401.45075: waiting for pending results... 24134 1727096401.45219: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 24134 1727096401.45292: in run() - task 0afff68d-5257-1673-d3fc-000000000193 24134 1727096401.45305: variable 'ansible_search_path' from source: unknown 24134 1727096401.45310: variable 'ansible_search_path' from source: unknown 24134 1727096401.45341: calling self._execute() 24134 1727096401.45401: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.45406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.45414: variable 'omit' from source: magic vars 24134 1727096401.45684: variable 'ansible_distribution_major_version' from source: facts 24134 1727096401.45694: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096401.45700: variable 'omit' from source: magic vars 24134 1727096401.45730: variable 'omit' from source: magic vars 24134 1727096401.45809: variable '_current_interfaces' from source: set_fact 24134 1727096401.45861: variable 'omit' from source: magic vars 24134 1727096401.45891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096401.45925: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096401.45939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 
1727096401.45954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096401.45971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096401.45991: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096401.45994: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.45996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.46061: Set connection var ansible_shell_executable to /bin/sh 24134 1727096401.46066: Set connection var ansible_pipelining to False 24134 1727096401.46080: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096401.46083: Set connection var ansible_timeout to 10 24134 1727096401.46086: Set connection var ansible_connection to ssh 24134 1727096401.46088: Set connection var ansible_shell_type to sh 24134 1727096401.46105: variable 'ansible_shell_executable' from source: unknown 24134 1727096401.46108: variable 'ansible_connection' from source: unknown 24134 1727096401.46111: variable 'ansible_module_compression' from source: unknown 24134 1727096401.46113: variable 'ansible_shell_type' from source: unknown 24134 1727096401.46115: variable 'ansible_shell_executable' from source: unknown 24134 1727096401.46117: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.46120: variable 'ansible_pipelining' from source: unknown 24134 1727096401.46122: variable 'ansible_timeout' from source: unknown 24134 1727096401.46127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.46229: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096401.46237: variable 'omit' from source: magic vars 24134 1727096401.46243: starting attempt loop 24134 1727096401.46247: running the handler 24134 1727096401.46256: handler run complete 24134 1727096401.46263: attempt loop complete, returning result 24134 1727096401.46265: _execute() done 24134 1727096401.46270: dumping result to json 24134 1727096401.46275: done dumping result, returning 24134 1727096401.46282: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0afff68d-5257-1673-d3fc-000000000193] 24134 1727096401.46288: sending task result for task 0afff68d-5257-1673-d3fc-000000000193 24134 1727096401.46364: done sending task result for task 0afff68d-5257-1673-d3fc-000000000193 24134 1727096401.46366: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 24134 1727096401.46464: no more pending results, returning what we have 24134 1727096401.46466: results queue empty 24134 1727096401.46469: checking for any_errors_fatal 24134 1727096401.46475: done checking for any_errors_fatal 24134 1727096401.46475: checking for max_fail_percentage 24134 1727096401.46477: done checking for max_fail_percentage 24134 1727096401.46477: checking to see if all hosts have failed and the running result is not ok 24134 1727096401.46478: done checking to see if all hosts have failed 24134 1727096401.46479: getting the remaining hosts for this loop 24134 1727096401.46480: done getting the remaining hosts for this loop 24134 1727096401.46483: getting the next task for host managed_node1 24134 1727096401.46489: done getting next task for host managed_node1 24134 1727096401.46491: ^ task is: TASK: Show current_interfaces 24134 1727096401.46494: ^ state is: HOST STATE: block=2, task=3, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096401.46497: getting variables 24134 1727096401.46498: in VariableManager get_vars() 24134 1727096401.46526: Calling all_inventory to load vars for managed_node1 24134 1727096401.46528: Calling groups_inventory to load vars for managed_node1 24134 1727096401.46530: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.46538: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.46540: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.46542: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.46712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.46982: done with get_vars() 24134 1727096401.46991: done getting variables 24134 1727096401.47089: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:00:01 -0400 (0:00:00.022) 0:00:05.684 ****** 24134 1727096401.47116: entering 
_queue_task() for managed_node1/debug 24134 1727096401.47118: Creating lock for debug 24134 1727096401.47532: worker is 1 (out of 1 available) 24134 1727096401.47542: exiting _queue_task() for managed_node1/debug 24134 1727096401.47553: done queuing things up, now waiting for results queue to drain 24134 1727096401.47554: waiting for pending results... 24134 1727096401.47731: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 24134 1727096401.47803: in run() - task 0afff68d-5257-1673-d3fc-000000000116 24134 1727096401.47812: variable 'ansible_search_path' from source: unknown 24134 1727096401.47815: variable 'ansible_search_path' from source: unknown 24134 1727096401.47842: calling self._execute() 24134 1727096401.47906: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.47910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.47917: variable 'omit' from source: magic vars 24134 1727096401.48174: variable 'ansible_distribution_major_version' from source: facts 24134 1727096401.48188: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096401.48195: variable 'omit' from source: magic vars 24134 1727096401.48221: variable 'omit' from source: magic vars 24134 1727096401.48291: variable 'current_interfaces' from source: set_fact 24134 1727096401.48312: variable 'omit' from source: magic vars 24134 1727096401.48342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096401.48373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096401.48387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096401.48402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096401.48412: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096401.48437: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096401.48440: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.48443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.48510: Set connection var ansible_shell_executable to /bin/sh 24134 1727096401.48515: Set connection var ansible_pipelining to False 24134 1727096401.48521: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096401.48528: Set connection var ansible_timeout to 10 24134 1727096401.48531: Set connection var ansible_connection to ssh 24134 1727096401.48533: Set connection var ansible_shell_type to sh 24134 1727096401.48551: variable 'ansible_shell_executable' from source: unknown 24134 1727096401.48554: variable 'ansible_connection' from source: unknown 24134 1727096401.48557: variable 'ansible_module_compression' from source: unknown 24134 1727096401.48559: variable 'ansible_shell_type' from source: unknown 24134 1727096401.48563: variable 'ansible_shell_executable' from source: unknown 24134 1727096401.48565: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.48571: variable 'ansible_pipelining' from source: unknown 24134 1727096401.48574: variable 'ansible_timeout' from source: unknown 24134 1727096401.48576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.48676: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096401.48684: variable 'omit' from source: magic vars 24134 
1727096401.48688: starting attempt loop 24134 1727096401.48691: running the handler 24134 1727096401.48727: handler run complete 24134 1727096401.48739: attempt loop complete, returning result 24134 1727096401.48742: _execute() done 24134 1727096401.48745: dumping result to json 24134 1727096401.48748: done dumping result, returning 24134 1727096401.48750: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0afff68d-5257-1673-d3fc-000000000116] 24134 1727096401.48757: sending task result for task 0afff68d-5257-1673-d3fc-000000000116 24134 1727096401.48833: done sending task result for task 0afff68d-5257-1673-d3fc-000000000116 24134 1727096401.48836: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 24134 1727096401.48913: no more pending results, returning what we have 24134 1727096401.48916: results queue empty 24134 1727096401.48917: checking for any_errors_fatal 24134 1727096401.48921: done checking for any_errors_fatal 24134 1727096401.48922: checking for max_fail_percentage 24134 1727096401.48923: done checking for max_fail_percentage 24134 1727096401.48924: checking to see if all hosts have failed and the running result is not ok 24134 1727096401.48925: done checking to see if all hosts have failed 24134 1727096401.48925: getting the remaining hosts for this loop 24134 1727096401.48927: done getting the remaining hosts for this loop 24134 1727096401.48930: getting the next task for host managed_node1 24134 1727096401.48936: done getting next task for host managed_node1 24134 1727096401.48938: ^ task is: TASK: Include the task 'manage_test_interface.yml' 24134 1727096401.48940: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096401.48943: getting variables 24134 1727096401.48946: in VariableManager get_vars() 24134 1727096401.48979: Calling all_inventory to load vars for managed_node1 24134 1727096401.48981: Calling groups_inventory to load vars for managed_node1 24134 1727096401.48983: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.48991: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.48994: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.48997: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.49101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.49217: done with get_vars() 24134 1727096401.49224: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:16 Monday 23 September 2024 09:00:01 -0400 (0:00:00.021) 0:00:05.706 ****** 24134 1727096401.49284: entering _queue_task() for managed_node1/include_tasks 24134 1727096401.49462: worker is 1 (out of 1 available) 24134 1727096401.49480: exiting _queue_task() for managed_node1/include_tasks 24134 1727096401.49491: done queuing things up, now waiting for results queue to drain 24134 1727096401.49492: waiting for pending results... 
24134 1727096401.49885: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 24134 1727096401.49889: in run() - task 0afff68d-5257-1673-d3fc-00000000000d 24134 1727096401.49892: variable 'ansible_search_path' from source: unknown 24134 1727096401.49895: calling self._execute() 24134 1727096401.49897: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.49899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.49902: variable 'omit' from source: magic vars 24134 1727096401.50206: variable 'ansible_distribution_major_version' from source: facts 24134 1727096401.50221: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096401.50229: _execute() done 24134 1727096401.50235: dumping result to json 24134 1727096401.50241: done dumping result, returning 24134 1727096401.50250: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-1673-d3fc-00000000000d] 24134 1727096401.50258: sending task result for task 0afff68d-5257-1673-d3fc-00000000000d 24134 1727096401.50353: done sending task result for task 0afff68d-5257-1673-d3fc-00000000000d 24134 1727096401.50360: WORKER PROCESS EXITING 24134 1727096401.50408: no more pending results, returning what we have 24134 1727096401.50413: in VariableManager get_vars() 24134 1727096401.50450: Calling all_inventory to load vars for managed_node1 24134 1727096401.50453: Calling groups_inventory to load vars for managed_node1 24134 1727096401.50455: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.50466: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.50472: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.50475: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.50730: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.50933: done with get_vars() 24134 1727096401.50940: variable 'ansible_search_path' from source: unknown 24134 1727096401.50952: we have included files to process 24134 1727096401.50953: generating all_blocks data 24134 1727096401.50954: done generating all_blocks data 24134 1727096401.50958: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24134 1727096401.50959: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24134 1727096401.50961: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24134 1727096401.51499: in VariableManager get_vars() 24134 1727096401.51517: done with get_vars() 24134 1727096401.51943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 24134 1727096401.52535: done processing included file 24134 1727096401.52537: iterating over new_blocks loaded from include file 24134 1727096401.52539: in VariableManager get_vars() 24134 1727096401.52552: done with get_vars() 24134 1727096401.52554: filtering new block on tags 24134 1727096401.52594: done filtering new block on tags 24134 1727096401.52597: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 24134 1727096401.52601: extending task lists for all hosts with included blocks 24134 1727096401.53745: done extending task lists 24134 1727096401.53747: done processing included files 24134 1727096401.53747: results queue empty 24134 1727096401.53748: checking for any_errors_fatal 24134 1727096401.53750: done checking for 
any_errors_fatal 24134 1727096401.53751: checking for max_fail_percentage 24134 1727096401.53752: done checking for max_fail_percentage 24134 1727096401.53753: checking to see if all hosts have failed and the running result is not ok 24134 1727096401.53754: done checking to see if all hosts have failed 24134 1727096401.53754: getting the remaining hosts for this loop 24134 1727096401.53762: done getting the remaining hosts for this loop 24134 1727096401.53765: getting the next task for host managed_node1 24134 1727096401.53773: done getting next task for host managed_node1 24134 1727096401.53775: ^ task is: TASK: Ensure state in ["present", "absent"] 24134 1727096401.53777: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096401.53779: getting variables 24134 1727096401.53780: in VariableManager get_vars() 24134 1727096401.53790: Calling all_inventory to load vars for managed_node1 24134 1727096401.53792: Calling groups_inventory to load vars for managed_node1 24134 1727096401.53794: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.53799: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.53801: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.53804: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.53977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.54182: done with get_vars() 24134 1727096401.54190: done getting variables 24134 1727096401.54257: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 09:00:01 -0400 (0:00:00.049) 0:00:05.756 ****** 24134 1727096401.54286: entering _queue_task() for managed_node1/fail 24134 1727096401.54287: Creating lock for fail 24134 1727096401.54594: worker is 1 (out of 1 available) 24134 1727096401.54605: exiting _queue_task() for managed_node1/fail 24134 1727096401.54617: done queuing things up, now waiting for results queue to drain 24134 1727096401.54618: waiting for pending results... 
24134 1727096401.54839: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 24134 1727096401.54931: in run() - task 0afff68d-5257-1673-d3fc-0000000001ae 24134 1727096401.54948: variable 'ansible_search_path' from source: unknown 24134 1727096401.54955: variable 'ansible_search_path' from source: unknown 24134 1727096401.55005: calling self._execute() 24134 1727096401.55097: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.55108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.55119: variable 'omit' from source: magic vars 24134 1727096401.55496: variable 'ansible_distribution_major_version' from source: facts 24134 1727096401.55521: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096401.55732: variable 'state' from source: include params 24134 1727096401.55736: Evaluated conditional (state not in ["present", "absent"]): False 24134 1727096401.55738: when evaluation is False, skipping this task 24134 1727096401.55740: _execute() done 24134 1727096401.55742: dumping result to json 24134 1727096401.55744: done dumping result, returning 24134 1727096401.55746: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-1673-d3fc-0000000001ae] 24134 1727096401.55748: sending task result for task 0afff68d-5257-1673-d3fc-0000000001ae 24134 1727096401.55812: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001ae 24134 1727096401.55815: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 24134 1727096401.55883: no more pending results, returning what we have 24134 1727096401.55886: results queue empty 24134 1727096401.55887: checking for any_errors_fatal 24134 1727096401.55889: done checking for any_errors_fatal 24134 1727096401.55889: 
checking for max_fail_percentage 24134 1727096401.55891: done checking for max_fail_percentage 24134 1727096401.55892: checking to see if all hosts have failed and the running result is not ok 24134 1727096401.55893: done checking to see if all hosts have failed 24134 1727096401.55894: getting the remaining hosts for this loop 24134 1727096401.55895: done getting the remaining hosts for this loop 24134 1727096401.55899: getting the next task for host managed_node1 24134 1727096401.55904: done getting next task for host managed_node1 24134 1727096401.55907: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 24134 1727096401.55910: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096401.55913: getting variables 24134 1727096401.55916: in VariableManager get_vars() 24134 1727096401.55953: Calling all_inventory to load vars for managed_node1 24134 1727096401.55956: Calling groups_inventory to load vars for managed_node1 24134 1727096401.55958: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.55973: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.55976: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.55980: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.56359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.56605: done with get_vars() 24134 1727096401.56620: done getting variables 24134 1727096401.56681: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 09:00:01 -0400 (0:00:00.024) 0:00:05.780 ****** 24134 1727096401.56707: entering _queue_task() for managed_node1/fail 24134 1727096401.56937: worker is 1 (out of 1 available) 24134 1727096401.57064: exiting _queue_task() for managed_node1/fail 24134 1727096401.57079: done queuing things up, now waiting for results queue to drain 24134 1727096401.57080: waiting for pending results... 
24134 1727096401.57290: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 24134 1727096401.57323: in run() - task 0afff68d-5257-1673-d3fc-0000000001af 24134 1727096401.57339: variable 'ansible_search_path' from source: unknown 24134 1727096401.57346: variable 'ansible_search_path' from source: unknown 24134 1727096401.57393: calling self._execute() 24134 1727096401.57478: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.57521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.57524: variable 'omit' from source: magic vars 24134 1727096401.57877: variable 'ansible_distribution_major_version' from source: facts 24134 1727096401.57893: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096401.58052: variable 'type' from source: set_fact 24134 1727096401.58149: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 24134 1727096401.58153: when evaluation is False, skipping this task 24134 1727096401.58155: _execute() done 24134 1727096401.58157: dumping result to json 24134 1727096401.58160: done dumping result, returning 24134 1727096401.58162: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-1673-d3fc-0000000001af] 24134 1727096401.58166: sending task result for task 0afff68d-5257-1673-d3fc-0000000001af 24134 1727096401.58228: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001af 24134 1727096401.58231: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 24134 1727096401.58286: no more pending results, returning what we have 24134 1727096401.58290: results queue empty 24134 1727096401.58291: checking for any_errors_fatal 24134 1727096401.58296: done checking for any_errors_fatal 24134 1727096401.58297: 
checking for max_fail_percentage 24134 1727096401.58298: done checking for max_fail_percentage 24134 1727096401.58299: checking to see if all hosts have failed and the running result is not ok 24134 1727096401.58300: done checking to see if all hosts have failed 24134 1727096401.58301: getting the remaining hosts for this loop 24134 1727096401.58302: done getting the remaining hosts for this loop 24134 1727096401.58305: getting the next task for host managed_node1 24134 1727096401.58311: done getting next task for host managed_node1 24134 1727096401.58314: ^ task is: TASK: Include the task 'show_interfaces.yml' 24134 1727096401.58317: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096401.58320: getting variables 24134 1727096401.58321: in VariableManager get_vars() 24134 1727096401.58355: Calling all_inventory to load vars for managed_node1 24134 1727096401.58477: Calling groups_inventory to load vars for managed_node1 24134 1727096401.58481: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.58490: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.58492: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.58495: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.58730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.58959: done with get_vars() 24134 1727096401.58972: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 09:00:01 -0400 (0:00:00.023) 0:00:05.804 ****** 24134 1727096401.59072: entering _queue_task() for managed_node1/include_tasks 24134 1727096401.59377: worker is 1 (out of 1 available) 24134 1727096401.59394: exiting _queue_task() for managed_node1/include_tasks 24134 1727096401.59405: done queuing things up, now waiting for results queue to drain 24134 1727096401.59406: waiting for pending results... 
24134 1727096401.59689: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 24134 1727096401.59707: in run() - task 0afff68d-5257-1673-d3fc-0000000001b0 24134 1727096401.59729: variable 'ansible_search_path' from source: unknown 24134 1727096401.59737: variable 'ansible_search_path' from source: unknown 24134 1727096401.59776: calling self._execute() 24134 1727096401.59873: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.59938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.59942: variable 'omit' from source: magic vars 24134 1727096401.60287: variable 'ansible_distribution_major_version' from source: facts 24134 1727096401.60303: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096401.60312: _execute() done 24134 1727096401.60319: dumping result to json 24134 1727096401.60332: done dumping result, returning 24134 1727096401.60341: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-1673-d3fc-0000000001b0] 24134 1727096401.60350: sending task result for task 0afff68d-5257-1673-d3fc-0000000001b0 24134 1727096401.60514: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001b0 24134 1727096401.60517: WORKER PROCESS EXITING 24134 1727096401.60565: no more pending results, returning what we have 24134 1727096401.60574: in VariableManager get_vars() 24134 1727096401.60621: Calling all_inventory to load vars for managed_node1 24134 1727096401.60624: Calling groups_inventory to load vars for managed_node1 24134 1727096401.60627: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.60639: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.60642: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.60646: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.61103: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.61307: done with get_vars() 24134 1727096401.61322: variable 'ansible_search_path' from source: unknown 24134 1727096401.61324: variable 'ansible_search_path' from source: unknown 24134 1727096401.61358: we have included files to process 24134 1727096401.61359: generating all_blocks data 24134 1727096401.61360: done generating all_blocks data 24134 1727096401.61363: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24134 1727096401.61364: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24134 1727096401.61366: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24134 1727096401.61477: in VariableManager get_vars() 24134 1727096401.61498: done with get_vars() 24134 1727096401.61608: done processing included file 24134 1727096401.61609: iterating over new_blocks loaded from include file 24134 1727096401.61611: in VariableManager get_vars() 24134 1727096401.61625: done with get_vars() 24134 1727096401.61626: filtering new block on tags 24134 1727096401.61651: done filtering new block on tags 24134 1727096401.61653: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 24134 1727096401.61658: extending task lists for all hosts with included blocks 24134 1727096401.62057: done extending task lists 24134 1727096401.62059: done processing included files 24134 1727096401.62059: results queue empty 24134 1727096401.62060: checking for any_errors_fatal 24134 1727096401.62063: done checking for any_errors_fatal 24134 1727096401.62063: checking for 
max_fail_percentage 24134 1727096401.62064: done checking for max_fail_percentage 24134 1727096401.62065: checking to see if all hosts have failed and the running result is not ok 24134 1727096401.62066: done checking to see if all hosts have failed 24134 1727096401.62066: getting the remaining hosts for this loop 24134 1727096401.62072: done getting the remaining hosts for this loop 24134 1727096401.62074: getting the next task for host managed_node1 24134 1727096401.62084: done getting next task for host managed_node1 24134 1727096401.62087: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 24134 1727096401.62090: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096401.62092: getting variables 24134 1727096401.62093: in VariableManager get_vars() 24134 1727096401.62103: Calling all_inventory to load vars for managed_node1 24134 1727096401.62105: Calling groups_inventory to load vars for managed_node1 24134 1727096401.62106: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.62111: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.62113: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.62116: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.62259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.62500: done with get_vars() 24134 1727096401.62518: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:00:01 -0400 (0:00:00.035) 0:00:05.839 ****** 24134 1727096401.62591: entering _queue_task() for managed_node1/include_tasks 24134 1727096401.62841: worker is 1 (out of 1 available) 24134 1727096401.62855: exiting _queue_task() for managed_node1/include_tasks 24134 1727096401.62869: done queuing things up, now waiting for results queue to drain 24134 1727096401.62870: waiting for pending results... 
24134 1727096401.63198: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 24134 1727096401.63217: in run() - task 0afff68d-5257-1673-d3fc-000000000245 24134 1727096401.63236: variable 'ansible_search_path' from source: unknown 24134 1727096401.63243: variable 'ansible_search_path' from source: unknown 24134 1727096401.63292: calling self._execute() 24134 1727096401.63383: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.63402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.63415: variable 'omit' from source: magic vars 24134 1727096401.63834: variable 'ansible_distribution_major_version' from source: facts 24134 1727096401.63838: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096401.63841: _execute() done 24134 1727096401.63845: dumping result to json 24134 1727096401.63852: done dumping result, returning 24134 1727096401.63861: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-1673-d3fc-000000000245] 24134 1727096401.63943: sending task result for task 0afff68d-5257-1673-d3fc-000000000245 24134 1727096401.64005: done sending task result for task 0afff68d-5257-1673-d3fc-000000000245 24134 1727096401.64008: WORKER PROCESS EXITING 24134 1727096401.64073: no more pending results, returning what we have 24134 1727096401.64077: in VariableManager get_vars() 24134 1727096401.64114: Calling all_inventory to load vars for managed_node1 24134 1727096401.64117: Calling groups_inventory to load vars for managed_node1 24134 1727096401.64119: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.64128: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.64130: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.64133: Calling groups_plugins_play to load vars for managed_node1 24134 
1727096401.64283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.64723: done with get_vars() 24134 1727096401.64731: variable 'ansible_search_path' from source: unknown 24134 1727096401.64732: variable 'ansible_search_path' from source: unknown 24134 1727096401.65291: we have included files to process 24134 1727096401.65292: generating all_blocks data 24134 1727096401.65294: done generating all_blocks data 24134 1727096401.65295: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24134 1727096401.65296: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24134 1727096401.65298: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24134 1727096401.65823: done processing included file 24134 1727096401.65825: iterating over new_blocks loaded from include file 24134 1727096401.65826: in VariableManager get_vars() 24134 1727096401.65844: done with get_vars() 24134 1727096401.65846: filtering new block on tags 24134 1727096401.65863: done filtering new block on tags 24134 1727096401.65866: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 24134 1727096401.66075: extending task lists for all hosts with included blocks 24134 1727096401.66229: done extending task lists 24134 1727096401.66230: done processing included files 24134 1727096401.66231: results queue empty 24134 1727096401.66232: checking for any_errors_fatal 24134 1727096401.66235: done checking for any_errors_fatal 24134 1727096401.66235: checking for max_fail_percentage 24134 1727096401.66236: done 
checking for max_fail_percentage 24134 1727096401.66237: checking to see if all hosts have failed and the running result is not ok 24134 1727096401.66238: done checking to see if all hosts have failed 24134 1727096401.66239: getting the remaining hosts for this loop 24134 1727096401.66240: done getting the remaining hosts for this loop 24134 1727096401.66243: getting the next task for host managed_node1 24134 1727096401.66248: done getting next task for host managed_node1 24134 1727096401.66250: ^ task is: TASK: Gather current interface info 24134 1727096401.66253: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096401.66256: getting variables 24134 1727096401.66257: in VariableManager get_vars() 24134 1727096401.66269: Calling all_inventory to load vars for managed_node1 24134 1727096401.66271: Calling groups_inventory to load vars for managed_node1 24134 1727096401.66273: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096401.66279: Calling all_plugins_play to load vars for managed_node1 24134 1727096401.66281: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096401.66284: Calling groups_plugins_play to load vars for managed_node1 24134 1727096401.66454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096401.66652: done with get_vars() 24134 1727096401.66662: done getting variables 24134 1727096401.66707: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:00:01 -0400 (0:00:00.041) 0:00:05.880 ****** 24134 1727096401.66741: entering _queue_task() for managed_node1/command 24134 1727096401.67003: worker is 1 (out of 1 available) 24134 1727096401.67016: exiting _queue_task() for managed_node1/command 24134 1727096401.67140: done queuing things up, now waiting for results queue to drain 24134 1727096401.67142: waiting for pending results... 
24134 1727096401.67588: running TaskExecutor() for managed_node1/TASK: Gather current interface info 24134 1727096401.67594: in run() - task 0afff68d-5257-1673-d3fc-00000000027c 24134 1727096401.67598: variable 'ansible_search_path' from source: unknown 24134 1727096401.67602: variable 'ansible_search_path' from source: unknown 24134 1727096401.67606: calling self._execute() 24134 1727096401.67609: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.67612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.67615: variable 'omit' from source: magic vars 24134 1727096401.67977: variable 'ansible_distribution_major_version' from source: facts 24134 1727096401.67981: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096401.67984: variable 'omit' from source: magic vars 24134 1727096401.68017: variable 'omit' from source: magic vars 24134 1727096401.68051: variable 'omit' from source: magic vars 24134 1727096401.68091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096401.68132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096401.68152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096401.68172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096401.68181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096401.68209: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096401.68212: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.68215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 
1727096401.68319: Set connection var ansible_shell_executable to /bin/sh 24134 1727096401.68322: Set connection var ansible_pipelining to False 24134 1727096401.68329: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096401.68347: Set connection var ansible_timeout to 10 24134 1727096401.68350: Set connection var ansible_connection to ssh 24134 1727096401.68352: Set connection var ansible_shell_type to sh 24134 1727096401.68373: variable 'ansible_shell_executable' from source: unknown 24134 1727096401.68377: variable 'ansible_connection' from source: unknown 24134 1727096401.68380: variable 'ansible_module_compression' from source: unknown 24134 1727096401.68382: variable 'ansible_shell_type' from source: unknown 24134 1727096401.68384: variable 'ansible_shell_executable' from source: unknown 24134 1727096401.68386: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096401.68391: variable 'ansible_pipelining' from source: unknown 24134 1727096401.68393: variable 'ansible_timeout' from source: unknown 24134 1727096401.68397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096401.68535: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096401.68545: variable 'omit' from source: magic vars 24134 1727096401.68558: starting attempt loop 24134 1727096401.68561: running the handler 24134 1727096401.68738: _low_level_execute_command(): starting 24134 1727096401.68742: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096401.69761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096401.69888: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.69984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096401.69997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096401.70047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096401.70179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096401.72729: stdout chunk (state=3): >>>/root <<< 24134 1727096401.73034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096401.73049: stderr chunk (state=3): >>><<< 24134 1727096401.73059: stdout chunk (state=3): >>><<< 24134 1727096401.73362: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096401.73366: _low_level_execute_command(): starting 24134 1727096401.73373: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903 `" && echo ansible-tmp-1727096401.731634-24456-195862669352903="` echo /root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903 `" ) && sleep 0' 24134 1727096401.74461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096401.74481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.74542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096401.74657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096401.74694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096401.77479: stdout chunk (state=3): >>>ansible-tmp-1727096401.731634-24456-195862669352903=/root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903 <<< 24134 1727096401.77639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096401.77666: stderr chunk (state=3): >>><<< 24134 1727096401.77674: stdout chunk (state=3): >>><<< 24134 1727096401.77692: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096401.731634-24456-195862669352903=/root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096401.77726: variable 'ansible_module_compression' from source: unknown 24134 1727096401.77781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096401.77817: variable 'ansible_facts' from source: unknown 24134 1727096401.77907: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/AnsiballZ_command.py 24134 1727096401.78032: Sending initial data 24134 1727096401.78035: Sent initial data (155 bytes) 24134 1727096401.78721: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096401.78725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
24134 1727096401.78727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096401.78730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096401.78781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096401.78859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096401.81235: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24134 1727096401.81238: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096401.81375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096401.81478: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpnxoznmi2 /root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/AnsiballZ_command.py <<< 24134 1727096401.81482: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/AnsiballZ_command.py" <<< 24134 1727096401.81540: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpnxoznmi2" to remote "/root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/AnsiballZ_command.py" <<< 24134 1727096401.82396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096401.82431: stderr chunk (state=3): >>><<< 24134 1727096401.82435: stdout chunk (state=3): >>><<< 24134 1727096401.82471: done transferring module to remote 24134 1727096401.82479: _low_level_execute_command(): starting 24134 1727096401.82483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/ /root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/AnsiballZ_command.py && sleep 0' 24134 1727096401.82892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096401.82895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.82898: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096401.82901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096401.82903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.82943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096401.82947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096401.83020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096401.85635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096401.85653: stderr chunk (state=3): >>><<< 24134 1727096401.85657: stdout chunk (state=3): >>><<< 24134 1727096401.85671: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096401.85675: _low_level_execute_command(): starting 24134 1727096401.85681: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/AnsiballZ_command.py && sleep 0' 24134 1727096401.86085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096401.86088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.86090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096401.86092: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096401.86094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096401.86149: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096401.86152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096401.86222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096402.10678: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:02.095822", "end": "2024-09-23 09:00:02.100731", "delta": "0:00:00.004909", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096402.12593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096402.12600: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096402.12652: stderr chunk (state=3): >>><<< 24134 1727096402.12791: stdout chunk (state=3): >>><<< 24134 1727096402.12811: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:02.095822", "end": "2024-09-23 09:00:02.100731", "delta": "0:00:00.004909", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096402.12850: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096402.12859: _low_level_execute_command(): starting 24134 1727096402.12865: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096401.731634-24456-195862669352903/ > /dev/null 2>&1 && sleep 0' 24134 1727096402.13761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096402.13791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096402.13914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096402.16673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096402.16685: stdout chunk (state=3): >>><<< 24134 1727096402.16697: stderr chunk (state=3): >>><<< 24134 1727096402.16723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096402.16745: handler run complete 24134 1727096402.16898: Evaluated conditional (False): False 24134 1727096402.16901: attempt loop complete, returning 
result 24134 1727096402.16904: _execute() done 24134 1727096402.16906: dumping result to json 24134 1727096402.16908: done dumping result, returning 24134 1727096402.16910: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0afff68d-5257-1673-d3fc-00000000027c] 24134 1727096402.16912: sending task result for task 0afff68d-5257-1673-d3fc-00000000027c 24134 1727096402.16995: done sending task result for task 0afff68d-5257-1673-d3fc-00000000027c 24134 1727096402.16998: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004909", "end": "2024-09-23 09:00:02.100731", "rc": 0, "start": "2024-09-23 09:00:02.095822" } STDOUT: bonding_masters eth0 lo 24134 1727096402.17088: no more pending results, returning what we have 24134 1727096402.17092: results queue empty 24134 1727096402.17093: checking for any_errors_fatal 24134 1727096402.17095: done checking for any_errors_fatal 24134 1727096402.17095: checking for max_fail_percentage 24134 1727096402.17097: done checking for max_fail_percentage 24134 1727096402.17098: checking to see if all hosts have failed and the running result is not ok 24134 1727096402.17099: done checking to see if all hosts have failed 24134 1727096402.17100: getting the remaining hosts for this loop 24134 1727096402.17101: done getting the remaining hosts for this loop 24134 1727096402.17104: getting the next task for host managed_node1 24134 1727096402.17114: done getting next task for host managed_node1 24134 1727096402.17116: ^ task is: TASK: Set current_interfaces 24134 1727096402.17121: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096402.17126: getting variables 24134 1727096402.17128: in VariableManager get_vars() 24134 1727096402.17165: Calling all_inventory to load vars for managed_node1 24134 1727096402.17579: Calling groups_inventory to load vars for managed_node1 24134 1727096402.17584: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096402.17594: Calling all_plugins_play to load vars for managed_node1 24134 1727096402.17598: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096402.17601: Calling groups_plugins_play to load vars for managed_node1 24134 1727096402.17976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096402.18229: done with get_vars() 24134 1727096402.18240: done getting variables 24134 1727096402.18304: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** 
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:00:02 -0400 (0:00:00.515) 0:00:06.396 ****** 24134 1727096402.18341: entering _queue_task() for managed_node1/set_fact 24134 1727096402.18824: worker is 1 (out of 1 available) 24134 1727096402.18837: exiting _queue_task() for managed_node1/set_fact 24134 1727096402.18852: done queuing things up, now waiting for results queue to drain 24134 1727096402.18854: waiting for pending results... 24134 1727096402.19424: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 24134 1727096402.19430: in run() - task 0afff68d-5257-1673-d3fc-00000000027d 24134 1727096402.19434: variable 'ansible_search_path' from source: unknown 24134 1727096402.19522: variable 'ansible_search_path' from source: unknown 24134 1727096402.19526: calling self._execute() 24134 1727096402.19572: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096402.19585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096402.19599: variable 'omit' from source: magic vars 24134 1727096402.19978: variable 'ansible_distribution_major_version' from source: facts 24134 1727096402.19996: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096402.20008: variable 'omit' from source: magic vars 24134 1727096402.20077: variable 'omit' from source: magic vars 24134 1727096402.20190: variable '_current_interfaces' from source: set_fact 24134 1727096402.20256: variable 'omit' from source: magic vars 24134 1727096402.20308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096402.20346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096402.20396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
24134 1727096402.20404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096402.20422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096402.20453: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096402.20505: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096402.20508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096402.20589: Set connection var ansible_shell_executable to /bin/sh 24134 1727096402.20600: Set connection var ansible_pipelining to False 24134 1727096402.20618: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096402.20633: Set connection var ansible_timeout to 10 24134 1727096402.20641: Set connection var ansible_connection to ssh 24134 1727096402.20649: Set connection var ansible_shell_type to sh 24134 1727096402.20722: variable 'ansible_shell_executable' from source: unknown 24134 1727096402.20726: variable 'ansible_connection' from source: unknown 24134 1727096402.20729: variable 'ansible_module_compression' from source: unknown 24134 1727096402.20731: variable 'ansible_shell_type' from source: unknown 24134 1727096402.20733: variable 'ansible_shell_executable' from source: unknown 24134 1727096402.20735: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096402.20737: variable 'ansible_pipelining' from source: unknown 24134 1727096402.20739: variable 'ansible_timeout' from source: unknown 24134 1727096402.20741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096402.20883: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096402.20899: variable 'omit' from source: magic vars 24134 1727096402.20911: starting attempt loop 24134 1727096402.20942: running the handler 24134 1727096402.20945: handler run complete 24134 1727096402.20958: attempt loop complete, returning result 24134 1727096402.20965: _execute() done 24134 1727096402.21051: dumping result to json 24134 1727096402.21054: done dumping result, returning 24134 1727096402.21057: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0afff68d-5257-1673-d3fc-00000000027d] 24134 1727096402.21058: sending task result for task 0afff68d-5257-1673-d3fc-00000000027d 24134 1727096402.21124: done sending task result for task 0afff68d-5257-1673-d3fc-00000000027d 24134 1727096402.21127: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 24134 1727096402.21211: no more pending results, returning what we have 24134 1727096402.21215: results queue empty 24134 1727096402.21216: checking for any_errors_fatal 24134 1727096402.21224: done checking for any_errors_fatal 24134 1727096402.21224: checking for max_fail_percentage 24134 1727096402.21226: done checking for max_fail_percentage 24134 1727096402.21227: checking to see if all hosts have failed and the running result is not ok 24134 1727096402.21228: done checking to see if all hosts have failed 24134 1727096402.21229: getting the remaining hosts for this loop 24134 1727096402.21230: done getting the remaining hosts for this loop 24134 1727096402.21234: getting the next task for host managed_node1 24134 1727096402.21242: done getting next task for host managed_node1 24134 1727096402.21244: ^ task is: TASK: Show current_interfaces 24134 1727096402.21248: ^ state is: HOST STATE: block=2, task=5, 
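The `set_fact` result above returns its values under the `ansible_facts` key, which the variable manager then merges into the host's variables so later tasks (like the `debug` that follows) can reference `current_interfaces`. A rough sketch of that merge, with illustrative names rather than Ansible's internal API:

```python
# Minimal sketch (not Ansible internals): a set_fact task result carries an
# "ansible_facts" mapping, and those keys become host variables.
host_vars = {"ansible_host": "10.31.11.125"}  # address seen in this log

result = {
    "ansible_facts": {"current_interfaces": ["bonding_masters", "eth0", "lo"]},
    "changed": False,  # set_fact never changes the managed host
}

# Merge the new facts into the host's variable namespace.
host_vars.update(result["ansible_facts"])

print(host_vars["current_interfaces"])  # ['bonding_masters', 'eth0', 'lo']
```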
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096402.21252: getting variables 24134 1727096402.21253: in VariableManager get_vars() 24134 1727096402.21293: Calling all_inventory to load vars for managed_node1 24134 1727096402.21296: Calling groups_inventory to load vars for managed_node1 24134 1727096402.21299: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096402.21309: Calling all_plugins_play to load vars for managed_node1 24134 1727096402.21311: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096402.21314: Calling groups_plugins_play to load vars for managed_node1 24134 1727096402.21640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096402.21848: done with get_vars() 24134 1727096402.21857: done getting variables 24134 1727096402.21920: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] 
************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:00:02 -0400 (0:00:00.036) 0:00:06.432 ****** 24134 1727096402.21950: entering _queue_task() for managed_node1/debug 24134 1727096402.22278: worker is 1 (out of 1 available) 24134 1727096402.22288: exiting _queue_task() for managed_node1/debug 24134 1727096402.22297: done queuing things up, now waiting for results queue to drain 24134 1727096402.22298: waiting for pending results... 24134 1727096402.22481: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 24134 1727096402.22551: in run() - task 0afff68d-5257-1673-d3fc-000000000246 24134 1727096402.22578: variable 'ansible_search_path' from source: unknown 24134 1727096402.22633: variable 'ansible_search_path' from source: unknown 24134 1727096402.22637: calling self._execute() 24134 1727096402.22706: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096402.22718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096402.22731: variable 'omit' from source: magic vars 24134 1727096402.23095: variable 'ansible_distribution_major_version' from source: facts 24134 1727096402.23118: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096402.23130: variable 'omit' from source: magic vars 24134 1727096402.23229: variable 'omit' from source: magic vars 24134 1727096402.23290: variable 'current_interfaces' from source: set_fact 24134 1727096402.23322: variable 'omit' from source: magic vars 24134 1727096402.23371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096402.23414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096402.23437: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096402.23465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096402.23493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096402.23557: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096402.23560: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096402.23563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096402.23651: Set connection var ansible_shell_executable to /bin/sh 24134 1727096402.23684: Set connection var ansible_pipelining to False 24134 1727096402.23696: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096402.23711: Set connection var ansible_timeout to 10 24134 1727096402.23778: Set connection var ansible_connection to ssh 24134 1727096402.23781: Set connection var ansible_shell_type to sh 24134 1727096402.23783: variable 'ansible_shell_executable' from source: unknown 24134 1727096402.23786: variable 'ansible_connection' from source: unknown 24134 1727096402.23788: variable 'ansible_module_compression' from source: unknown 24134 1727096402.23790: variable 'ansible_shell_type' from source: unknown 24134 1727096402.23792: variable 'ansible_shell_executable' from source: unknown 24134 1727096402.23793: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096402.23795: variable 'ansible_pipelining' from source: unknown 24134 1727096402.23797: variable 'ansible_timeout' from source: unknown 24134 1727096402.23799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096402.23975: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096402.23979: variable 'omit' from source: magic vars 24134 1727096402.23981: starting attempt loop 24134 1727096402.23983: running the handler 24134 1727096402.24026: handler run complete 24134 1727096402.24044: attempt loop complete, returning result 24134 1727096402.24052: _execute() done 24134 1727096402.24059: dumping result to json 24134 1727096402.24101: done dumping result, returning 24134 1727096402.24105: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0afff68d-5257-1673-d3fc-000000000246] 24134 1727096402.24107: sending task result for task 0afff68d-5257-1673-d3fc-000000000246 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 24134 1727096402.24292: no more pending results, returning what we have 24134 1727096402.24296: results queue empty 24134 1727096402.24297: checking for any_errors_fatal 24134 1727096402.24302: done checking for any_errors_fatal 24134 1727096402.24302: checking for max_fail_percentage 24134 1727096402.24304: done checking for max_fail_percentage 24134 1727096402.24305: checking to see if all hosts have failed and the running result is not ok 24134 1727096402.24305: done checking to see if all hosts have failed 24134 1727096402.24306: getting the remaining hosts for this loop 24134 1727096402.24308: done getting the remaining hosts for this loop 24134 1727096402.24311: getting the next task for host managed_node1 24134 1727096402.24325: done getting next task for host managed_node1 24134 1727096402.24328: ^ task is: TASK: Install iproute 24134 1727096402.24331: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096402.24335: getting variables 24134 1727096402.24337: in VariableManager get_vars() 24134 1727096402.24474: Calling all_inventory to load vars for managed_node1 24134 1727096402.24477: Calling groups_inventory to load vars for managed_node1 24134 1727096402.24480: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096402.24486: done sending task result for task 0afff68d-5257-1673-d3fc-000000000246 24134 1727096402.24489: WORKER PROCESS EXITING 24134 1727096402.24497: Calling all_plugins_play to load vars for managed_node1 24134 1727096402.24499: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096402.24502: Calling groups_plugins_play to load vars for managed_node1 24134 1727096402.24745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096402.24954: done with get_vars() 24134 1727096402.24962: done getting variables 24134 1727096402.25019: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 09:00:02 -0400 (0:00:00.030) 0:00:06.463 ****** 24134 1727096402.25047: entering 
_queue_task() for managed_node1/package 24134 1727096402.25257: worker is 1 (out of 1 available) 24134 1727096402.25373: exiting _queue_task() for managed_node1/package 24134 1727096402.25382: done queuing things up, now waiting for results queue to drain 24134 1727096402.25383: waiting for pending results... 24134 1727096402.25524: running TaskExecutor() for managed_node1/TASK: Install iproute 24134 1727096402.25610: in run() - task 0afff68d-5257-1673-d3fc-0000000001b1 24134 1727096402.25632: variable 'ansible_search_path' from source: unknown 24134 1727096402.25638: variable 'ansible_search_path' from source: unknown 24134 1727096402.25673: calling self._execute() 24134 1727096402.25756: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096402.25772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096402.25787: variable 'omit' from source: magic vars 24134 1727096402.26127: variable 'ansible_distribution_major_version' from source: facts 24134 1727096402.26146: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096402.26172: variable 'omit' from source: magic vars 24134 1727096402.26277: variable 'omit' from source: magic vars 24134 1727096402.26414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096402.29301: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096402.29383: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096402.29427: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096402.29471: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096402.29510: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096402.29614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096402.29648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096402.29683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096402.29811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096402.29815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096402.29875: variable '__network_is_ostree' from source: set_fact 24134 1727096402.29919: variable 'omit' from source: magic vars 24134 1727096402.29922: variable 'omit' from source: magic vars 24134 1727096402.29950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096402.29985: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096402.30008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096402.30040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096402.30055: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096402.30136: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096402.30139: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096402.30145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096402.30211: Set connection var ansible_shell_executable to /bin/sh 24134 1727096402.30225: Set connection var ansible_pipelining to False 24134 1727096402.30234: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096402.30256: Set connection var ansible_timeout to 10 24134 1727096402.30262: Set connection var ansible_connection to ssh 24134 1727096402.30270: Set connection var ansible_shell_type to sh 24134 1727096402.30295: variable 'ansible_shell_executable' from source: unknown 24134 1727096402.30353: variable 'ansible_connection' from source: unknown 24134 1727096402.30355: variable 'ansible_module_compression' from source: unknown 24134 1727096402.30362: variable 'ansible_shell_type' from source: unknown 24134 1727096402.30364: variable 'ansible_shell_executable' from source: unknown 24134 1727096402.30365: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096402.30371: variable 'ansible_pipelining' from source: unknown 24134 1727096402.30373: variable 'ansible_timeout' from source: unknown 24134 1727096402.30375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096402.30434: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096402.30447: variable 'omit' from source: magic vars 24134 1727096402.30461: starting 
attempt loop 24134 1727096402.30475: running the handler 24134 1727096402.30486: variable 'ansible_facts' from source: unknown 24134 1727096402.30491: variable 'ansible_facts' from source: unknown 24134 1727096402.30530: _low_level_execute_command(): starting 24134 1727096402.30573: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096402.31207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096402.31230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096402.31253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096402.31286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096402.31339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096402.31342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096402.31390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096402.31414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096402.31426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096402.31538: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 4 <<< 24134 1727096402.33884: stdout chunk (state=3): >>>/root <<< 24134 1727096402.34102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096402.34106: stdout chunk (state=3): >>><<< 24134 1727096402.34116: stderr chunk (state=3): >>><<< 24134 1727096402.34217: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096402.34220: _low_level_execute_command(): starting 24134 1727096402.34225: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234 `" && echo ansible-tmp-1727096402.3413377-24515-234875018558234="` echo 
/root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234 `" ) && sleep 0' 24134 1727096402.34752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096402.34766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096402.34787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096402.34811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096402.34915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096402.34933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096402.34949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096402.35060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096402.37972: stdout chunk (state=3): >>>ansible-tmp-1727096402.3413377-24515-234875018558234=/root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234 <<< 24134 1727096402.38162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096402.38176: stdout chunk (state=3): >>><<< 24134 
1727096402.38187: stderr chunk (state=3): >>><<< 24134 1727096402.38212: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096402.3413377-24515-234875018558234=/root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096402.38245: variable 'ansible_module_compression' from source: unknown 24134 1727096402.38316: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 24134 1727096402.38324: ANSIBALLZ: Acquiring lock 24134 1727096402.38331: ANSIBALLZ: Lock acquired: 140085163806880 24134 1727096402.38338: ANSIBALLZ: Creating module 24134 1727096402.54454: ANSIBALLZ: Writing module into payload 24134 1727096402.54683: ANSIBALLZ: Writing module 24134 1727096402.54775: ANSIBALLZ: Renaming module 24134 1727096402.54779: ANSIBALLZ: Done creating module 24134 1727096402.54781: 
variable 'ansible_facts' from source: unknown 24134 1727096402.54860: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/AnsiballZ_dnf.py 24134 1727096402.55025: Sending initial data 24134 1727096402.55035: Sent initial data (152 bytes) 24134 1727096402.55780: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096402.55804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096402.55817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096402.55922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096402.58240: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24134 1727096402.58262: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096402.58377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096402.58477: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpp3ugkp11 /root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/AnsiballZ_dnf.py <<< 24134 1727096402.58482: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/AnsiballZ_dnf.py" <<< 24134 1727096402.58541: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpp3ugkp11" to remote "/root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/AnsiballZ_dnf.py" <<< 24134 1727096402.60053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096402.60057: stdout chunk (state=3): >>><<< 24134 1727096402.60059: stderr chunk (state=3): >>><<< 24134 1727096402.60078: done transferring module to remote 24134 1727096402.60093: _low_level_execute_command(): starting 24134 1727096402.60102: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/ /root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/AnsiballZ_dnf.py && sleep 0' 
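The low-level commands above follow a fixed pattern: create a uniquely named remote temp directory, sftp the AnsiballZ payload into it, `chmod` the directory and payload, then execute the payload with the remote Python. A hypothetical reconstruction of those command strings (the naming scheme is inferred from the `ansible-tmp-1727096402.3413377-24515-...` names visible in the log, not taken from Ansible's source):

```python
# Illustrative only: rebuild the wrapper commands this log shows Ansible
# issuing over the multiplexed SSH connection.
import random


def tmpdir_name(now: float, pid: int) -> str:
    # Names in the log look like ansible-tmp-<epoch>-<pid>-<random>.
    return f"ansible-tmp-{now}-{pid}-{random.randint(0, 2**48)}"


def chmod_and_run(tmpdir: str, module: str = "AnsiballZ_dnf.py") -> list[str]:
    """Return the chmod and execute steps, mirroring the log's two commands."""
    path = f"/root/.ansible/tmp/{tmpdir}"
    return [
        f"chmod u+x {path}/ {path}/{module} && sleep 0",
        f"/usr/bin/python3.12 {path}/{module} && sleep 0",
    ]
```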
24134 1727096402.61607: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096402.61893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096402.61921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096402.61937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096402.62005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096402.62298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096402.65091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096402.65095: stdout chunk (state=3): >>><<< 24134 1727096402.65097: stderr chunk (state=3): >>><<< 24134 1727096402.65113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096402.65125: _low_level_execute_command(): starting 24134 1727096402.65133: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/AnsiballZ_dnf.py && sleep 0' 24134 1727096402.65936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096402.65949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096402.65964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096402.65990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096402.66017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096402.66236: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096402.66274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096402.66356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096402.66390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096402.66471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096403.34570: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 24134 1727096403.41644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096403.41664: stdout chunk (state=3): >>><<< 24134 1727096403.41682: stderr chunk (state=3): >>><<< 24134 1727096403.41761: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096403.41772: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096403.41775: _low_level_execute_command(): starting 24134 1727096403.41777: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096402.3413377-24515-234875018558234/ > /dev/null 2>&1 && sleep 0' 24134 1727096403.42729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096403.43026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096403.43105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096403.45841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096403.45909: stdout chunk (state=3): >>><<< 24134 1727096403.45912: stderr chunk (state=3): >>><<< 24134 1727096403.46088: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from 
master 0 24134 1727096403.46091: handler run complete 24134 1727096403.46354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096403.46897: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096403.46900: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096403.46902: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096403.47012: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096403.47133: variable '__install_status' from source: unknown 24134 1727096403.47330: Evaluated conditional (__install_status is success): True 24134 1727096403.47333: attempt loop complete, returning result 24134 1727096403.47336: _execute() done 24134 1727096403.47338: dumping result to json 24134 1727096403.47340: done dumping result, returning 24134 1727096403.47342: done running TaskExecutor() for managed_node1/TASK: Install iproute [0afff68d-5257-1673-d3fc-0000000001b1] 24134 1727096403.47344: sending task result for task 0afff68d-5257-1673-d3fc-0000000001b1 24134 1727096403.47418: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001b1 24134 1727096403.47421: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 24134 1727096403.47516: no more pending results, returning what we have 24134 1727096403.47520: results queue empty 24134 1727096403.47521: checking for any_errors_fatal 24134 1727096403.47526: done checking for any_errors_fatal 24134 1727096403.47527: checking for max_fail_percentage 24134 1727096403.47528: done checking for max_fail_percentage 24134 1727096403.47529: checking to see if all hosts have failed and the running result is not ok 24134 1727096403.47530: done checking to 
see if all hosts have failed 24134 1727096403.47530: getting the remaining hosts for this loop 24134 1727096403.47532: done getting the remaining hosts for this loop 24134 1727096403.47536: getting the next task for host managed_node1 24134 1727096403.47541: done getting next task for host managed_node1 24134 1727096403.47543: ^ task is: TASK: Create veth interface {{ interface }} 24134 1727096403.47546: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096403.47550: getting variables 24134 1727096403.47551: in VariableManager get_vars() 24134 1727096403.47813: Calling all_inventory to load vars for managed_node1 24134 1727096403.47816: Calling groups_inventory to load vars for managed_node1 24134 1727096403.47819: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096403.47830: Calling all_plugins_play to load vars for managed_node1 24134 1727096403.47832: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096403.47835: Calling groups_plugins_play to load vars for managed_node1 24134 1727096403.48483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096403.48974: done with get_vars() 24134 1727096403.48985: done getting variables 24134 1727096403.49174: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24134 1727096403.49449: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 09:00:03 -0400 (0:00:01.244) 0:00:07.708 ****** 24134 1727096403.49485: entering _queue_task() for managed_node1/command 24134 1727096403.50058: worker is 1 (out of 1 available) 24134 1727096403.50175: exiting _queue_task() for managed_node1/command 24134 1727096403.50189: done queuing things up, now waiting for results queue to drain 24134 1727096403.50191: waiting for pending results... 24134 1727096403.50777: running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest0 24134 1727096403.50899: in run() - task 0afff68d-5257-1673-d3fc-0000000001b2 24134 1727096403.50912: variable 'ansible_search_path' from source: unknown 24134 1727096403.50917: variable 'ansible_search_path' from source: unknown 24134 1727096403.51309: variable 'interface' from source: set_fact 24134 1727096403.51441: variable 'interface' from source: set_fact 24134 1727096403.51611: variable 'interface' from source: set_fact 24134 1727096403.51647: Loaded config def from plugin (lookup/items) 24134 1727096403.51653: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 24134 1727096403.51675: variable 'omit' from source: magic vars 24134 1727096403.51878: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096403.51882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096403.51884: variable 'omit' from source: magic vars 24134 1727096403.52182: variable 
'ansible_distribution_major_version' from source: facts 24134 1727096403.52185: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096403.52247: variable 'type' from source: set_fact 24134 1727096403.52250: variable 'state' from source: include params 24134 1727096403.52262: variable 'interface' from source: set_fact 24134 1727096403.52265: variable 'current_interfaces' from source: set_fact 24134 1727096403.52271: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24134 1727096403.52274: variable 'omit' from source: magic vars 24134 1727096403.52309: variable 'omit' from source: magic vars 24134 1727096403.52450: variable 'item' from source: unknown 24134 1727096403.52453: variable 'item' from source: unknown 24134 1727096403.52456: variable 'omit' from source: magic vars 24134 1727096403.52498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096403.52553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096403.52557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096403.52559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096403.52562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096403.52661: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096403.52665: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096403.52671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096403.52685: Set connection var ansible_shell_executable to /bin/sh 24134 1727096403.52688: Set connection var ansible_pipelining to False 24134 
1727096403.52690: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096403.52700: Set connection var ansible_timeout to 10 24134 1727096403.52703: Set connection var ansible_connection to ssh 24134 1727096403.52706: Set connection var ansible_shell_type to sh 24134 1727096403.52774: variable 'ansible_shell_executable' from source: unknown 24134 1727096403.52777: variable 'ansible_connection' from source: unknown 24134 1727096403.52779: variable 'ansible_module_compression' from source: unknown 24134 1727096403.52781: variable 'ansible_shell_type' from source: unknown 24134 1727096403.52784: variable 'ansible_shell_executable' from source: unknown 24134 1727096403.52785: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096403.52787: variable 'ansible_pipelining' from source: unknown 24134 1727096403.52790: variable 'ansible_timeout' from source: unknown 24134 1727096403.52792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096403.52998: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096403.53002: variable 'omit' from source: magic vars 24134 1727096403.53005: starting attempt loop 24134 1727096403.53007: running the handler 24134 1727096403.53010: _low_level_execute_command(): starting 24134 1727096403.53012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096403.54409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096403.54413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096403.54416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096403.56082: stdout chunk (state=3): >>>/root <<< 24134 1727096403.56227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096403.56230: stdout chunk (state=3): >>><<< 24134 1727096403.56240: stderr chunk (state=3): >>><<< 24134 1727096403.56491: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096403.56506: _low_level_execute_command(): starting 24134 1727096403.56512: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601 `" && echo ansible-tmp-1727096403.564917-24559-262579166629601="` echo /root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601 `" ) && sleep 0' 24134 1727096403.58049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096403.58126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096403.58154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096403.58283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096403.58340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096403.58442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096403.58446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096403.58536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096403.60590: stdout chunk (state=3): >>>ansible-tmp-1727096403.564917-24559-262579166629601=/root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601 <<< 24134 1727096403.60804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096403.60808: stdout chunk (state=3): >>><<< 24134 1727096403.60812: stderr chunk (state=3): >>><<< 24134 1727096403.60819: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096403.564917-24559-262579166629601=/root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096403.60958: variable 'ansible_module_compression' from source: unknown 24134 1727096403.61090: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096403.61161: variable 'ansible_facts' from source: unknown 24134 1727096403.61462: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/AnsiballZ_command.py 24134 1727096403.61778: Sending initial data 24134 1727096403.61783: Sent initial data (155 bytes) 24134 1727096403.63163: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096403.63177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096403.63190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096403.63288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096403.63395: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096403.63437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096403.63502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096403.63589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 24134 1727096403.65899: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096403.65965: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096403.66034: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpnlnitl7h /root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/AnsiballZ_command.py <<< 24134 1727096403.66050: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/AnsiballZ_command.py" <<< 24134 1727096403.66113: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpnlnitl7h" to remote "/root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/AnsiballZ_command.py" <<< 24134 1727096403.66116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/AnsiballZ_command.py" <<< 24134 1727096403.67434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096403.67481: stderr chunk (state=3): >>><<< 24134 1727096403.67718: stdout chunk (state=3): >>><<< 24134 1727096403.67722: done transferring module to remote 24134 1727096403.67724: _low_level_execute_command(): starting 24134 1727096403.67726: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/ /root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/AnsiballZ_command.py && sleep 0' 24134 1727096403.68454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096403.68461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 
1727096403.68488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096403.68496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096403.68576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096403.68581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096403.68650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096403.71185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096403.71189: stdout chunk (state=3): >>><<< 24134 1727096403.71191: stderr chunk (state=3): >>><<< 24134 1727096403.71194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096403.71196: _low_level_execute_command(): starting 24134 1727096403.71198: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/AnsiballZ_command.py && sleep 0' 24134 1727096403.72016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096403.72020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096403.72027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096403.72030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096403.72032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096403.72105: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096403.72135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096403.72138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096403.72239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096403.95771: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-23 09:00:03.945832", "end": "2024-09-23 09:00:03.955624", "delta": "0:00:00.009792", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096404.03540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096404.03567: stderr chunk (state=3): >>><<< 24134 1727096404.03572: stdout chunk (state=3): >>><<< 24134 1727096404.03593: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-23 09:00:03.945832", "end": "2024-09-23 09:00:03.955624", "delta": "0:00:00.009792", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
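The JSON blob captured in the stdout above is the AnsiballZ command module's result payload. As a minimal sketch of what a reader (or callback) does with it — the parsing below is an illustration based only on the logged payload, not an Ansible API:

```python
# Hedged sketch: parse the command-module result JSON seen in the stdout
# chunk above. The payload text is copied from the log; treating rc == 0 as
# success and joining "cmd" for display are assumptions for illustration.
import json

raw = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
"cmd": ["ip", "link", "add", "ethtest0", "type", "veth",
"peer", "name", "peerethtest0"],
"start": "2024-09-23 09:00:03.945832", "end": "2024-09-23 09:00:03.955624",
"delta": "0:00:00.009792", "msg": ""}'''

result = json.loads(raw)
ok = result["rc"] == 0                  # task success
display_cmd = " ".join(result["cmd"])   # human-readable command line
```

Note that `changed` is `true` in the raw module result, even though the task summary later in the log reports `"changed": false` — the loop result is post-processed before display.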
24134 1727096404.03624: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096404.03631: _low_level_execute_command(): starting 24134 1727096404.03636: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096403.564917-24559-262579166629601/ > /dev/null 2>&1 && sleep 0' 24134 1727096404.04073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096404.04103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096404.04106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.04108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096404.04110: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 24134 1727096404.04112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.04178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.04181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.04182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.04252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096404.08820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.08846: stderr chunk (state=3): >>><<< 24134 1727096404.08849: stdout chunk (state=3): >>><<< 24134 1727096404.08863: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096404.08873: handler run complete 24134 1727096404.08893: Evaluated conditional (False): False 24134 1727096404.08901: attempt loop complete, returning result 24134 1727096404.08915: variable 'item' from source: unknown 24134 1727096404.08979: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.009792", "end": "2024-09-23 09:00:03.955624", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-23 09:00:03.945832" } 24134 1727096404.09147: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096404.09150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096404.09152: variable 'omit' from source: magic vars 24134 1727096404.09222: variable 'ansible_distribution_major_version' from source: facts 24134 1727096404.09226: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096404.09346: variable 'type' from source: set_fact 24134 1727096404.09350: variable 'state' from source: include params 24134 1727096404.09353: variable 'interface' from source: set_fact 24134 1727096404.09357: variable 'current_interfaces' from source: set_fact 24134 1727096404.09363: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24134 1727096404.09372: variable 'omit' from source: magic vars 24134 1727096404.09384: variable 'omit' from source: magic vars 24134 1727096404.09411: variable 'item' from source: unknown 24134 1727096404.09453: variable 
'item' from source: unknown 24134 1727096404.09464: variable 'omit' from source: magic vars 24134 1727096404.09490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096404.09500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096404.09503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096404.09509: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096404.09511: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096404.09514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096404.09562: Set connection var ansible_shell_executable to /bin/sh 24134 1727096404.09566: Set connection var ansible_pipelining to False 24134 1727096404.09573: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096404.09581: Set connection var ansible_timeout to 10 24134 1727096404.09584: Set connection var ansible_connection to ssh 24134 1727096404.09586: Set connection var ansible_shell_type to sh 24134 1727096404.09603: variable 'ansible_shell_executable' from source: unknown 24134 1727096404.09607: variable 'ansible_connection' from source: unknown 24134 1727096404.09609: variable 'ansible_module_compression' from source: unknown 24134 1727096404.09611: variable 'ansible_shell_type' from source: unknown 24134 1727096404.09613: variable 'ansible_shell_executable' from source: unknown 24134 1727096404.09615: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096404.09618: variable 'ansible_pipelining' from source: unknown 24134 1727096404.09620: variable 'ansible_timeout' from source: unknown 24134 1727096404.09622: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096404.09688: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096404.09696: variable 'omit' from source: magic vars 24134 1727096404.09699: starting attempt loop 24134 1727096404.09703: running the handler 24134 1727096404.09711: _low_level_execute_command(): starting 24134 1727096404.09713: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096404.10154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096404.10188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096404.10196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.10198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.10249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.10252: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.10254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.10336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24134 1727096404.12845: stdout chunk (state=3): >>>/root <<< 24134 1727096404.13053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.13057: stdout chunk (state=3): >>><<< 24134 1727096404.13059: stderr chunk (state=3): >>><<< 24134 1727096404.13062: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24134 1727096404.13064: _low_level_execute_command(): starting 24134 1727096404.13066: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076 `" && echo ansible-tmp-1727096404.1298654-24559-215761131524076="` echo /root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076 `" ) && sleep 0' 24134 1727096404.13630: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096404.13633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096404.13676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.13685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.13697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.13796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.16652: stdout chunk (state=3): >>>ansible-tmp-1727096404.1298654-24559-215761131524076=/root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076 <<< 24134 1727096404.16687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 
1727096404.16719: stdout chunk (state=3): >>><<< 24134 1727096404.16725: stderr chunk (state=3): >>><<< 24134 1727096404.16880: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096404.1298654-24559-215761131524076=/root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096404.16884: variable 'ansible_module_compression' from source: unknown 24134 1727096404.16886: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096404.16888: variable 'ansible_facts' from source: unknown 24134 1727096404.16985: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/AnsiballZ_command.py 24134 1727096404.17248: Sending initial data 24134 
1727096404.17251: Sent initial data (156 bytes) 24134 1727096404.18396: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.18447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.18503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.18531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.18696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.21091: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096404.21189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096404.21291: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpf_i13wf6 /root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/AnsiballZ_command.py <<< 24134 1727096404.21295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/AnsiballZ_command.py" <<< 24134 1727096404.21385: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpf_i13wf6" to remote "/root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/AnsiballZ_command.py" <<< 24134 1727096404.23879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.23883: stdout chunk (state=3): >>><<< 24134 1727096404.23885: stderr chunk (state=3): >>><<< 24134 1727096404.23963: done transferring module to remote 24134 1727096404.24078: _low_level_execute_command(): starting 24134 1727096404.24082: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/ /root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/AnsiballZ_command.py && sleep 0' 24134 1727096404.25337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096404.25350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 
1727096404.25366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096404.25544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.25600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.25688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.25760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.27809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.28039: stderr chunk (state=3): >>><<< 24134 1727096404.28043: stdout chunk (state=3): >>><<< 24134 1727096404.28045: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096404.28048: _low_level_execute_command(): starting 24134 1727096404.28050: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/AnsiballZ_command.py && sleep 0' 24134 1727096404.29516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096404.29598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096404.29812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.30162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.30284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.30433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.30578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.46880: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-23 09:00:04.461400", "end": "2024-09-23 09:00:04.465315", "delta": "0:00:00.003915", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096404.48702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096404.48727: stderr chunk (state=3): >>><<< 24134 1727096404.48730: stdout chunk (state=3): >>><<< 24134 1727096404.48752: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-23 09:00:04.461400", "end": "2024-09-23 09:00:04.465315", "delta": "0:00:00.003915", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
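The command-module result above reports `start`, `end`, and `delta` timestamps for the `ip link set peerethtest0 up` run. A minimal sketch (plain Python, not Ansible internals) showing that `delta` is simply `end - start` in the module's `%Y-%m-%d %H:%M:%S.%f` timestamp format:

```python
from datetime import datetime

# Timestamp format used by the command module's start/end fields above
FMT = "%Y-%m-%d %H:%M:%S.%f"

# Values copied from the result JSON in this log
result = {
    "start": "2024-09-23 09:00:04.461400",
    "end": "2024-09-23 09:00:04.465315",
    "delta": "0:00:00.003915",
}

# Recompute the delta from start/end and compare with the reported value
delta = datetime.strptime(result["end"], FMT) - datetime.strptime(result["start"], FMT)
print(str(delta))  # -> 0:00:00.003915, matching result["delta"]
```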
24134 1727096404.48801: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096404.48904: _low_level_execute_command(): starting 24134 1727096404.48914: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096404.1298654-24559-215761131524076/ > /dev/null 2>&1 && sleep 0' 24134 1727096404.50164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.50385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.50485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.52552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.52555: stdout chunk (state=3): >>><<< 24134 1727096404.52561: stderr chunk (state=3): >>><<< 24134 1727096404.52582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096404.52587: handler run complete 24134 1727096404.52608: Evaluated conditional (False): False 24134 1727096404.52621: attempt loop complete, returning 
result
24134 1727096404.52637: variable 'item' from source: unknown
24134 1727096404.52781: variable 'item' from source: unknown
ok: [managed_node1] => (item=ip link set peerethtest0 up) => {
    "ansible_loop_var": "item",
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "set",
        "peerethtest0",
        "up"
    ],
    "delta": "0:00:00.003915",
    "end": "2024-09-23 09:00:04.465315",
    "item": "ip link set peerethtest0 up",
    "rc": 0,
    "start": "2024-09-23 09:00:04.461400"
}
24134 1727096404.53203: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096404.53207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096404.53211: variable 'omit' from source: magic vars
24134 1727096404.53705: variable 'ansible_distribution_major_version' from source: facts
24134 1727096404.53708: Evaluated conditional (ansible_distribution_major_version != '6'): True
24134 1727096404.53857: variable 'type' from source: set_fact
24134 1727096404.53860: variable 'state' from source: include params
24134 1727096404.53863: variable 'interface' from source: set_fact
24134 1727096404.53870: variable 'current_interfaces' from source: set_fact
24134 1727096404.54284: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True
24134 1727096404.54287: variable 'omit' from source: magic vars
24134 1727096404.54290: variable 'omit' from source: magic vars
24134 1727096404.54292: variable 'item' from source: unknown
24134 1727096404.54499: variable 'item' from source: unknown
24134 1727096404.54514: variable 'omit' from source: magic vars
24134 1727096404.54531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24134 1727096404.54538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24134 1727096404.54545: Loading ShellModule 'sh' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096404.54717: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096404.54720: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096404.54729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096404.55045: Set connection var ansible_shell_executable to /bin/sh 24134 1727096404.55049: Set connection var ansible_pipelining to False 24134 1727096404.55051: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096404.55053: Set connection var ansible_timeout to 10 24134 1727096404.55057: Set connection var ansible_connection to ssh 24134 1727096404.55059: Set connection var ansible_shell_type to sh 24134 1727096404.55061: variable 'ansible_shell_executable' from source: unknown 24134 1727096404.55097: variable 'ansible_connection' from source: unknown 24134 1727096404.55100: variable 'ansible_module_compression' from source: unknown 24134 1727096404.55103: variable 'ansible_shell_type' from source: unknown 24134 1727096404.55105: variable 'ansible_shell_executable' from source: unknown 24134 1727096404.55107: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096404.55112: variable 'ansible_pipelining' from source: unknown 24134 1727096404.55114: variable 'ansible_timeout' from source: unknown 24134 1727096404.55118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096404.55577: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096404.55581: variable 'omit' from source: magic vars 24134 1727096404.55583: starting 
attempt loop 24134 1727096404.55585: running the handler 24134 1727096404.55591: _low_level_execute_command(): starting 24134 1727096404.55596: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096404.57164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096404.57179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096404.57483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.57566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.57577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.57593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.57983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.59522: stdout chunk (state=3): >>>/root <<< 24134 1727096404.59710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.59726: stderr chunk (state=3): >>><<< 24134 1727096404.59730: stdout chunk (state=3): >>><<< 
24134 1727096404.59747: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096404.59755: _low_level_execute_command(): starting 24134 1727096404.59759: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469 `" && echo ansible-tmp-1727096404.5974593-24559-72812416181469="` echo /root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469 `" ) && sleep 0' 24134 1727096404.60670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.60674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.60684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.60702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.60854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.62887: stdout chunk (state=3): >>>ansible-tmp-1727096404.5974593-24559-72812416181469=/root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469 <<< 24134 1727096404.63106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.63109: stdout chunk (state=3): >>><<< 24134 1727096404.63111: stderr chunk (state=3): >>><<< 24134 1727096404.63114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096404.5974593-24559-72812416181469=/root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096404.63284: variable 'ansible_module_compression' from source: unknown 24134 1727096404.63287: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096404.63290: variable 'ansible_facts' from source: unknown 24134 1727096404.63292: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/AnsiballZ_command.py 24134 1727096404.63404: Sending initial data 24134 1727096404.63407: Sent initial data (155 bytes) 24134 1727096404.64084: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.64135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.64152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.64176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.64359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.66074: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096404.66165: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096404.66263: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpo5q_53x6 /root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/AnsiballZ_command.py <<< 24134 1727096404.66271: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/AnsiballZ_command.py" <<< 24134 1727096404.66355: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpo5q_53x6" to remote "/root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/AnsiballZ_command.py" <<< 24134 1727096404.67513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.67517: stdout chunk (state=3): >>><<< 24134 1727096404.67520: stderr chunk (state=3): >>><<< 24134 1727096404.67690: done transferring module to remote 24134 1727096404.67694: _low_level_execute_command(): starting 24134 1727096404.67696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/ /root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/AnsiballZ_command.py && sleep 0' 24134 1727096404.68930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096404.68934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.68959: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096404.68966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.69029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.69037: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.69184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.69244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.71187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.71375: stderr chunk (state=3): >>><<< 24134 1727096404.71380: stdout chunk (state=3): >>><<< 24134 1727096404.71383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096404.71385: _low_level_execute_command(): starting 24134 1727096404.71388: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/AnsiballZ_command.py && sleep 0' 24134 1727096404.71885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096404.71895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096404.71905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096404.71937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096404.71984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096404.72045: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.72058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.72086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.72188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.88918: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-23 09:00:04.881171", "end": "2024-09-23 09:00:04.885199", "delta": "0:00:00.004028", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096404.90425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.90452: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096404.90520: stderr chunk (state=3): >>><<< 24134 1727096404.90532: stdout chunk (state=3): >>><<< 24134 1727096404.90578: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-23 09:00:04.881171", "end": "2024-09-23 09:00:04.885199", "delta": "0:00:00.004028", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
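At this point both loop items (`ip link set peerethtest0 up` and `ip link set ethtest0 up`) have returned `rc: 0`. A small sketch (a hypothetical helper, not Ansible code) of scanning such per-item results for failures, using only the fields printed in this log:

```python
# Per-item results as reported in this log, trimmed to the essential fields
results = [
    {"item": "ip link set peerethtest0 up", "rc": 0, "changed": False},
    {"item": "ip link set ethtest0 up", "rc": 0, "changed": False},
]

def failed_items(results):
    """Return the loop items whose command exited non-zero."""
    return [r["item"] for r in results if r["rc"] != 0]

print(failed_items(results))  # -> [] : every `ip link set ... up` succeeded
```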
24134 1727096404.90623: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096404.90660: _low_level_execute_command(): starting 24134 1727096404.90674: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096404.5974593-24559-72812416181469/ > /dev/null 2>&1 && sleep 0' 24134 1727096404.91663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096404.91694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096404.91712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096404.91818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096404.93785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096404.93818: stdout chunk (state=3): >>><<< 24134 1727096404.93821: stderr chunk (state=3): >>><<< 24134 1727096404.93975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096404.93979: handler run complete 24134 1727096404.93981: Evaluated conditional (False): False 
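Each remote module execution in this log is bracketed by a per-task temp directory: created with `umask 77 && mkdir -p ...` and removed afterwards with `rm -f -r ... && sleep 0`, as just completed above. A minimal local sketch of that create-owner-only/remove lifecycle (the paths are illustrative stand-ins, not Ansible's actual naming scheme):

```python
import os
import shutil
import tempfile

# Illustrative stand-in for /root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<random>
base = tempfile.mkdtemp()
task_tmp = os.path.join(base, "ansible-tmp-example")

old = os.umask(0o077)      # equivalent of `umask 77` in the shell one-liner
try:
    os.makedirs(task_tmp)  # `mkdir -p`: directory ends up owner-only (0700)
finally:
    os.umask(old)          # restore the previous umask

mode = os.stat(task_tmp).st_mode & 0o777  # expect 0o700 because of the umask
shutil.rmtree(base)        # equivalent of the trailing `rm -f -r ...` cleanup
```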
24134 1727096404.93983: attempt loop complete, returning result 24134 1727096404.93985: variable 'item' from source: unknown 24134 1727096404.94000: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.004028", "end": "2024-09-23 09:00:04.885199", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-23 09:00:04.881171" } 24134 1727096404.94311: dumping result to json 24134 1727096404.94314: done dumping result, returning 24134 1727096404.94316: done running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest0 [0afff68d-5257-1673-d3fc-0000000001b2] 24134 1727096404.94319: sending task result for task 0afff68d-5257-1673-d3fc-0000000001b2 24134 1727096404.94549: no more pending results, returning what we have 24134 1727096404.94552: results queue empty 24134 1727096404.94554: checking for any_errors_fatal 24134 1727096404.94563: done checking for any_errors_fatal 24134 1727096404.94564: checking for max_fail_percentage 24134 1727096404.94566: done checking for max_fail_percentage 24134 1727096404.94567: checking to see if all hosts have failed and the running result is not ok 24134 1727096404.94571: done checking to see if all hosts have failed 24134 1727096404.94572: getting the remaining hosts for this loop 24134 1727096404.94573: done getting the remaining hosts for this loop 24134 1727096404.94577: getting the next task for host managed_node1 24134 1727096404.94583: done getting next task for host managed_node1 24134 1727096404.94586: ^ task is: TASK: Set up veth as managed by NetworkManager 24134 1727096404.94589: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096404.94594: getting variables 24134 1727096404.94595: in VariableManager get_vars() 24134 1727096404.94626: Calling all_inventory to load vars for managed_node1 24134 1727096404.94629: Calling groups_inventory to load vars for managed_node1 24134 1727096404.94632: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096404.94644: Calling all_plugins_play to load vars for managed_node1 24134 1727096404.94647: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096404.94651: Calling groups_plugins_play to load vars for managed_node1 24134 1727096404.95292: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001b2 24134 1727096404.95296: WORKER PROCESS EXITING 24134 1727096404.95963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096404.96624: done with get_vars() 24134 1727096404.96666: done getting variables 24134 1727096404.96855: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 09:00:04 -0400 (0:00:01.474) 0:00:09.182 ****** 24134 1727096404.96947: entering 
_queue_task() for managed_node1/command 24134 1727096404.97481: worker is 1 (out of 1 available) 24134 1727096404.97492: exiting _queue_task() for managed_node1/command 24134 1727096404.97505: done queuing things up, now waiting for results queue to drain 24134 1727096404.97506: waiting for pending results... 24134 1727096404.97885: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 24134 1727096404.98059: in run() - task 0afff68d-5257-1673-d3fc-0000000001b3 24134 1727096404.98084: variable 'ansible_search_path' from source: unknown 24134 1727096404.98092: variable 'ansible_search_path' from source: unknown 24134 1727096404.98139: calling self._execute() 24134 1727096404.98277: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096404.98337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096404.98340: variable 'omit' from source: magic vars 24134 1727096404.98713: variable 'ansible_distribution_major_version' from source: facts 24134 1727096404.98731: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096404.98975: variable 'type' from source: set_fact 24134 1727096404.98979: variable 'state' from source: include params 24134 1727096404.98982: Evaluated conditional (type == 'veth' and state == 'present'): True 24134 1727096404.98990: variable 'omit' from source: magic vars 24134 1727096404.98993: variable 'omit' from source: magic vars 24134 1727096404.99127: variable 'interface' from source: set_fact 24134 1727096404.99154: variable 'omit' from source: magic vars 24134 1727096404.99207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096404.99320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096404.99324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 
1727096404.99335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096404.99351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096404.99393: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096404.99402: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096404.99409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096404.99573: Set connection var ansible_shell_executable to /bin/sh 24134 1727096404.99577: Set connection var ansible_pipelining to False 24134 1727096404.99580: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096404.99582: Set connection var ansible_timeout to 10 24134 1727096404.99584: Set connection var ansible_connection to ssh 24134 1727096404.99586: Set connection var ansible_shell_type to sh 24134 1727096404.99646: variable 'ansible_shell_executable' from source: unknown 24134 1727096404.99648: variable 'ansible_connection' from source: unknown 24134 1727096404.99651: variable 'ansible_module_compression' from source: unknown 24134 1727096404.99653: variable 'ansible_shell_type' from source: unknown 24134 1727096404.99655: variable 'ansible_shell_executable' from source: unknown 24134 1727096404.99657: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096404.99659: variable 'ansible_pipelining' from source: unknown 24134 1727096404.99661: variable 'ansible_timeout' from source: unknown 24134 1727096404.99663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096404.99862: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096404.99866: variable 'omit' from source: magic vars 24134 1727096404.99872: starting attempt loop 24134 1727096404.99875: running the handler 24134 1727096404.99877: _low_level_execute_command(): starting 24134 1727096404.99887: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096405.01826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096405.01993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096405.01998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.02099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.03843: stdout chunk (state=3): >>>/root <<< 24134 1727096405.03986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
24134 1727096405.04031: stderr chunk (state=3): >>><<< 24134 1727096405.04114: stdout chunk (state=3): >>><<< 24134 1727096405.04196: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096405.04199: _low_level_execute_command(): starting 24134 1727096405.04202: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934 `" && echo ansible-tmp-1727096405.0415208-24641-72449939002934="` echo /root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934 `" ) && sleep 0' 24134 1727096405.04900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096405.05004: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.05017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096405.05023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096405.05036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.05135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.07247: stdout chunk (state=3): >>>ansible-tmp-1727096405.0415208-24641-72449939002934=/root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934 <<< 24134 1727096405.07501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096405.07505: stdout chunk (state=3): >>><<< 24134 1727096405.07507: stderr chunk (state=3): >>><<< 24134 1727096405.07694: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096405.0415208-24641-72449939002934=/root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096405.07698: variable 'ansible_module_compression' from source: unknown 24134 1727096405.07701: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096405.07715: variable 'ansible_facts' from source: unknown 24134 1727096405.07890: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/AnsiballZ_command.py 24134 1727096405.08126: Sending initial data 24134 1727096405.08129: Sent initial data (155 bytes) 24134 1727096405.09218: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096405.09312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.09414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.11174: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096405.11262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096405.11367: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpkulyd8d0 /root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/AnsiballZ_command.py <<< 24134 1727096405.11377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/AnsiballZ_command.py" <<< 24134 1727096405.11424: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpkulyd8d0" to remote "/root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/AnsiballZ_command.py" <<< 24134 1727096405.12159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096405.12256: stderr chunk (state=3): >>><<< 24134 1727096405.12260: stdout chunk (state=3): >>><<< 24134 1727096405.12262: done transferring module to remote 24134 1727096405.12264: _low_level_execute_command(): starting 24134 1727096405.12267: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/ /root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/AnsiballZ_command.py && sleep 0' 24134 1727096405.12746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.12749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096405.12753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.12815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096405.12818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.12935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.14975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096405.14979: stdout chunk (state=3): >>><<< 24134 1727096405.14981: stderr chunk (state=3): >>><<< 24134 1727096405.14984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096405.14991: _low_level_execute_command(): starting 24134 1727096405.14993: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/AnsiballZ_command.py && sleep 0' 24134 1727096405.15577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096405.15581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096405.15593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096405.15615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096405.15627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096405.15685: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.15773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 
1727096405.15786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096405.15788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.15935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.34108: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-23 09:00:05.316519", "end": "2024-09-23 09:00:05.336074", "delta": "0:00:00.019555", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096405.35688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096405.35734: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096405.35738: stdout chunk (state=3): >>><<< 24134 1727096405.35744: stderr chunk (state=3): >>><<< 24134 1727096405.35764: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-23 09:00:05.316519", "end": "2024-09-23 09:00:05.336074", "delta": "0:00:00.019555", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096405.35806: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096405.35813: _low_level_execute_command(): starting 24134 1727096405.35818: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096405.0415208-24641-72449939002934/ > /dev/null 2>&1 && sleep 0' 24134 1727096405.37185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096405.37301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096405.37323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.37416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.39397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096405.39401: stdout chunk (state=3): >>><<< 24134 1727096405.39408: stderr chunk (state=3): >>><<< 24134 1727096405.39427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096405.39435: handler run complete 24134 1727096405.39457: Evaluated conditional (False): False 
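The task at manage_test_interface.yml:35, whose result is printed just below, can be sketched as follows. This is a hypothetical reconstruction grounded in what the log records: the executed command (`nmcli d set ethtest0 managed true`) and the evaluated conditional (`type == 'veth' and state == 'present'`); the actual file may differ.

```yaml
# Hypothetical sketch of the task at
# tests/network/playbooks/tasks/manage_test_interface.yml:35.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
  when: type == 'veth' and state == 'present'
```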
24134 1727096405.39471: attempt loop complete, returning result 24134 1727096405.39474: _execute() done 24134 1727096405.39477: dumping result to json 24134 1727096405.39479: done dumping result, returning 24134 1727096405.39489: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-1673-d3fc-0000000001b3] 24134 1727096405.39494: sending task result for task 0afff68d-5257-1673-d3fc-0000000001b3 24134 1727096405.39775: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001b3 24134 1727096405.39778: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.019555", "end": "2024-09-23 09:00:05.336074", "rc": 0, "start": "2024-09-23 09:00:05.316519" } 24134 1727096405.39834: no more pending results, returning what we have 24134 1727096405.39836: results queue empty 24134 1727096405.39837: checking for any_errors_fatal 24134 1727096405.39850: done checking for any_errors_fatal 24134 1727096405.39851: checking for max_fail_percentage 24134 1727096405.39853: done checking for max_fail_percentage 24134 1727096405.39853: checking to see if all hosts have failed and the running result is not ok 24134 1727096405.39854: done checking to see if all hosts have failed 24134 1727096405.39855: getting the remaining hosts for this loop 24134 1727096405.39856: done getting the remaining hosts for this loop 24134 1727096405.39859: getting the next task for host managed_node1 24134 1727096405.39864: done getting next task for host managed_node1 24134 1727096405.39871: ^ task is: TASK: Delete veth interface {{ interface }} 24134 1727096405.39874: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096405.39878: getting variables 24134 1727096405.39879: in VariableManager get_vars() 24134 1727096405.39914: Calling all_inventory to load vars for managed_node1 24134 1727096405.39918: Calling groups_inventory to load vars for managed_node1 24134 1727096405.39921: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.39930: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.39933: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.39935: Calling groups_plugins_play to load vars for managed_node1 24134 1727096405.40109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.40339: done with get_vars() 24134 1727096405.40350: done getting variables 24134 1727096405.40425: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24134 1727096405.40546: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 09:00:05 -0400 (0:00:00.436) 0:00:09.619 ****** 24134 1727096405.40581: entering _queue_task() for managed_node1/command 24134 1727096405.40841: worker is 1 
(out of 1 available) 24134 1727096405.40854: exiting _queue_task() for managed_node1/command 24134 1727096405.40874: done queuing things up, now waiting for results queue to drain 24134 1727096405.40876: waiting for pending results... 24134 1727096405.41520: running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest0 24134 1727096405.41693: in run() - task 0afff68d-5257-1673-d3fc-0000000001b4 24134 1727096405.41891: variable 'ansible_search_path' from source: unknown 24134 1727096405.41900: variable 'ansible_search_path' from source: unknown 24134 1727096405.41938: calling self._execute() 24134 1727096405.42375: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.42380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.42382: variable 'omit' from source: magic vars 24134 1727096405.42879: variable 'ansible_distribution_major_version' from source: facts 24134 1727096405.43063: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096405.43305: variable 'type' from source: set_fact 24134 1727096405.43317: variable 'state' from source: include params 24134 1727096405.43325: variable 'interface' from source: set_fact 24134 1727096405.43333: variable 'current_interfaces' from source: set_fact 24134 1727096405.43345: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 24134 1727096405.43351: when evaluation is False, skipping this task 24134 1727096405.43358: _execute() done 24134 1727096405.43365: dumping result to json 24134 1727096405.43379: done dumping result, returning 24134 1727096405.43393: done running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest0 [0afff68d-5257-1673-d3fc-0000000001b4] 24134 1727096405.43408: sending task result for task 0afff68d-5257-1673-d3fc-0000000001b4 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state 
== 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 24134 1727096405.43559: no more pending results, returning what we have 24134 1727096405.43563: results queue empty 24134 1727096405.43564: checking for any_errors_fatal 24134 1727096405.43576: done checking for any_errors_fatal 24134 1727096405.43577: checking for max_fail_percentage 24134 1727096405.43579: done checking for max_fail_percentage 24134 1727096405.43580: checking to see if all hosts have failed and the running result is not ok 24134 1727096405.43581: done checking to see if all hosts have failed 24134 1727096405.43581: getting the remaining hosts for this loop 24134 1727096405.43583: done getting the remaining hosts for this loop 24134 1727096405.43587: getting the next task for host managed_node1 24134 1727096405.43593: done getting next task for host managed_node1 24134 1727096405.43595: ^ task is: TASK: Create dummy interface {{ interface }} 24134 1727096405.43599: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096405.43604: getting variables 24134 1727096405.43605: in VariableManager get_vars() 24134 1727096405.43646: Calling all_inventory to load vars for managed_node1 24134 1727096405.43649: Calling groups_inventory to load vars for managed_node1 24134 1727096405.43651: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.43664: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.43666: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.43674: Calling groups_plugins_play to load vars for managed_node1 24134 1727096405.44135: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001b4 24134 1727096405.44139: WORKER PROCESS EXITING 24134 1727096405.44163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.44418: done with get_vars() 24134 1727096405.44456: done getting variables 24134 1727096405.44519: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24134 1727096405.44632: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 09:00:05 -0400 (0:00:00.040) 0:00:09.660 ****** 24134 1727096405.44664: entering _queue_task() for managed_node1/command 24134 1727096405.44943: worker is 1 (out of 1 available) 24134 1727096405.44955: exiting _queue_task() for managed_node1/command 24134 1727096405.44966: done queuing things up, now waiting for results queue to drain 24134 1727096405.44970: waiting for pending results... 
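The `false_condition` reported above (`type == 'veth' and state == 'absent' and interface in current_interfaces`) suggests a guarded task of roughly this shape. Only the templated task name and the `when` expression are taken from the log; the `command` line is a hypothetical reconstruction:

```yaml
- name: "Delete veth interface {{ interface }}"
  command: ip link del {{ interface }}  # hypothetical; the real command is not shown in the log
  when:
    - type == 'veth'
    - state == 'absent'
    - interface in current_interfaces
```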
24134 1727096405.45248: running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest0 24134 1727096405.45483: in run() - task 0afff68d-5257-1673-d3fc-0000000001b5 24134 1727096405.45671: variable 'ansible_search_path' from source: unknown 24134 1727096405.45674: variable 'ansible_search_path' from source: unknown 24134 1727096405.45677: calling self._execute() 24134 1727096405.45819: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.45886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.45906: variable 'omit' from source: magic vars 24134 1727096405.46607: variable 'ansible_distribution_major_version' from source: facts 24134 1727096405.46684: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096405.47214: variable 'type' from source: set_fact 24134 1727096405.47217: variable 'state' from source: include params 24134 1727096405.47219: variable 'interface' from source: set_fact 24134 1727096405.47221: variable 'current_interfaces' from source: set_fact 24134 1727096405.47224: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 24134 1727096405.47225: when evaluation is False, skipping this task 24134 1727096405.47227: _execute() done 24134 1727096405.47229: dumping result to json 24134 1727096405.47231: done dumping result, returning 24134 1727096405.47233: done running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest0 [0afff68d-5257-1673-d3fc-0000000001b5] 24134 1727096405.47235: sending task result for task 0afff68d-5257-1673-d3fc-0000000001b5 24134 1727096405.47497: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001b5 24134 1727096405.47501: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 24134 1727096405.47565: no more pending results, returning what we have 24134 1727096405.47574: results queue empty 24134 1727096405.47576: checking for any_errors_fatal 24134 1727096405.47584: done checking for any_errors_fatal 24134 1727096405.47585: checking for max_fail_percentage 24134 1727096405.47587: done checking for max_fail_percentage 24134 1727096405.47588: checking to see if all hosts have failed and the running result is not ok 24134 1727096405.47589: done checking to see if all hosts have failed 24134 1727096405.47590: getting the remaining hosts for this loop 24134 1727096405.47591: done getting the remaining hosts for this loop 24134 1727096405.47595: getting the next task for host managed_node1 24134 1727096405.47602: done getting next task for host managed_node1 24134 1727096405.47605: ^ task is: TASK: Delete dummy interface {{ interface }} 24134 1727096405.47609: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096405.47614: getting variables 24134 1727096405.47616: in VariableManager get_vars() 24134 1727096405.47655: Calling all_inventory to load vars for managed_node1 24134 1727096405.47658: Calling groups_inventory to load vars for managed_node1 24134 1727096405.47660: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.47933: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.47937: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.47940: Calling groups_plugins_play to load vars for managed_node1 24134 1727096405.48463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.49081: done with get_vars() 24134 1727096405.49091: done getting variables 24134 1727096405.49146: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24134 1727096405.49476: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 09:00:05 -0400 (0:00:00.048) 0:00:09.708 ****** 24134 1727096405.49506: entering _queue_task() for managed_node1/command 24134 1727096405.49899: worker is 1 (out of 1 available) 24134 1727096405.49912: exiting _queue_task() for managed_node1/command 24134 1727096405.49923: done queuing things up, now waiting for results queue to drain 24134 1727096405.49924: waiting for pending results... 
24134 1727096405.50517: running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest0 24134 1727096405.50559: in run() - task 0afff68d-5257-1673-d3fc-0000000001b6 24134 1727096405.50634: variable 'ansible_search_path' from source: unknown 24134 1727096405.50638: variable 'ansible_search_path' from source: unknown 24134 1727096405.50828: calling self._execute() 24134 1727096405.50883: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.50959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.51045: variable 'omit' from source: magic vars 24134 1727096405.51811: variable 'ansible_distribution_major_version' from source: facts 24134 1727096405.51814: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096405.52209: variable 'type' from source: set_fact 24134 1727096405.52264: variable 'state' from source: include params 24134 1727096405.52278: variable 'interface' from source: set_fact 24134 1727096405.52287: variable 'current_interfaces' from source: set_fact 24134 1727096405.52475: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 24134 1727096405.52479: when evaluation is False, skipping this task 24134 1727096405.52482: _execute() done 24134 1727096405.52484: dumping result to json 24134 1727096405.52487: done dumping result, returning 24134 1727096405.52489: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest0 [0afff68d-5257-1673-d3fc-0000000001b6] 24134 1727096405.52491: sending task result for task 0afff68d-5257-1673-d3fc-0000000001b6 24134 1727096405.52556: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001b6 24134 1727096405.52559: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 24134 1727096405.52629: no more pending results, returning what we have 24134 1727096405.52632: results queue empty 24134 1727096405.52634: checking for any_errors_fatal 24134 1727096405.52641: done checking for any_errors_fatal 24134 1727096405.52642: checking for max_fail_percentage 24134 1727096405.52645: done checking for max_fail_percentage 24134 1727096405.52647: checking to see if all hosts have failed and the running result is not ok 24134 1727096405.52647: done checking to see if all hosts have failed 24134 1727096405.52648: getting the remaining hosts for this loop 24134 1727096405.52650: done getting the remaining hosts for this loop 24134 1727096405.52654: getting the next task for host managed_node1 24134 1727096405.52660: done getting next task for host managed_node1 24134 1727096405.52663: ^ task is: TASK: Create tap interface {{ interface }} 24134 1727096405.52669: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096405.52675: getting variables 24134 1727096405.52773: in VariableManager get_vars() 24134 1727096405.52823: Calling all_inventory to load vars for managed_node1 24134 1727096405.52827: Calling groups_inventory to load vars for managed_node1 24134 1727096405.52829: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.52843: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.52846: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.52849: Calling groups_plugins_play to load vars for managed_node1 24134 1727096405.53563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.53941: done with get_vars() 24134 1727096405.53951: done getting variables 24134 1727096405.54128: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24134 1727096405.54471: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 09:00:05 -0400 (0:00:00.049) 0:00:09.758 ****** 24134 1727096405.54500: entering _queue_task() for managed_node1/command 24134 1727096405.54878: worker is 1 (out of 1 available) 24134 1727096405.55116: exiting _queue_task() for managed_node1/command 24134 1727096405.55129: done queuing things up, now waiting for results queue to drain 24134 1727096405.55130: waiting for pending results... 
24134 1727096405.55281: running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest0 24134 1727096405.55464: in run() - task 0afff68d-5257-1673-d3fc-0000000001b7 24134 1727096405.55482: variable 'ansible_search_path' from source: unknown 24134 1727096405.55487: variable 'ansible_search_path' from source: unknown 24134 1727096405.55520: calling self._execute() 24134 1727096405.55802: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.55807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.55817: variable 'omit' from source: magic vars 24134 1727096405.56359: variable 'ansible_distribution_major_version' from source: facts 24134 1727096405.56425: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096405.56983: variable 'type' from source: set_fact 24134 1727096405.56988: variable 'state' from source: include params 24134 1727096405.56992: variable 'interface' from source: set_fact 24134 1727096405.56997: variable 'current_interfaces' from source: set_fact 24134 1727096405.57005: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 24134 1727096405.57008: when evaluation is False, skipping this task 24134 1727096405.57010: _execute() done 24134 1727096405.57012: dumping result to json 24134 1727096405.57015: done dumping result, returning 24134 1727096405.57097: done running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest0 [0afff68d-5257-1673-d3fc-0000000001b7] 24134 1727096405.57104: sending task result for task 0afff68d-5257-1673-d3fc-0000000001b7 24134 1727096405.57161: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001b7 24134 1727096405.57163: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 24134 1727096405.57219: no more pending results, returning what we have 24134 1727096405.57222: results queue empty 24134 1727096405.57223: checking for any_errors_fatal 24134 1727096405.57226: done checking for any_errors_fatal 24134 1727096405.57227: checking for max_fail_percentage 24134 1727096405.57228: done checking for max_fail_percentage 24134 1727096405.57229: checking to see if all hosts have failed and the running result is not ok 24134 1727096405.57230: done checking to see if all hosts have failed 24134 1727096405.57231: getting the remaining hosts for this loop 24134 1727096405.57232: done getting the remaining hosts for this loop 24134 1727096405.57235: getting the next task for host managed_node1 24134 1727096405.57240: done getting next task for host managed_node1 24134 1727096405.57242: ^ task is: TASK: Delete tap interface {{ interface }} 24134 1727096405.57245: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096405.57249: getting variables 24134 1727096405.57250: in VariableManager get_vars() 24134 1727096405.57283: Calling all_inventory to load vars for managed_node1 24134 1727096405.57351: Calling groups_inventory to load vars for managed_node1 24134 1727096405.57354: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.57363: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.57366: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.57374: Calling groups_plugins_play to load vars for managed_node1 24134 1727096405.57608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.57806: done with get_vars() 24134 1727096405.57816: done getting variables 24134 1727096405.57876: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24134 1727096405.57987: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 09:00:05 -0400 (0:00:00.035) 0:00:09.793 ****** 24134 1727096405.58016: entering _queue_task() for managed_node1/command 24134 1727096405.58275: worker is 1 (out of 1 available) 24134 1727096405.58288: exiting _queue_task() for managed_node1/command 24134 1727096405.58299: done queuing things up, now waiting for results queue to drain 24134 1727096405.58300: waiting for pending results... 
24134 1727096405.58618: running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest0 24134 1727096405.58624: in run() - task 0afff68d-5257-1673-d3fc-0000000001b8 24134 1727096405.58627: variable 'ansible_search_path' from source: unknown 24134 1727096405.58630: variable 'ansible_search_path' from source: unknown 24134 1727096405.58640: calling self._execute() 24134 1727096405.58721: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.58726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.58735: variable 'omit' from source: magic vars 24134 1727096405.59107: variable 'ansible_distribution_major_version' from source: facts 24134 1727096405.59183: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096405.59554: variable 'type' from source: set_fact 24134 1727096405.59558: variable 'state' from source: include params 24134 1727096405.59586: variable 'interface' from source: set_fact 24134 1727096405.59589: variable 'current_interfaces' from source: set_fact 24134 1727096405.59693: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 24134 1727096405.59696: when evaluation is False, skipping this task 24134 1727096405.59700: _execute() done 24134 1727096405.59702: dumping result to json 24134 1727096405.59705: done dumping result, returning 24134 1727096405.59707: done running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest0 [0afff68d-5257-1673-d3fc-0000000001b8] 24134 1727096405.59708: sending task result for task 0afff68d-5257-1673-d3fc-0000000001b8 24134 1727096405.59766: done sending task result for task 0afff68d-5257-1673-d3fc-0000000001b8 24134 1727096405.59772: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
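The dummy and tap create/delete tasks skipped above are gated by the same `type`/`state`/`current_interfaces` pattern, with the membership test flipped between create (`not in`) and delete (`in`). A generic sketch of that pattern, with hypothetical `command` lines:

```yaml
- name: "Create dummy interface {{ interface }}"
  command: ip link add {{ interface }} type dummy  # hypothetical command
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: "Delete tap interface {{ interface }}"
  command: ip link del {{ interface }}  # hypothetical command
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
```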
24134 1727096405.59823: no more pending results, returning what we have 24134 1727096405.59827: results queue empty 24134 1727096405.59828: checking for any_errors_fatal 24134 1727096405.59834: done checking for any_errors_fatal 24134 1727096405.59835: checking for max_fail_percentage 24134 1727096405.59836: done checking for max_fail_percentage 24134 1727096405.59837: checking to see if all hosts have failed and the running result is not ok 24134 1727096405.59838: done checking to see if all hosts have failed 24134 1727096405.59839: getting the remaining hosts for this loop 24134 1727096405.59841: done getting the remaining hosts for this loop 24134 1727096405.59845: getting the next task for host managed_node1 24134 1727096405.59853: done getting next task for host managed_node1 24134 1727096405.59858: ^ task is: TASK: Include the task 'assert_device_present.yml' 24134 1727096405.59861: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096405.59866: getting variables 24134 1727096405.59871: in VariableManager get_vars() 24134 1727096405.59911: Calling all_inventory to load vars for managed_node1 24134 1727096405.59914: Calling groups_inventory to load vars for managed_node1 24134 1727096405.59917: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.59930: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.59933: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.59936: Calling groups_plugins_play to load vars for managed_node1 24134 1727096405.60277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.60475: done with get_vars() 24134 1727096405.60485: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:20 Monday 23 September 2024 09:00:05 -0400 (0:00:00.025) 0:00:09.819 ****** 24134 1727096405.60576: entering _queue_task() for managed_node1/include_tasks 24134 1727096405.61001: worker is 1 (out of 1 available) 24134 1727096405.61010: exiting _queue_task() for managed_node1/include_tasks 24134 1727096405.61019: done queuing things up, now waiting for results queue to drain 24134 1727096405.61021: waiting for pending results... 
24134 1727096405.61320: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 24134 1727096405.61325: in run() - task 0afff68d-5257-1673-d3fc-00000000000e 24134 1727096405.61328: variable 'ansible_search_path' from source: unknown 24134 1727096405.61331: calling self._execute() 24134 1727096405.61334: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.61336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.61339: variable 'omit' from source: magic vars 24134 1727096405.61739: variable 'ansible_distribution_major_version' from source: facts 24134 1727096405.61750: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096405.61755: _execute() done 24134 1727096405.61759: dumping result to json 24134 1727096405.61761: done dumping result, returning 24134 1727096405.61771: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [0afff68d-5257-1673-d3fc-00000000000e] 24134 1727096405.61780: sending task result for task 0afff68d-5257-1673-d3fc-00000000000e 24134 1727096405.61878: done sending task result for task 0afff68d-5257-1673-d3fc-00000000000e 24134 1727096405.61881: WORKER PROCESS EXITING 24134 1727096405.61921: no more pending results, returning what we have 24134 1727096405.61925: in VariableManager get_vars() 24134 1727096405.61966: Calling all_inventory to load vars for managed_node1 24134 1727096405.61974: Calling groups_inventory to load vars for managed_node1 24134 1727096405.61977: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.61990: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.61993: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.61996: Calling groups_plugins_play to load vars for managed_node1 24134 1727096405.62558: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.62745: done with get_vars() 24134 1727096405.62751: variable 'ansible_search_path' from source: unknown 24134 1727096405.62763: we have included files to process 24134 1727096405.62764: generating all_blocks data 24134 1727096405.62765: done generating all_blocks data 24134 1727096405.62771: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24134 1727096405.62772: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24134 1727096405.62775: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24134 1727096405.62915: in VariableManager get_vars() 24134 1727096405.62932: done with get_vars() 24134 1727096405.63034: done processing included file 24134 1727096405.63035: iterating over new_blocks loaded from include file 24134 1727096405.63037: in VariableManager get_vars() 24134 1727096405.63050: done with get_vars() 24134 1727096405.63052: filtering new block on tags 24134 1727096405.63072: done filtering new block on tags 24134 1727096405.63074: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 24134 1727096405.63079: extending task lists for all hosts with included blocks 24134 1727096405.64655: done extending task lists 24134 1727096405.64657: done processing included files 24134 1727096405.64657: results queue empty 24134 1727096405.64658: checking for any_errors_fatal 24134 1727096405.64661: done checking for any_errors_fatal 24134 1727096405.64662: checking for max_fail_percentage 24134 1727096405.64663: done 
checking for max_fail_percentage 24134 1727096405.64663: checking to see if all hosts have failed and the running result is not ok 24134 1727096405.64664: done checking to see if all hosts have failed 24134 1727096405.64665: getting the remaining hosts for this loop 24134 1727096405.64666: done getting the remaining hosts for this loop 24134 1727096405.64683: getting the next task for host managed_node1 24134 1727096405.64688: done getting next task for host managed_node1 24134 1727096405.64690: ^ task is: TASK: Include the task 'get_interface_stat.yml' 24134 1727096405.64692: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096405.64694: getting variables 24134 1727096405.64695: in VariableManager get_vars() 24134 1727096405.64707: Calling all_inventory to load vars for managed_node1 24134 1727096405.64709: Calling groups_inventory to load vars for managed_node1 24134 1727096405.64711: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.64717: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.64719: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.64722: Calling groups_plugins_play to load vars for managed_node1 24134 1727096405.64881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.65092: done with get_vars() 24134 1727096405.65107: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 09:00:05 -0400 (0:00:00.046) 0:00:09.865 ****** 24134 1727096405.65227: entering _queue_task() for managed_node1/include_tasks 24134 1727096405.65550: worker is 1 (out of 1 available) 24134 1727096405.65564: exiting _queue_task() for managed_node1/include_tasks 24134 1727096405.65790: done queuing things up, now waiting for results queue to drain 24134 1727096405.65792: waiting for pending results... 
24134 1727096405.65994: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 24134 1727096405.65999: in run() - task 0afff68d-5257-1673-d3fc-0000000002bc 24134 1727096405.66002: variable 'ansible_search_path' from source: unknown 24134 1727096405.66005: variable 'ansible_search_path' from source: unknown 24134 1727096405.66007: calling self._execute() 24134 1727096405.66310: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.66314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.66316: variable 'omit' from source: magic vars 24134 1727096405.66679: variable 'ansible_distribution_major_version' from source: facts 24134 1727096405.66683: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096405.66794: _execute() done 24134 1727096405.66798: dumping result to json 24134 1727096405.66801: done dumping result, returning 24134 1727096405.66810: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-1673-d3fc-0000000002bc] 24134 1727096405.66861: sending task result for task 0afff68d-5257-1673-d3fc-0000000002bc 24134 1727096405.67084: done sending task result for task 0afff68d-5257-1673-d3fc-0000000002bc 24134 1727096405.67089: WORKER PROCESS EXITING 24134 1727096405.67119: no more pending results, returning what we have 24134 1727096405.67124: in VariableManager get_vars() 24134 1727096405.67165: Calling all_inventory to load vars for managed_node1 24134 1727096405.67171: Calling groups_inventory to load vars for managed_node1 24134 1727096405.67174: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.67185: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.67189: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.67192: Calling groups_plugins_play to load vars for managed_node1 24134 
1727096405.67504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.67701: done with get_vars() 24134 1727096405.67708: variable 'ansible_search_path' from source: unknown 24134 1727096405.67709: variable 'ansible_search_path' from source: unknown 24134 1727096405.67742: we have included files to process 24134 1727096405.67743: generating all_blocks data 24134 1727096405.67744: done generating all_blocks data 24134 1727096405.67746: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24134 1727096405.67747: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24134 1727096405.67749: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24134 1727096405.67966: done processing included file 24134 1727096405.67970: iterating over new_blocks loaded from include file 24134 1727096405.67971: in VariableManager get_vars() 24134 1727096405.67987: done with get_vars() 24134 1727096405.67989: filtering new block on tags 24134 1727096405.68003: done filtering new block on tags 24134 1727096405.68005: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 24134 1727096405.68009: extending task lists for all hosts with included blocks 24134 1727096405.68103: done extending task lists 24134 1727096405.68104: done processing included files 24134 1727096405.68105: results queue empty 24134 1727096405.68105: checking for any_errors_fatal 24134 1727096405.68108: done checking for any_errors_fatal 24134 1727096405.68109: checking for max_fail_percentage 24134 1727096405.68110: done checking for 
max_fail_percentage 24134 1727096405.68111: checking to see if all hosts have failed and the running result is not ok 24134 1727096405.68112: done checking to see if all hosts have failed 24134 1727096405.68112: getting the remaining hosts for this loop 24134 1727096405.68114: done getting the remaining hosts for this loop 24134 1727096405.68116: getting the next task for host managed_node1 24134 1727096405.68120: done getting next task for host managed_node1 24134 1727096405.68122: ^ task is: TASK: Get stat for interface {{ interface }} 24134 1727096405.68125: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096405.68127: getting variables 24134 1727096405.68128: in VariableManager get_vars() 24134 1727096405.68178: Calling all_inventory to load vars for managed_node1 24134 1727096405.68181: Calling groups_inventory to load vars for managed_node1 24134 1727096405.68183: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096405.68188: Calling all_plugins_play to load vars for managed_node1 24134 1727096405.68191: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096405.68194: Calling groups_plugins_play to load vars for managed_node1 24134 1727096405.68334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096405.68457: done with get_vars() 24134 1727096405.68463: done getting variables 24134 1727096405.68589: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 09:00:05 -0400 (0:00:00.033) 0:00:09.899 ****** 24134 1727096405.68612: entering _queue_task() for managed_node1/stat 24134 1727096405.68826: worker is 1 (out of 1 available) 24134 1727096405.68839: exiting _queue_task() for managed_node1/stat 24134 1727096405.68851: done queuing things up, now waiting for results queue to drain 24134 1727096405.68852: waiting for pending results... 
24134 1727096405.69009: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 24134 1727096405.69079: in run() - task 0afff68d-5257-1673-d3fc-000000000373 24134 1727096405.69091: variable 'ansible_search_path' from source: unknown 24134 1727096405.69094: variable 'ansible_search_path' from source: unknown 24134 1727096405.69121: calling self._execute() 24134 1727096405.69185: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.69197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.69200: variable 'omit' from source: magic vars 24134 1727096405.69523: variable 'ansible_distribution_major_version' from source: facts 24134 1727096405.69531: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096405.69537: variable 'omit' from source: magic vars 24134 1727096405.69569: variable 'omit' from source: magic vars 24134 1727096405.69638: variable 'interface' from source: set_fact 24134 1727096405.69653: variable 'omit' from source: magic vars 24134 1727096405.69687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096405.69714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096405.69737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096405.69746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096405.69757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096405.69796: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096405.69799: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.69802: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.69909: Set connection var ansible_shell_executable to /bin/sh 24134 1727096405.69912: Set connection var ansible_pipelining to False 24134 1727096405.69915: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096405.69923: Set connection var ansible_timeout to 10 24134 1727096405.69926: Set connection var ansible_connection to ssh 24134 1727096405.69928: Set connection var ansible_shell_type to sh 24134 1727096405.69958: variable 'ansible_shell_executable' from source: unknown 24134 1727096405.69962: variable 'ansible_connection' from source: unknown 24134 1727096405.69971: variable 'ansible_module_compression' from source: unknown 24134 1727096405.69978: variable 'ansible_shell_type' from source: unknown 24134 1727096405.69981: variable 'ansible_shell_executable' from source: unknown 24134 1727096405.69983: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096405.70009: variable 'ansible_pipelining' from source: unknown 24134 1727096405.70012: variable 'ansible_timeout' from source: unknown 24134 1727096405.70014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096405.70375: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096405.70380: variable 'omit' from source: magic vars 24134 1727096405.70383: starting attempt loop 24134 1727096405.70385: running the handler 24134 1727096405.70387: _low_level_execute_command(): starting 24134 1727096405.70389: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096405.71488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096405.71564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096405.71585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.71661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.73410: stdout chunk (state=3): >>>/root <<< 24134 1727096405.73506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096405.73531: stderr chunk (state=3): >>><<< 24134 1727096405.73534: stdout chunk (state=3): >>><<< 24134 1727096405.73555: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096405.73574: _low_level_execute_command(): starting 24134 1727096405.73581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607 `" && echo ansible-tmp-1727096405.7355707-24686-200695137658607="` echo /root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607 `" ) && sleep 0' 24134 1727096405.74023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096405.74027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.74030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 
1727096405.74040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.74076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096405.74080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.74158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.76196: stdout chunk (state=3): >>>ansible-tmp-1727096405.7355707-24686-200695137658607=/root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607 <<< 24134 1727096405.76304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096405.76332: stderr chunk (state=3): >>><<< 24134 1727096405.76334: stdout chunk (state=3): >>><<< 24134 1727096405.76345: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096405.7355707-24686-200695137658607=/root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096405.76405: variable 'ansible_module_compression' from source: unknown 24134 1727096405.76447: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24134 1727096405.76482: variable 'ansible_facts' from source: unknown 24134 1727096405.76542: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/AnsiballZ_stat.py 24134 1727096405.76644: Sending initial data 24134 1727096405.76648: Sent initial data (153 bytes) 24134 1727096405.77064: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096405.77097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096405.77100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.77102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096405.77104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096405.77106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.77154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096405.77158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.77256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.78942: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096405.79026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096405.79073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmp0la8srl8 /root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/AnsiballZ_stat.py <<< 24134 1727096405.79080: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/AnsiballZ_stat.py" <<< 24134 1727096405.79196: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmp0la8srl8" to remote "/root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/AnsiballZ_stat.py" <<< 24134 1727096405.80296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096405.80349: stderr chunk (state=3): >>><<< 24134 1727096405.80352: stdout chunk (state=3): >>><<< 24134 1727096405.80379: done transferring module to remote 24134 1727096405.80388: _low_level_execute_command(): starting 24134 1727096405.80393: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/ /root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/AnsiballZ_stat.py && sleep 0' 24134 1727096405.81092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.81173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096405.81193: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.81226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096405.81256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096405.81291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.81384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.83291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096405.83328: stderr chunk (state=3): >>><<< 24134 1727096405.83335: stdout chunk (state=3): >>><<< 24134 1727096405.83356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096405.83359: _low_level_execute_command(): starting 24134 1727096405.83361: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/AnsiballZ_stat.py && sleep 0' 24134 1727096405.83891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.83917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096405.83931: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096405.83943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096405.84015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096405.84019: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 24134 1727096405.84058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096405.84119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096405.99841: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28977, "dev": 23, "nlink": 1, "atime": 1727096403.952705, "mtime": 1727096403.952705, "ctime": 1727096403.952705, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24134 1727096406.01263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096406.01298: stderr chunk (state=3): >>><<< 24134 1727096406.01301: stdout chunk (state=3): >>><<< 24134 1727096406.01322: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28977, "dev": 23, "nlink": 1, "atime": 1727096403.952705, "mtime": 1727096403.952705, "ctime": 1727096403.952705, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096406.01357: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096406.01365: _low_level_execute_command(): starting 24134 1727096406.01371: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096405.7355707-24686-200695137658607/ > /dev/null 2>&1 && sleep 0' 24134 1727096406.02001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096406.02005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096406.02007: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096406.02010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096406.02053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096406.02118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096406.04103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096406.04151: stderr chunk (state=3): >>><<< 24134 1727096406.04155: stdout chunk (state=3): >>><<< 24134 1727096406.04194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096406.04197: handler run complete 24134 1727096406.04254: attempt loop complete, returning result 24134 1727096406.04264: _execute() done 24134 1727096406.04266: dumping result to json 24134 1727096406.04271: done dumping result, returning 24134 1727096406.04273: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 [0afff68d-5257-1673-d3fc-000000000373] 24134 1727096406.04275: sending task result for task 0afff68d-5257-1673-d3fc-000000000373 ok: [managed_node1] => { "changed": false, "stat": { "atime": 1727096403.952705, "block_size": 4096, "blocks": 0, "ctime": 1727096403.952705, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28977, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727096403.952705, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 24134 1727096406.04524: no more pending results, returning what we have 24134 1727096406.04528: results queue empty 24134 1727096406.04529: checking for any_errors_fatal 24134 1727096406.04530: done checking for any_errors_fatal 24134 1727096406.04531: checking for max_fail_percentage 24134 1727096406.04532: done checking for 
max_fail_percentage 24134 1727096406.04533: checking to see if all hosts have failed and the running result is not ok 24134 1727096406.04536: done checking to see if all hosts have failed 24134 1727096406.04537: getting the remaining hosts for this loop 24134 1727096406.04541: done getting the remaining hosts for this loop 24134 1727096406.04547: getting the next task for host managed_node1 24134 1727096406.04556: done getting next task for host managed_node1 24134 1727096406.04559: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 24134 1727096406.04562: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096406.04566: getting variables 24134 1727096406.04571: in VariableManager get_vars() 24134 1727096406.04717: Calling all_inventory to load vars for managed_node1 24134 1727096406.04720: Calling groups_inventory to load vars for managed_node1 24134 1727096406.04723: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096406.04733: Calling all_plugins_play to load vars for managed_node1 24134 1727096406.04736: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096406.04741: Calling groups_plugins_play to load vars for managed_node1 24134 1727096406.04996: done sending task result for task 0afff68d-5257-1673-d3fc-000000000373 24134 1727096406.05000: WORKER PROCESS EXITING 24134 1727096406.05024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096406.05246: done with get_vars() 24134 1727096406.05257: done getting variables 24134 1727096406.05379: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 24134 1727096406.05519: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 09:00:06 -0400 (0:00:00.369) 0:00:10.269 ****** 24134 1727096406.05572: entering _queue_task() for managed_node1/assert 24134 1727096406.05575: Creating lock for assert 24134 1727096406.05924: worker is 1 (out of 1 available) 24134 1727096406.05941: exiting _queue_task() for managed_node1/assert 24134 1727096406.05955: done queuing things up, now waiting for results queue to drain 24134 
1727096406.05956: waiting for pending results... 24134 1727096406.06191: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest0' 24134 1727096406.06291: in run() - task 0afff68d-5257-1673-d3fc-0000000002bd 24134 1727096406.06295: variable 'ansible_search_path' from source: unknown 24134 1727096406.06297: variable 'ansible_search_path' from source: unknown 24134 1727096406.06328: calling self._execute() 24134 1727096406.06554: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.06558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.06562: variable 'omit' from source: magic vars 24134 1727096406.06983: variable 'ansible_distribution_major_version' from source: facts 24134 1727096406.06986: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096406.06989: variable 'omit' from source: magic vars 24134 1727096406.06995: variable 'omit' from source: magic vars 24134 1727096406.07213: variable 'interface' from source: set_fact 24134 1727096406.07231: variable 'omit' from source: magic vars 24134 1727096406.07273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096406.07310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096406.07329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096406.07345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096406.07403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096406.07407: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096406.07409: variable 'ansible_host' from source: host vars 
for 'managed_node1' 24134 1727096406.07411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.07511: Set connection var ansible_shell_executable to /bin/sh 24134 1727096406.07514: Set connection var ansible_pipelining to False 24134 1727096406.07524: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096406.07530: Set connection var ansible_timeout to 10 24134 1727096406.07533: Set connection var ansible_connection to ssh 24134 1727096406.07535: Set connection var ansible_shell_type to sh 24134 1727096406.07558: variable 'ansible_shell_executable' from source: unknown 24134 1727096406.07562: variable 'ansible_connection' from source: unknown 24134 1727096406.07564: variable 'ansible_module_compression' from source: unknown 24134 1727096406.07566: variable 'ansible_shell_type' from source: unknown 24134 1727096406.07570: variable 'ansible_shell_executable' from source: unknown 24134 1727096406.07581: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.07620: variable 'ansible_pipelining' from source: unknown 24134 1727096406.07624: variable 'ansible_timeout' from source: unknown 24134 1727096406.07626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.07730: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096406.07740: variable 'omit' from source: magic vars 24134 1727096406.07743: starting attempt loop 24134 1727096406.07746: running the handler 24134 1727096406.08063: variable 'interface_stat' from source: set_fact 24134 1727096406.08066: Evaluated conditional (interface_stat.stat.exists): True 24134 1727096406.08071: handler run complete 24134 1727096406.08073: 
attempt loop complete, returning result 24134 1727096406.08075: _execute() done 24134 1727096406.08077: dumping result to json 24134 1727096406.08078: done dumping result, returning 24134 1727096406.08080: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest0' [0afff68d-5257-1673-d3fc-0000000002bd] 24134 1727096406.08083: sending task result for task 0afff68d-5257-1673-d3fc-0000000002bd 24134 1727096406.08141: done sending task result for task 0afff68d-5257-1673-d3fc-0000000002bd 24134 1727096406.08144: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 24134 1727096406.08198: no more pending results, returning what we have 24134 1727096406.08201: results queue empty 24134 1727096406.08202: checking for any_errors_fatal 24134 1727096406.08210: done checking for any_errors_fatal 24134 1727096406.08211: checking for max_fail_percentage 24134 1727096406.08212: done checking for max_fail_percentage 24134 1727096406.08213: checking to see if all hosts have failed and the running result is not ok 24134 1727096406.08214: done checking to see if all hosts have failed 24134 1727096406.08214: getting the remaining hosts for this loop 24134 1727096406.08216: done getting the remaining hosts for this loop 24134 1727096406.08219: getting the next task for host managed_node1 24134 1727096406.08225: done getting next task for host managed_node1 24134 1727096406.08227: ^ task is: TASK: Initialize the connection_failed flag 24134 1727096406.08229: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096406.08233: getting variables 24134 1727096406.08234: in VariableManager get_vars() 24134 1727096406.08267: Calling all_inventory to load vars for managed_node1 24134 1727096406.08273: Calling groups_inventory to load vars for managed_node1 24134 1727096406.08275: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096406.08284: Calling all_plugins_play to load vars for managed_node1 24134 1727096406.08287: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096406.08289: Calling groups_plugins_play to load vars for managed_node1 24134 1727096406.08591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096406.08740: done with get_vars() 24134 1727096406.08749: done getting variables 24134 1727096406.08795: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize the connection_failed flag] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:23 Monday 23 September 2024 09:00:06 -0400 (0:00:00.032) 0:00:10.301 ****** 24134 1727096406.08816: entering _queue_task() for managed_node1/set_fact 24134 1727096406.09046: worker is 1 (out of 1 available) 24134 1727096406.09059: exiting _queue_task() for managed_node1/set_fact 24134 1727096406.09072: done queuing things up, now waiting for results queue to drain 24134 1727096406.09074: waiting for pending results... 
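The two tasks traced above (`TASK [Get stat for interface ethtest0]` and `TASK [Assert that the interface is present - 'ethtest0']`) correspond to a pair of tasks of roughly the following shape. This is a hedged reconstruction inferred from the `module_args` dictionary and the evaluated conditional (`interface_stat.stat.exists`) in the log, not the verbatim contents of `assert_device_present.yml`; the register name `interface_stat` is taken from the variable the assert reads.

```yaml
# Hypothetical reconstruction of the stat/assert pair from
# tasks/assert_device_present.yml, inferred from the log above.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # path seen in module_args
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists            # conditional evaluated in the log
```

On a virtual interface, `/sys/class/net/<name>` is a symlink into `/sys/devices/virtual/net/`, which matches the `islnk: true` and `lnk_target` fields in the stat result above.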
24134 1727096406.09232: running TaskExecutor() for managed_node1/TASK: Initialize the connection_failed flag 24134 1727096406.09296: in run() - task 0afff68d-5257-1673-d3fc-00000000000f 24134 1727096406.09312: variable 'ansible_search_path' from source: unknown 24134 1727096406.09336: calling self._execute() 24134 1727096406.09401: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.09405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.09414: variable 'omit' from source: magic vars 24134 1727096406.09693: variable 'ansible_distribution_major_version' from source: facts 24134 1727096406.09703: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096406.09708: variable 'omit' from source: magic vars 24134 1727096406.09723: variable 'omit' from source: magic vars 24134 1727096406.09751: variable 'omit' from source: magic vars 24134 1727096406.09795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096406.09823: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096406.09838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096406.09854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096406.09864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096406.09891: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096406.09894: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.09897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.09966: Set connection var ansible_shell_executable to /bin/sh 24134 
1727096406.09971: Set connection var ansible_pipelining to False 24134 1727096406.09980: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096406.09988: Set connection var ansible_timeout to 10 24134 1727096406.09991: Set connection var ansible_connection to ssh 24134 1727096406.09993: Set connection var ansible_shell_type to sh 24134 1727096406.10008: variable 'ansible_shell_executable' from source: unknown 24134 1727096406.10012: variable 'ansible_connection' from source: unknown 24134 1727096406.10014: variable 'ansible_module_compression' from source: unknown 24134 1727096406.10017: variable 'ansible_shell_type' from source: unknown 24134 1727096406.10019: variable 'ansible_shell_executable' from source: unknown 24134 1727096406.10021: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.10024: variable 'ansible_pipelining' from source: unknown 24134 1727096406.10026: variable 'ansible_timeout' from source: unknown 24134 1727096406.10030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.10135: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096406.10143: variable 'omit' from source: magic vars 24134 1727096406.10148: starting attempt loop 24134 1727096406.10151: running the handler 24134 1727096406.10161: handler run complete 24134 1727096406.10170: attempt loop complete, returning result 24134 1727096406.10178: _execute() done 24134 1727096406.10180: dumping result to json 24134 1727096406.10183: done dumping result, returning 24134 1727096406.10190: done running TaskExecutor() for managed_node1/TASK: Initialize the connection_failed flag [0afff68d-5257-1673-d3fc-00000000000f] 24134 
1727096406.10193: sending task result for task 0afff68d-5257-1673-d3fc-00000000000f 24134 1727096406.10266: done sending task result for task 0afff68d-5257-1673-d3fc-00000000000f 24134 1727096406.10271: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "connection_failed": false }, "changed": false } 24134 1727096406.10329: no more pending results, returning what we have 24134 1727096406.10332: results queue empty 24134 1727096406.10333: checking for any_errors_fatal 24134 1727096406.10337: done checking for any_errors_fatal 24134 1727096406.10338: checking for max_fail_percentage 24134 1727096406.10340: done checking for max_fail_percentage 24134 1727096406.10341: checking to see if all hosts have failed and the running result is not ok 24134 1727096406.10342: done checking to see if all hosts have failed 24134 1727096406.10342: getting the remaining hosts for this loop 24134 1727096406.10344: done getting the remaining hosts for this loop 24134 1727096406.10347: getting the next task for host managed_node1 24134 1727096406.10353: done getting next task for host managed_node1 24134 1727096406.10357: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24134 1727096406.10361: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096406.10377: getting variables 24134 1727096406.10379: in VariableManager get_vars() 24134 1727096406.10412: Calling all_inventory to load vars for managed_node1 24134 1727096406.10415: Calling groups_inventory to load vars for managed_node1 24134 1727096406.10417: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096406.10424: Calling all_plugins_play to load vars for managed_node1 24134 1727096406.10427: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096406.10429: Calling groups_plugins_play to load vars for managed_node1 24134 1727096406.10558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096406.10692: done with get_vars() 24134 1727096406.10700: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:00:06 -0400 (0:00:00.019) 0:00:10.321 ****** 24134 1727096406.10762: entering _queue_task() for managed_node1/include_tasks 24134 1727096406.10972: worker is 1 (out of 1 available) 24134 1727096406.10986: exiting _queue_task() for managed_node1/include_tasks 24134 1727096406.10997: done queuing things up, now waiting for results queue to drain 24134 1727096406.10998: waiting for pending results... 
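The `TASK [Initialize the connection_failed flag]` entry above is a `set_fact` action whose result (`ansible_facts: {"connection_failed": false}`) is printed in the log. A minimal sketch of the task at `tests_ipv6_disabled.yml:23`, assuming it does nothing beyond setting that one fact:

```yaml
# Hypothetical sketch of tests_ipv6_disabled.yml:23, inferred from the
# returned ansible_facts in the log above.
- name: Initialize the connection_failed flag
  ansible.builtin.set_fact:
    connection_failed: false
```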
24134 1727096406.11162: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24134 1727096406.11375: in run() - task 0afff68d-5257-1673-d3fc-000000000017 24134 1727096406.11379: variable 'ansible_search_path' from source: unknown 24134 1727096406.11381: variable 'ansible_search_path' from source: unknown 24134 1727096406.11384: calling self._execute() 24134 1727096406.11430: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.11442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.11455: variable 'omit' from source: magic vars 24134 1727096406.11807: variable 'ansible_distribution_major_version' from source: facts 24134 1727096406.11822: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096406.11831: _execute() done 24134 1727096406.11837: dumping result to json 24134 1727096406.11842: done dumping result, returning 24134 1727096406.11852: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-1673-d3fc-000000000017] 24134 1727096406.11859: sending task result for task 0afff68d-5257-1673-d3fc-000000000017 24134 1727096406.12009: no more pending results, returning what we have 24134 1727096406.12013: in VariableManager get_vars() 24134 1727096406.12052: Calling all_inventory to load vars for managed_node1 24134 1727096406.12054: Calling groups_inventory to load vars for managed_node1 24134 1727096406.12056: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096406.12074: Calling all_plugins_play to load vars for managed_node1 24134 1727096406.12078: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096406.12082: Calling groups_plugins_play to load vars for managed_node1 24134 1727096406.12418: done sending task result for task 0afff68d-5257-1673-d3fc-000000000017 24134 
1727096406.12422: WORKER PROCESS EXITING 24134 1727096406.12435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096406.12629: done with get_vars() 24134 1727096406.12637: variable 'ansible_search_path' from source: unknown 24134 1727096406.12638: variable 'ansible_search_path' from source: unknown 24134 1727096406.12681: we have included files to process 24134 1727096406.12682: generating all_blocks data 24134 1727096406.12684: done generating all_blocks data 24134 1727096406.12687: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24134 1727096406.12688: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24134 1727096406.12690: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24134 1727096406.13364: done processing included file 24134 1727096406.13366: iterating over new_blocks loaded from include file 24134 1727096406.13371: in VariableManager get_vars() 24134 1727096406.13394: done with get_vars() 24134 1727096406.13396: filtering new block on tags 24134 1727096406.13414: done filtering new block on tags 24134 1727096406.13417: in VariableManager get_vars() 24134 1727096406.13438: done with get_vars() 24134 1727096406.13439: filtering new block on tags 24134 1727096406.13462: done filtering new block on tags 24134 1727096406.13465: in VariableManager get_vars() 24134 1727096406.13489: done with get_vars() 24134 1727096406.13491: filtering new block on tags 24134 1727096406.13509: done filtering new block on tags 24134 1727096406.13511: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 24134 1727096406.13516: extending task lists for all hosts 
with included blocks 24134 1727096406.14327: done extending task lists 24134 1727096406.14328: done processing included files 24134 1727096406.14329: results queue empty 24134 1727096406.14330: checking for any_errors_fatal 24134 1727096406.14332: done checking for any_errors_fatal 24134 1727096406.14333: checking for max_fail_percentage 24134 1727096406.14334: done checking for max_fail_percentage 24134 1727096406.14335: checking to see if all hosts have failed and the running result is not ok 24134 1727096406.14336: done checking to see if all hosts have failed 24134 1727096406.14337: getting the remaining hosts for this loop 24134 1727096406.14338: done getting the remaining hosts for this loop 24134 1727096406.14341: getting the next task for host managed_node1 24134 1727096406.14345: done getting next task for host managed_node1 24134 1727096406.14348: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24134 1727096406.14351: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096406.14360: getting variables 24134 1727096406.14361: in VariableManager get_vars() 24134 1727096406.14380: Calling all_inventory to load vars for managed_node1 24134 1727096406.14382: Calling groups_inventory to load vars for managed_node1 24134 1727096406.14385: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096406.14390: Calling all_plugins_play to load vars for managed_node1 24134 1727096406.14393: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096406.14396: Calling groups_plugins_play to load vars for managed_node1 24134 1727096406.14572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096406.14764: done with get_vars() 24134 1727096406.14778: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:00:06 -0400 (0:00:00.040) 0:00:10.361 ****** 24134 1727096406.14854: entering _queue_task() for managed_node1/setup 24134 1727096406.15188: worker is 1 (out of 1 available) 24134 1727096406.15201: exiting _queue_task() for managed_node1/setup 24134 1727096406.15212: done queuing things up, now waiting for results queue to drain 24134 1727096406.15214: waiting for pending results... 
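The `TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role]` entry is an `include_tasks` action: the log shows the role loading and block-filtering `roles/network/tasks/set_facts.yml`. A minimal sketch of what `roles/network/tasks/main.yml:4` likely looks like (hedged; the real task may carry tags or other options not visible in the log):

```yaml
# Hypothetical sketch of roles/network/tasks/main.yml:4, inferred from
# the include processing traced in the log above.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
```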
24134 1727096406.15482: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24134 1727096406.15633: in run() - task 0afff68d-5257-1673-d3fc-00000000038e 24134 1727096406.15684: variable 'ansible_search_path' from source: unknown 24134 1727096406.15688: variable 'ansible_search_path' from source: unknown 24134 1727096406.15704: calling self._execute() 24134 1727096406.15783: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.15799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.15872: variable 'omit' from source: magic vars 24134 1727096406.16182: variable 'ansible_distribution_major_version' from source: facts 24134 1727096406.16197: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096406.16404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096406.18511: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096406.18585: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096406.18627: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096406.18663: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096406.18700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096406.18787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096406.18835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096406.18858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096406.18944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096406.18947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096406.18990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096406.19017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096406.19041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096406.19090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096406.19160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096406.19574: variable '__network_required_facts' from source: role 
'' defaults 24134 1727096406.19596: variable 'ansible_facts' from source: unknown 24134 1727096406.19700: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24134 1727096406.19708: when evaluation is False, skipping this task 24134 1727096406.19715: _execute() done 24134 1727096406.19776: dumping result to json 24134 1727096406.19779: done dumping result, returning 24134 1727096406.19782: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-1673-d3fc-00000000038e] 24134 1727096406.19784: sending task result for task 0afff68d-5257-1673-d3fc-00000000038e 24134 1727096406.19871: done sending task result for task 0afff68d-5257-1673-d3fc-00000000038e 24134 1727096406.19875: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096406.19921: no more pending results, returning what we have 24134 1727096406.19924: results queue empty 24134 1727096406.19926: checking for any_errors_fatal 24134 1727096406.19927: done checking for any_errors_fatal 24134 1727096406.19927: checking for max_fail_percentage 24134 1727096406.19929: done checking for max_fail_percentage 24134 1727096406.19930: checking to see if all hosts have failed and the running result is not ok 24134 1727096406.19930: done checking to see if all hosts have failed 24134 1727096406.19931: getting the remaining hosts for this loop 24134 1727096406.19933: done getting the remaining hosts for this loop 24134 1727096406.19937: getting the next task for host managed_node1 24134 1727096406.19947: done getting next task for host managed_node1 24134 1727096406.19951: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24134 1727096406.19956: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096406.19976: getting variables 24134 1727096406.19978: in VariableManager get_vars() 24134 1727096406.20017: Calling all_inventory to load vars for managed_node1 24134 1727096406.20021: Calling groups_inventory to load vars for managed_node1 24134 1727096406.20023: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096406.20033: Calling all_plugins_play to load vars for managed_node1 24134 1727096406.20037: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096406.20040: Calling groups_plugins_play to load vars for managed_node1 24134 1727096406.20421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096406.20984: done with get_vars() 24134 1727096406.20996: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:00:06 -0400 (0:00:00.062) 0:00:10.424 ****** 24134 1727096406.21109: entering _queue_task() for managed_node1/stat 24134 1727096406.21599: worker is 1 (out of 1 
available) 24134 1727096406.21612: exiting _queue_task() for managed_node1/stat 24134 1727096406.21624: done queuing things up, now waiting for results queue to drain 24134 1727096406.21625: waiting for pending results... 24134 1727096406.22288: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 24134 1727096406.22315: in run() - task 0afff68d-5257-1673-d3fc-000000000390 24134 1727096406.22335: variable 'ansible_search_path' from source: unknown 24134 1727096406.22344: variable 'ansible_search_path' from source: unknown 24134 1727096406.22393: calling self._execute() 24134 1727096406.22481: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.22577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.22580: variable 'omit' from source: magic vars 24134 1727096406.22902: variable 'ansible_distribution_major_version' from source: facts 24134 1727096406.22921: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096406.23103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096406.23396: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096406.23445: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096406.23492: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096406.23528: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096406.23622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096406.23654: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096406.23697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096406.23729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096406.23975: variable '__network_is_ostree' from source: set_fact 24134 1727096406.23979: Evaluated conditional (not __network_is_ostree is defined): False 24134 1727096406.23982: when evaluation is False, skipping this task 24134 1727096406.23984: _execute() done 24134 1727096406.23986: dumping result to json 24134 1727096406.23988: done dumping result, returning 24134 1727096406.23991: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-1673-d3fc-000000000390] 24134 1727096406.23993: sending task result for task 0afff68d-5257-1673-d3fc-000000000390 24134 1727096406.24060: done sending task result for task 0afff68d-5257-1673-d3fc-000000000390 24134 1727096406.24063: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24134 1727096406.24120: no more pending results, returning what we have 24134 1727096406.24124: results queue empty 24134 1727096406.24125: checking for any_errors_fatal 24134 1727096406.24132: done checking for any_errors_fatal 24134 1727096406.24133: checking for max_fail_percentage 24134 1727096406.24135: done checking for max_fail_percentage 24134 1727096406.24136: checking to see if all hosts have failed and the running result is not ok 24134 
1727096406.24137: done checking to see if all hosts have failed 24134 1727096406.24138: getting the remaining hosts for this loop 24134 1727096406.24139: done getting the remaining hosts for this loop 24134 1727096406.24143: getting the next task for host managed_node1 24134 1727096406.24150: done getting next task for host managed_node1 24134 1727096406.24154: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24134 1727096406.24159: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096406.24180: getting variables 24134 1727096406.24182: in VariableManager get_vars() 24134 1727096406.24221: Calling all_inventory to load vars for managed_node1 24134 1727096406.24224: Calling groups_inventory to load vars for managed_node1 24134 1727096406.24227: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096406.24237: Calling all_plugins_play to load vars for managed_node1 24134 1727096406.24241: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096406.24245: Calling groups_plugins_play to load vars for managed_node1 24134 1727096406.24609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096406.24817: done with get_vars() 24134 1727096406.24826: done getting variables 24134 1727096406.25219: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:00:06 -0400 (0:00:00.041) 0:00:10.465 ****** 24134 1727096406.25255: entering _queue_task() for managed_node1/set_fact 24134 1727096406.25627: worker is 1 (out of 1 available) 24134 1727096406.25642: exiting _queue_task() for managed_node1/set_fact 24134 1727096406.25655: done queuing things up, now waiting for results queue to drain 24134 1727096406.25656: waiting for pending results... 
24134 1727096406.26318: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24134 1727096406.26574: in run() - task 0afff68d-5257-1673-d3fc-000000000391 24134 1727096406.26587: variable 'ansible_search_path' from source: unknown 24134 1727096406.26591: variable 'ansible_search_path' from source: unknown 24134 1727096406.26623: calling self._execute() 24134 1727096406.26797: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.26803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.26813: variable 'omit' from source: magic vars 24134 1727096406.27202: variable 'ansible_distribution_major_version' from source: facts 24134 1727096406.27211: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096406.27377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096406.27926: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096406.28017: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096406.28022: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096406.28052: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096406.28142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096406.28175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096406.28206: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096406.28241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096406.28343: variable '__network_is_ostree' from source: set_fact 24134 1727096406.28347: Evaluated conditional (not __network_is_ostree is defined): False 24134 1727096406.28453: when evaluation is False, skipping this task 24134 1727096406.28456: _execute() done 24134 1727096406.28459: dumping result to json 24134 1727096406.28461: done dumping result, returning 24134 1727096406.28464: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-1673-d3fc-000000000391] 24134 1727096406.28466: sending task result for task 0afff68d-5257-1673-d3fc-000000000391 24134 1727096406.28535: done sending task result for task 0afff68d-5257-1673-d3fc-000000000391 24134 1727096406.28538: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24134 1727096406.28605: no more pending results, returning what we have 24134 1727096406.28609: results queue empty 24134 1727096406.28610: checking for any_errors_fatal 24134 1727096406.28618: done checking for any_errors_fatal 24134 1727096406.28619: checking for max_fail_percentage 24134 1727096406.28620: done checking for max_fail_percentage 24134 1727096406.28621: checking to see if all hosts have failed and the running result is not ok 24134 1727096406.28622: done checking to see if all hosts have failed 24134 1727096406.28623: getting the remaining hosts for this loop 24134 1727096406.28625: done getting the remaining hosts for this loop 
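Both ostree tasks in this stretch skip on the same guard, `not __network_is_ostree is defined`: an earlier run of set_facts.yml already set the fact, so the probe is not repeated. A sketch of that run-once pattern (the stored value is illustrative):

```python
# __network_is_ostree was set by an earlier set_fact, so it is "defined".
facts = {"__network_is_ostree": False}  # illustrative value

# when: not __network_is_ostree is defined
should_run = "__network_is_ostree" not in facts
print(should_run)  # False -> both the stat probe and the set_fact skip
```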
24134 1727096406.28628: getting the next task for host managed_node1 24134 1727096406.28637: done getting next task for host managed_node1 24134 1727096406.28641: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24134 1727096406.28646: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096406.28660: getting variables 24134 1727096406.28661: in VariableManager get_vars() 24134 1727096406.28705: Calling all_inventory to load vars for managed_node1 24134 1727096406.28708: Calling groups_inventory to load vars for managed_node1 24134 1727096406.28710: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096406.28720: Calling all_plugins_play to load vars for managed_node1 24134 1727096406.28724: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096406.28727: Calling groups_plugins_play to load vars for managed_node1 24134 1727096406.29152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096406.29380: done with get_vars() 24134 1727096406.29390: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:00:06 -0400 (0:00:00.042) 0:00:10.508 ****** 24134 1727096406.29482: entering _queue_task() for managed_node1/service_facts 24134 1727096406.29484: Creating lock for service_facts 24134 1727096406.29750: worker is 1 (out of 1 available) 24134 1727096406.29763: exiting _queue_task() for managed_node1/service_facts 24134 1727096406.29882: done queuing things up, now waiting for results queue to drain 24134 1727096406.29883: waiting for pending results... 
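The service_facts task that follows executes over SSH: Ansible first probes the remote home directory (`echo ~`), then creates a private temporary directory for the module payload with a `( umask 77 && mkdir ... )` compound command, as the exchange below shows. A Python sketch of that private-tmpdir step; the paths and name suffix here are illustrative, not the real `~/.ansible/tmp/ansible-tmp-<epoch>.<frac>-<pid>-<random>` location:

```python
import os
import tempfile
import time

# Illustrative stand-in for ~/.ansible/tmp on the managed node.
tmp_root = os.path.join(tempfile.gettempdir(), "ansible-demo")
os.makedirs(tmp_root, exist_ok=True)

# Stand-in for the generated ansible-tmp-... directory name.
stamp = f"ansible-tmp-{int(time.time())}-{os.getpid()}-demo"
path = os.path.join(tmp_root, stamp)

# mode=0o700 mirrors the effect of `umask 77` in the logged command:
# the payload directory is private to the connecting user.
os.mkdir(path, mode=0o700)
print(path)
```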
24134 1727096406.30040: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 24134 1727096406.30277: in run() - task 0afff68d-5257-1673-d3fc-000000000393 24134 1727096406.30281: variable 'ansible_search_path' from source: unknown 24134 1727096406.30284: variable 'ansible_search_path' from source: unknown 24134 1727096406.30287: calling self._execute() 24134 1727096406.30337: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.30349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.30364: variable 'omit' from source: magic vars 24134 1727096406.30755: variable 'ansible_distribution_major_version' from source: facts 24134 1727096406.30779: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096406.30790: variable 'omit' from source: magic vars 24134 1727096406.30865: variable 'omit' from source: magic vars 24134 1727096406.30947: variable 'omit' from source: magic vars 24134 1727096406.30958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096406.31004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096406.31029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096406.31058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096406.31084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096406.31164: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096406.31171: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.31175: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 24134 1727096406.31246: Set connection var ansible_shell_executable to /bin/sh 24134 1727096406.31258: Set connection var ansible_pipelining to False 24134 1727096406.31277: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096406.31293: Set connection var ansible_timeout to 10 24134 1727096406.31300: Set connection var ansible_connection to ssh 24134 1727096406.31375: Set connection var ansible_shell_type to sh 24134 1727096406.31381: variable 'ansible_shell_executable' from source: unknown 24134 1727096406.31383: variable 'ansible_connection' from source: unknown 24134 1727096406.31386: variable 'ansible_module_compression' from source: unknown 24134 1727096406.31388: variable 'ansible_shell_type' from source: unknown 24134 1727096406.31390: variable 'ansible_shell_executable' from source: unknown 24134 1727096406.31392: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096406.31394: variable 'ansible_pipelining' from source: unknown 24134 1727096406.31396: variable 'ansible_timeout' from source: unknown 24134 1727096406.31398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096406.31622: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096406.31627: variable 'omit' from source: magic vars 24134 1727096406.31629: starting attempt loop 24134 1727096406.31632: running the handler 24134 1727096406.31636: _low_level_execute_command(): starting 24134 1727096406.31649: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096406.32362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096406.32387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 24134 1727096406.32405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096406.32427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096406.32444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096406.32486: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096406.32559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096406.32583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096406.32607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096406.32728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096406.34487: stdout chunk (state=3): >>>/root <<< 24134 1727096406.34657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096406.34660: stdout chunk (state=3): >>><<< 24134 1727096406.34663: stderr chunk (state=3): >>><<< 24134 1727096406.34791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096406.34795: _low_level_execute_command(): starting 24134 1727096406.34798: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004 `" && echo ansible-tmp-1727096406.3469563-24715-56144195958004="` echo /root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004 `" ) && sleep 0' 24134 1727096406.35399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096406.35672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096406.35709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096406.35817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096406.37911: stdout chunk (state=3): >>>ansible-tmp-1727096406.3469563-24715-56144195958004=/root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004 <<< 24134 1727096406.38017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096406.38093: stderr chunk (state=3): >>><<< 24134 1727096406.38108: stdout chunk (state=3): >>><<< 24134 1727096406.38131: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096406.3469563-24715-56144195958004=/root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096406.38274: variable 'ansible_module_compression' from source: unknown 24134 1727096406.38277: ANSIBALLZ: Using lock for service_facts 24134 1727096406.38280: ANSIBALLZ: Acquiring lock 24134 1727096406.38282: ANSIBALLZ: Lock acquired: 140085159683200 24134 1727096406.38284: ANSIBALLZ: Creating module 24134 1727096406.54475: ANSIBALLZ: Writing module into payload 24134 1727096406.54480: ANSIBALLZ: Writing module 24134 1727096406.54495: ANSIBALLZ: Renaming module 24134 1727096406.54499: ANSIBALLZ: Done creating module 24134 1727096406.54558: variable 'ansible_facts' from source: unknown 24134 1727096406.54689: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/AnsiballZ_service_facts.py 24134 1727096406.55182: Sending initial data 24134 1727096406.55185: Sent initial data (161 bytes) 24134 1727096406.56342: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096406.56350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096406.56444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096406.58165: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096406.58233: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096406.58301: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpchwf6p8x /root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/AnsiballZ_service_facts.py <<< 24134 1727096406.58305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/AnsiballZ_service_facts.py" <<< 24134 1727096406.58373: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpchwf6p8x" to remote "/root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/AnsiballZ_service_facts.py" <<< 24134 1727096406.59780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096406.59828: stderr chunk (state=3): >>><<< 24134 1727096406.59832: stdout chunk (state=3): >>><<< 24134 1727096406.59834: done transferring module to remote 24134 1727096406.59933: _low_level_execute_command(): starting 24134 1727096406.59940: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/ /root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/AnsiballZ_service_facts.py && sleep 0' 24134 1727096406.60808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096406.60812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096406.60815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096406.60817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096406.60820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 <<< 24134 1727096406.60822: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096406.60824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096406.60883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096406.60886: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096406.60889: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24134 1727096406.60892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096406.60942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096406.60958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096406.61002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096406.61328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096406.63119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096406.63123: stderr chunk (state=3): >>><<< 24134 1727096406.63126: stdout chunk (state=3): >>><<< 24134 1727096406.63177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096406.63180: _low_level_execute_command(): starting 24134 1727096406.63184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/AnsiballZ_service_facts.py && sleep 0' 24134 1727096406.63801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096406.63806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096406.63809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096406.63813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096406.63816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096406.63818: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096406.63820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096406.63822: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 24134 1727096406.63824: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096406.63826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24134 1727096406.63828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096406.63830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096406.63832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096406.63834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096406.63836: stderr chunk (state=3): >>>debug2: match found <<< 24134 1727096406.63849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096406.63913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096406.63924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096406.63933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096406.64070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096408.26840: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": 
"lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 24134 1727096408.26857: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": 
"sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 24134 1727096408.26892: stdout chunk (state=3): >>>us": "static", 
"source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": 
"dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", 
"status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": 
"systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": 
"systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24134 1727096408.28495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096408.28498: stdout chunk (state=3): >>><<< 24134 1727096408.28501: stderr chunk (state=3): >>><<< 24134 1727096408.28531: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096408.30974: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096408.30979: _low_level_execute_command(): starting 24134 1727096408.30981: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096406.3469563-24715-56144195958004/ > /dev/null 2>&1 && sleep 0' 24134 1727096408.31594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096408.31615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096408.31685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096408.31745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096408.31763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096408.31792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096408.31898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096408.34121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096408.34125: stdout chunk (state=3): >>><<< 24134 1727096408.34128: stderr chunk (state=3): >>><<< 24134 1727096408.34144: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096408.34681: 
handler run complete 24134 1727096408.34744: variable 'ansible_facts' from source: unknown 24134 1727096408.35143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096408.36164: variable 'ansible_facts' from source: unknown 24134 1727096408.36336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096408.36579: attempt loop complete, returning result 24134 1727096408.36595: _execute() done 24134 1727096408.36604: dumping result to json 24134 1727096408.36672: done dumping result, returning 24134 1727096408.36701: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-1673-d3fc-000000000393] 24134 1727096408.36711: sending task result for task 0afff68d-5257-1673-d3fc-000000000393 24134 1727096408.38260: done sending task result for task 0afff68d-5257-1673-d3fc-000000000393 24134 1727096408.38263: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096408.38363: no more pending results, returning what we have 24134 1727096408.38365: results queue empty 24134 1727096408.38366: checking for any_errors_fatal 24134 1727096408.38371: done checking for any_errors_fatal 24134 1727096408.38372: checking for max_fail_percentage 24134 1727096408.38374: done checking for max_fail_percentage 24134 1727096408.38374: checking to see if all hosts have failed and the running result is not ok 24134 1727096408.38375: done checking to see if all hosts have failed 24134 1727096408.38376: getting the remaining hosts for this loop 24134 1727096408.38377: done getting the remaining hosts for this loop 24134 1727096408.38381: getting the next task for host managed_node1 24134 1727096408.38385: done getting next task for host managed_node1 24134 
1727096408.38388: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24134 1727096408.38392: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096408.38400: getting variables 24134 1727096408.38401: in VariableManager get_vars() 24134 1727096408.38425: Calling all_inventory to load vars for managed_node1 24134 1727096408.38428: Calling groups_inventory to load vars for managed_node1 24134 1727096408.38430: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096408.38444: Calling all_plugins_play to load vars for managed_node1 24134 1727096408.38447: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096408.38449: Calling groups_plugins_play to load vars for managed_node1 24134 1727096408.38813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096408.39336: done with get_vars() 24134 1727096408.39348: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 
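For context, the `service_facts` task just completed above (its result is censored because `no_log: true` was set) and the `package_facts` task about to run correspond to tasks of roughly this shape. This is an illustrative sketch only, assuming standard fact-gathering usage; the actual tasks live in the role's `tasks/set_facts.yml` and may differ:

```yaml
# Illustrative sketch -- not the role's actual file.
# The real tasks are in fedora.linux_system_roles.network,
# tasks/set_facts.yml (see the task path logged above).
- name: Check which services are running
  ansible.builtin.service_facts:
  no_log: true   # matches the "censored" result in the log

- name: Check which packages are installed
  ansible.builtin.package_facts:
```

Both modules take no required arguments (the log shows `module_args: {}`) and populate `ansible_facts.services` and `ansible_facts.packages` respectively.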
Monday 23 September 2024 09:00:08 -0400 (0:00:02.099) 0:00:12.608 ****** 24134 1727096408.39459: entering _queue_task() for managed_node1/package_facts 24134 1727096408.39465: Creating lock for package_facts 24134 1727096408.39875: worker is 1 (out of 1 available) 24134 1727096408.39886: exiting _queue_task() for managed_node1/package_facts 24134 1727096408.39896: done queuing things up, now waiting for results queue to drain 24134 1727096408.39897: waiting for pending results... 24134 1727096408.40304: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 24134 1727096408.40429: in run() - task 0afff68d-5257-1673-d3fc-000000000394 24134 1727096408.40450: variable 'ansible_search_path' from source: unknown 24134 1727096408.40458: variable 'ansible_search_path' from source: unknown 24134 1727096408.40509: calling self._execute() 24134 1727096408.40591: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096408.40606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096408.40709: variable 'omit' from source: magic vars 24134 1727096408.41010: variable 'ansible_distribution_major_version' from source: facts 24134 1727096408.41029: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096408.41051: variable 'omit' from source: magic vars 24134 1727096408.41128: variable 'omit' from source: magic vars 24134 1727096408.41182: variable 'omit' from source: magic vars 24134 1727096408.41256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096408.41283: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096408.41309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096408.41333: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096408.41350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096408.41476: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096408.41484: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096408.41487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096408.41523: Set connection var ansible_shell_executable to /bin/sh 24134 1727096408.41536: Set connection var ansible_pipelining to False 24134 1727096408.41546: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096408.41560: Set connection var ansible_timeout to 10 24134 1727096408.41571: Set connection var ansible_connection to ssh 24134 1727096408.41584: Set connection var ansible_shell_type to sh 24134 1727096408.41614: variable 'ansible_shell_executable' from source: unknown 24134 1727096408.41624: variable 'ansible_connection' from source: unknown 24134 1727096408.41632: variable 'ansible_module_compression' from source: unknown 24134 1727096408.41639: variable 'ansible_shell_type' from source: unknown 24134 1727096408.41645: variable 'ansible_shell_executable' from source: unknown 24134 1727096408.41652: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096408.41661: variable 'ansible_pipelining' from source: unknown 24134 1727096408.41672: variable 'ansible_timeout' from source: unknown 24134 1727096408.41693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096408.41918: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096408.41975: variable 'omit' from source: magic vars 24134 1727096408.41978: starting attempt loop 24134 1727096408.41981: running the handler 24134 1727096408.41983: _low_level_execute_command(): starting 24134 1727096408.41985: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096408.42799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096408.42853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096408.42885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096408.42933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096408.42998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096408.44720: stdout chunk (state=3): >>>/root <<< 24134 1727096408.44880: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 24134 1727096408.44884: stdout chunk (state=3): >>><<< 24134 1727096408.44886: stderr chunk (state=3): >>><<< 24134 1727096408.44905: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096408.44986: _low_level_execute_command(): starting 24134 1727096408.44991: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024 `" && echo ansible-tmp-1727096408.4491167-24828-148146441887024="` echo /root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024 `" ) && sleep 0' 24134 1727096408.45582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096408.45655: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096408.45720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096408.45739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096408.45783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096408.45879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096408.47876: stdout chunk (state=3): >>>ansible-tmp-1727096408.4491167-24828-148146441887024=/root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024 <<< 24134 1727096408.48017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096408.48036: stderr chunk (state=3): >>><<< 24134 1727096408.48056: stdout chunk (state=3): >>><<< 24134 1727096408.48273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096408.4491167-24828-148146441887024=/root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096408.48277: variable 'ansible_module_compression' from source: unknown 24134 1727096408.48279: ANSIBALLZ: Using lock for package_facts 24134 1727096408.48282: ANSIBALLZ: Acquiring lock 24134 1727096408.48283: ANSIBALLZ: Lock acquired: 140085159953472 24134 1727096408.48285: ANSIBALLZ: Creating module 24134 1727096408.78457: ANSIBALLZ: Writing module into payload 24134 1727096408.78614: ANSIBALLZ: Writing module 24134 1727096408.78653: ANSIBALLZ: Renaming module 24134 1727096408.78670: ANSIBALLZ: Done creating module 24134 1727096408.78698: variable 'ansible_facts' from source: unknown 24134 1727096408.78909: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/AnsiballZ_package_facts.py 24134 1727096408.79146: Sending initial data 24134 1727096408.79160: Sent initial data (162 bytes) 24134 1727096408.79684: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096408.79743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096408.79753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096408.79779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096408.79887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096408.81605: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 
debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096408.81668: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096408.81745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmp8d9yvc3j /root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/AnsiballZ_package_facts.py <<< 24134 1727096408.81749: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/AnsiballZ_package_facts.py" <<< 24134 1727096408.81839: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmp8d9yvc3j" to remote "/root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/AnsiballZ_package_facts.py" <<< 24134 1727096408.83547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096408.83558: stderr chunk (state=3): >>><<< 24134 1727096408.83565: stdout chunk (state=3): >>><<< 24134 1727096408.83679: done transferring module to remote 24134 1727096408.83682: _low_level_execute_command(): starting 24134 1727096408.83684: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/ /root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/AnsiballZ_package_facts.py && sleep 0' 24134 1727096408.84238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096408.84253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096408.84283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096408.84322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096408.84326: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096408.84391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096408.84440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096408.84463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096408.84486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096408.84585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096408.86621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096408.86625: stderr chunk (state=3): >>><<< 24134 1727096408.86628: stdout chunk (state=3): >>><<< 24134 1727096408.86630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096408.86632: _low_level_execute_command(): starting 24134 1727096408.86635: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/AnsiballZ_package_facts.py && sleep 0' 24134 1727096408.87238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096408.87266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096408.87371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096409.32489: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": 
"libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", 
"version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": 
[{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap":<<< 24134 1727096409.32530: stdout chunk (state=3): >>> [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": 
"23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": 
"22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24134 1727096409.34274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096409.34287: stdout chunk (state=3): >>><<< 24134 1727096409.34296: stderr chunk (state=3): >>><<< 24134 1727096409.34577: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096409.37911: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096409.37941: _low_level_execute_command(): starting 24134 1727096409.37952: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096408.4491167-24828-148146441887024/ > /dev/null 2>&1 && sleep 0' 24134 1727096409.39294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096409.39487: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096409.39500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096409.39653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096409.41540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096409.41621: stderr chunk (state=3): >>><<< 24134 1727096409.41624: stdout chunk (state=3): >>><<< 24134 1727096409.41636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096409.41645: handler run complete 24134 1727096409.42936: variable 'ansible_facts' from source: unknown 24134 1727096409.43491: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096409.45376: variable 'ansible_facts' from source: unknown 24134 1727096409.49879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096409.50615: attempt loop complete, returning result 24134 1727096409.50637: _execute() done 24134 1727096409.50645: dumping result to json 24134 1727096409.50977: done dumping result, returning 24134 1727096409.50981: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-1673-d3fc-000000000394] 24134 1727096409.50983: sending task result for task 0afff68d-5257-1673-d3fc-000000000394 24134 1727096409.53126: done sending task result for task 0afff68d-5257-1673-d3fc-000000000394 24134 1727096409.53130: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096409.53232: no more pending results, returning what we have 24134 1727096409.53234: results queue empty 24134 1727096409.53235: checking for any_errors_fatal 24134 1727096409.53239: done checking for any_errors_fatal 24134 1727096409.53240: checking for max_fail_percentage 24134 1727096409.53242: done checking for max_fail_percentage 24134 1727096409.53242: checking to see if all hosts have failed and the running result is not ok 24134 1727096409.53243: done checking to see if all hosts have failed 24134 1727096409.53244: getting the remaining hosts for this loop 24134 1727096409.53245: done getting the remaining hosts for this loop 24134 1727096409.53249: getting the next task for host managed_node1 24134 1727096409.53255: done getting next task for host managed_node1 24134 1727096409.53258: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24134 1727096409.53261: 
^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096409.53276: getting variables 24134 1727096409.53278: in VariableManager get_vars() 24134 1727096409.53307: Calling all_inventory to load vars for managed_node1 24134 1727096409.53310: Calling groups_inventory to load vars for managed_node1 24134 1727096409.53312: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096409.53320: Calling all_plugins_play to load vars for managed_node1 24134 1727096409.53323: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096409.53326: Calling groups_plugins_play to load vars for managed_node1 24134 1727096409.54485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096409.56107: done with get_vars() 24134 1727096409.56130: done getting variables 24134 1727096409.56202: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:00:09 -0400 (0:00:01.167) 0:00:13.775 ****** 24134 
1727096409.56237: entering _queue_task() for managed_node1/debug 24134 1727096409.56701: worker is 1 (out of 1 available) 24134 1727096409.56712: exiting _queue_task() for managed_node1/debug 24134 1727096409.56723: done queuing things up, now waiting for results queue to drain 24134 1727096409.56724: waiting for pending results... 24134 1727096409.56961: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 24134 1727096409.57017: in run() - task 0afff68d-5257-1673-d3fc-000000000018 24134 1727096409.57040: variable 'ansible_search_path' from source: unknown 24134 1727096409.57050: variable 'ansible_search_path' from source: unknown 24134 1727096409.57098: calling self._execute() 24134 1727096409.57189: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096409.57278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096409.57282: variable 'omit' from source: magic vars 24134 1727096409.57614: variable 'ansible_distribution_major_version' from source: facts 24134 1727096409.57632: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096409.57644: variable 'omit' from source: magic vars 24134 1727096409.57706: variable 'omit' from source: magic vars 24134 1727096409.57811: variable 'network_provider' from source: set_fact 24134 1727096409.57931: variable 'omit' from source: magic vars 24134 1727096409.57933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096409.57937: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096409.57948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096409.57972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096409.57988: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096409.58016: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096409.58023: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096409.58029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096409.58128: Set connection var ansible_shell_executable to /bin/sh 24134 1727096409.58137: Set connection var ansible_pipelining to False 24134 1727096409.58149: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096409.58162: Set connection var ansible_timeout to 10 24134 1727096409.58173: Set connection var ansible_connection to ssh 24134 1727096409.58179: Set connection var ansible_shell_type to sh 24134 1727096409.58203: variable 'ansible_shell_executable' from source: unknown 24134 1727096409.58210: variable 'ansible_connection' from source: unknown 24134 1727096409.58216: variable 'ansible_module_compression' from source: unknown 24134 1727096409.58221: variable 'ansible_shell_type' from source: unknown 24134 1727096409.58226: variable 'ansible_shell_executable' from source: unknown 24134 1727096409.58231: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096409.58237: variable 'ansible_pipelining' from source: unknown 24134 1727096409.58242: variable 'ansible_timeout' from source: unknown 24134 1727096409.58251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096409.58388: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096409.58476: variable 'omit' from source: magic vars 24134 
1727096409.58479: starting attempt loop 24134 1727096409.58481: running the handler 24134 1727096409.58483: handler run complete 24134 1727096409.58485: attempt loop complete, returning result 24134 1727096409.58487: _execute() done 24134 1727096409.58489: dumping result to json 24134 1727096409.58491: done dumping result, returning 24134 1727096409.58496: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-1673-d3fc-000000000018] 24134 1727096409.58504: sending task result for task 0afff68d-5257-1673-d3fc-000000000018 24134 1727096409.58686: done sending task result for task 0afff68d-5257-1673-d3fc-000000000018 24134 1727096409.58690: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 24134 1727096409.58750: no more pending results, returning what we have 24134 1727096409.58753: results queue empty 24134 1727096409.58754: checking for any_errors_fatal 24134 1727096409.58763: done checking for any_errors_fatal 24134 1727096409.58764: checking for max_fail_percentage 24134 1727096409.58765: done checking for max_fail_percentage 24134 1727096409.58766: checking to see if all hosts have failed and the running result is not ok 24134 1727096409.58771: done checking to see if all hosts have failed 24134 1727096409.58772: getting the remaining hosts for this loop 24134 1727096409.58774: done getting the remaining hosts for this loop 24134 1727096409.58778: getting the next task for host managed_node1 24134 1727096409.58785: done getting next task for host managed_node1 24134 1727096409.58789: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24134 1727096409.58793: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096409.58805: getting variables 24134 1727096409.58806: in VariableManager get_vars() 24134 1727096409.58844: Calling all_inventory to load vars for managed_node1 24134 1727096409.58847: Calling groups_inventory to load vars for managed_node1 24134 1727096409.58850: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096409.58860: Calling all_plugins_play to load vars for managed_node1 24134 1727096409.58863: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096409.58866: Calling groups_plugins_play to load vars for managed_node1 24134 1727096409.60488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096409.62904: done with get_vars() 24134 1727096409.62935: done getting variables 24134 1727096409.63019: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:00:09 -0400 (0:00:00.068) 0:00:13.844 ****** 24134 1727096409.63087: entering _queue_task() for managed_node1/fail 24134 1727096409.63439: worker is 1 (out of 1 available) 
24134 1727096409.63455: exiting _queue_task() for managed_node1/fail 24134 1727096409.63469: done queuing things up, now waiting for results queue to drain 24134 1727096409.63470: waiting for pending results... 24134 1727096409.63751: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24134 1727096409.63904: in run() - task 0afff68d-5257-1673-d3fc-000000000019 24134 1727096409.63927: variable 'ansible_search_path' from source: unknown 24134 1727096409.63937: variable 'ansible_search_path' from source: unknown 24134 1727096409.64073: calling self._execute() 24134 1727096409.64076: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096409.64079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096409.64084: variable 'omit' from source: magic vars 24134 1727096409.64451: variable 'ansible_distribution_major_version' from source: facts 24134 1727096409.64465: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096409.64588: variable 'network_state' from source: role '' defaults 24134 1727096409.64639: Evaluated conditional (network_state != {}): False 24134 1727096409.64642: when evaluation is False, skipping this task 24134 1727096409.64645: _execute() done 24134 1727096409.64647: dumping result to json 24134 1727096409.64649: done dumping result, returning 24134 1727096409.64651: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-1673-d3fc-000000000019] 24134 1727096409.64653: sending task result for task 0afff68d-5257-1673-d3fc-000000000019 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result 
was False" } 24134 1727096409.64796: no more pending results, returning what we have 24134 1727096409.64800: results queue empty 24134 1727096409.64801: checking for any_errors_fatal 24134 1727096409.64807: done checking for any_errors_fatal 24134 1727096409.64808: checking for max_fail_percentage 24134 1727096409.64810: done checking for max_fail_percentage 24134 1727096409.64812: checking to see if all hosts have failed and the running result is not ok 24134 1727096409.64813: done checking to see if all hosts have failed 24134 1727096409.64814: getting the remaining hosts for this loop 24134 1727096409.64815: done getting the remaining hosts for this loop 24134 1727096409.64819: getting the next task for host managed_node1 24134 1727096409.64826: done getting next task for host managed_node1 24134 1727096409.64829: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24134 1727096409.64833: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096409.64980: getting variables 24134 1727096409.64982: in VariableManager get_vars() 24134 1727096409.65019: Calling all_inventory to load vars for managed_node1 24134 1727096409.65023: Calling groups_inventory to load vars for managed_node1 24134 1727096409.65026: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096409.65038: Calling all_plugins_play to load vars for managed_node1 24134 1727096409.65041: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096409.65045: Calling groups_plugins_play to load vars for managed_node1 24134 1727096409.65722: done sending task result for task 0afff68d-5257-1673-d3fc-000000000019 24134 1727096409.65725: WORKER PROCESS EXITING 24134 1727096409.68156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096409.70286: done with get_vars() 24134 1727096409.70424: done getting variables 24134 1727096409.70535: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:00:09 -0400 (0:00:00.074) 0:00:13.919 ****** 24134 1727096409.70575: entering _queue_task() for managed_node1/fail 24134 1727096409.71235: worker is 1 (out of 1 available) 24134 1727096409.71248: exiting _queue_task() for managed_node1/fail 24134 1727096409.71259: done queuing things up, now waiting for results queue to drain 24134 1727096409.71260: waiting for pending results... 
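Editor's note: the two "Abort applying the network state configuration …" tasks above are skipped because their `when:` condition `network_state != {}` evaluates to False against the role default. A plain-Python restatement of that evaluation (an assumption-laden mirror of the Jinja2 conditional, not ansible-core's evaluator; the fact value `"10"` is hypothetical):

```python
# Plain-Python restatement of the conditionals evaluated in the log above.
# Assumption: this mirrors, but does not reproduce, ansible-core's Jinja2 evaluation.
ansible_distribution_major_version = "10"  # hypothetical fact value
network_state = {}  # role default, per "variable 'network_state' from source: role '' defaults"

# First guard passes, so evaluation continues to the task-level condition.
assert ansible_distribution_major_version != "6"

when_result = network_state != {}
print("run task" if when_result else "skipping: Conditional result was False")
```

Because `when_result` is False, the executor records `"false_condition": "network_state != {}"` and never contacts the managed host for these tasks.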
24134 1727096409.71732: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24134 1727096409.71846: in run() - task 0afff68d-5257-1673-d3fc-00000000001a 24134 1727096409.71862: variable 'ansible_search_path' from source: unknown 24134 1727096409.71866: variable 'ansible_search_path' from source: unknown 24134 1727096409.72104: calling self._execute() 24134 1727096409.72187: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096409.72194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096409.72202: variable 'omit' from source: magic vars 24134 1727096409.73062: variable 'ansible_distribution_major_version' from source: facts 24134 1727096409.73079: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096409.73302: variable 'network_state' from source: role '' defaults 24134 1727096409.73312: Evaluated conditional (network_state != {}): False 24134 1727096409.73374: when evaluation is False, skipping this task 24134 1727096409.73379: _execute() done 24134 1727096409.73382: dumping result to json 24134 1727096409.73385: done dumping result, returning 24134 1727096409.73393: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-1673-d3fc-00000000001a] 24134 1727096409.73399: sending task result for task 0afff68d-5257-1673-d3fc-00000000001a 24134 1727096409.73577: done sending task result for task 0afff68d-5257-1673-d3fc-00000000001a 24134 1727096409.73580: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096409.73655: no more pending results, returning what we have 24134 
1727096409.73658: results queue empty 24134 1727096409.73659: checking for any_errors_fatal 24134 1727096409.73670: done checking for any_errors_fatal 24134 1727096409.73671: checking for max_fail_percentage 24134 1727096409.73673: done checking for max_fail_percentage 24134 1727096409.73674: checking to see if all hosts have failed and the running result is not ok 24134 1727096409.73674: done checking to see if all hosts have failed 24134 1727096409.73675: getting the remaining hosts for this loop 24134 1727096409.73676: done getting the remaining hosts for this loop 24134 1727096409.73680: getting the next task for host managed_node1 24134 1727096409.73687: done getting next task for host managed_node1 24134 1727096409.73691: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24134 1727096409.73694: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096409.73709: getting variables 24134 1727096409.73710: in VariableManager get_vars() 24134 1727096409.73807: Calling all_inventory to load vars for managed_node1 24134 1727096409.73811: Calling groups_inventory to load vars for managed_node1 24134 1727096409.73813: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096409.73938: Calling all_plugins_play to load vars for managed_node1 24134 1727096409.73941: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096409.73945: Calling groups_plugins_play to load vars for managed_node1 24134 1727096409.75766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096409.77481: done with get_vars() 24134 1727096409.77507: done getting variables 24134 1727096409.77574: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:00:09 -0400 (0:00:00.070) 0:00:13.989 ****** 24134 1727096409.77608: entering _queue_task() for managed_node1/fail 24134 1727096409.77937: worker is 1 (out of 1 available) 24134 1727096409.77949: exiting _queue_task() for managed_node1/fail 24134 1727096409.78077: done queuing things up, now waiting for results queue to drain 24134 1727096409.78079: waiting for pending results... 
24134 1727096409.78550: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24134 1727096409.78578: in run() - task 0afff68d-5257-1673-d3fc-00000000001b 24134 1727096409.78597: variable 'ansible_search_path' from source: unknown 24134 1727096409.78604: variable 'ansible_search_path' from source: unknown 24134 1727096409.78647: calling self._execute() 24134 1727096409.78754: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096409.78770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096409.78784: variable 'omit' from source: magic vars 24134 1727096409.79146: variable 'ansible_distribution_major_version' from source: facts 24134 1727096409.79160: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096409.79335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096409.81945: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096409.82057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096409.82061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096409.82577: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096409.82581: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096409.82584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096409.82587: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096409.82589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096409.82705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096409.82726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096409.82941: variable 'ansible_distribution_major_version' from source: facts 24134 1727096409.82962: Evaluated conditional (ansible_distribution_major_version | int > 9): True 24134 1727096409.83201: variable 'ansible_distribution' from source: facts 24134 1727096409.83238: variable '__network_rh_distros' from source: role '' defaults 24134 1727096409.83252: Evaluated conditional (ansible_distribution in __network_rh_distros): True 24134 1727096409.83819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096409.83854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096409.83945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 
1727096409.84052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096409.84075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096409.84190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096409.84282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096409.84314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096409.84360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096409.84537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096409.84547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096409.84576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 24134 1727096409.84665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096409.84713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096409.84770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096409.85407: variable 'network_connections' from source: task vars 24134 1727096409.85508: variable 'interface' from source: set_fact 24134 1727096409.85605: variable 'interface' from source: set_fact 24134 1727096409.85624: variable 'interface' from source: set_fact 24134 1727096409.85834: variable 'interface' from source: set_fact 24134 1727096409.85837: variable 'network_state' from source: role '' defaults 24134 1727096409.85972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096409.86314: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096409.86413: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096409.86591: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096409.86594: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096409.86630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096409.86808: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096409.86812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096409.87075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096409.87079: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 24134 1727096409.87082: when evaluation is False, skipping this task 24134 1727096409.87085: _execute() done 24134 1727096409.87087: dumping result to json 24134 1727096409.87089: done dumping result, returning 24134 1727096409.87092: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-1673-d3fc-00000000001b] 24134 1727096409.87094: sending task result for task 0afff68d-5257-1673-d3fc-00000000001b 24134 1727096409.87162: done sending task result for task 0afff68d-5257-1673-d3fc-00000000001b 24134 1727096409.87165: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 24134 
1727096409.87218: no more pending results, returning what we have 24134 1727096409.87222: results queue empty 24134 1727096409.87223: checking for any_errors_fatal 24134 1727096409.87228: done checking for any_errors_fatal 24134 1727096409.87229: checking for max_fail_percentage 24134 1727096409.87231: done checking for max_fail_percentage 24134 1727096409.87232: checking to see if all hosts have failed and the running result is not ok 24134 1727096409.87232: done checking to see if all hosts have failed 24134 1727096409.87233: getting the remaining hosts for this loop 24134 1727096409.87235: done getting the remaining hosts for this loop 24134 1727096409.87239: getting the next task for host managed_node1 24134 1727096409.87245: done getting next task for host managed_node1 24134 1727096409.87249: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24134 1727096409.87252: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096409.87265: getting variables 24134 1727096409.87266: in VariableManager get_vars() 24134 1727096409.87307: Calling all_inventory to load vars for managed_node1 24134 1727096409.87310: Calling groups_inventory to load vars for managed_node1 24134 1727096409.87312: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096409.87321: Calling all_plugins_play to load vars for managed_node1 24134 1727096409.87324: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096409.87326: Calling groups_plugins_play to load vars for managed_node1 24134 1727096409.90708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096409.93912: done with get_vars() 24134 1727096409.94058: done getting variables 24134 1727096409.94275: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:00:09 -0400 (0:00:00.166) 0:00:14.156 ****** 24134 1727096409.94309: entering _queue_task() for managed_node1/dnf 24134 1727096409.95028: worker is 1 (out of 1 available) 24134 1727096409.95039: exiting _queue_task() for managed_node1/dnf 24134 1727096409.95051: done queuing things up, now waiting for results queue to drain 24134 1727096409.95052: waiting for pending results... 
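The teaming-abort task above was skipped because its Jinja2 conditional over `network_connections` and `network_state` evaluated to False. A minimal Python sketch of that check follows; the helper name and sample data are illustrative, not taken from the role:

```python
import re

def team_connections_defined(network_connections, network_state):
    # Loosely mirrors the conditional from the log:
    #   network_connections | selectattr("type", "defined")
    #     | selectattr("type", "match", "^team$") | list | length > 0
    #   or the same pipeline over network_state.get("interfaces", [])
    def has_team(items):
        return any("type" in item and re.match(r"^team$", str(item["type"]))
                   for item in items)
    return has_team(network_connections) or has_team(network_state.get("interfaces", []))

# With no team-typed connection defined, the conditional is False,
# so the abort task is skipped (as in the log above):
sample = [{"name": "veth0", "type": "ethernet", "state": "up"}]
print(team_connections_defined(sample, {}))  # False
```

Note that `selectattr("type", "defined")` filters out entries without a `type` key before the regex test runs, which the `"type" in item` guard reproduces here.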
24134 1727096409.95369: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24134 1727096409.95784: in run() - task 0afff68d-5257-1673-d3fc-00000000001c 24134 1727096409.95789: variable 'ansible_search_path' from source: unknown 24134 1727096409.95791: variable 'ansible_search_path' from source: unknown 24134 1727096409.95795: calling self._execute() 24134 1727096409.95920: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096409.95924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096409.96110: variable 'omit' from source: magic vars 24134 1727096409.96734: variable 'ansible_distribution_major_version' from source: facts 24134 1727096409.96747: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096409.97280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096410.01530: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096410.01539: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096410.01583: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096410.01614: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096410.01646: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096410.01962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.01965: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.01970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.01972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.01975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.01977: variable 'ansible_distribution' from source: facts 24134 1727096410.01980: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.01992: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 24134 1727096410.02112: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096410.02244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.02269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.02300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.02339: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.02352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.02398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.02421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.02444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.02573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.02576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.02579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.02581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 
1727096410.02595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.02637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.02651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.02808: variable 'network_connections' from source: task vars 24134 1727096410.02825: variable 'interface' from source: set_fact 24134 1727096410.03047: variable 'interface' from source: set_fact 24134 1727096410.03050: variable 'interface' from source: set_fact 24134 1727096410.03053: variable 'interface' from source: set_fact 24134 1727096410.03055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096410.03197: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096410.03236: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096410.03282: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096410.03314: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096410.03355: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096410.03381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096410.03406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.03437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096410.03496: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096410.03733: variable 'network_connections' from source: task vars 24134 1727096410.03738: variable 'interface' from source: set_fact 24134 1727096410.03872: variable 'interface' from source: set_fact 24134 1727096410.03876: variable 'interface' from source: set_fact 24134 1727096410.03878: variable 'interface' from source: set_fact 24134 1727096410.03906: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24134 1727096410.03909: when evaluation is False, skipping this task 24134 1727096410.03920: _execute() done 24134 1727096410.03923: dumping result to json 24134 1727096410.03925: done dumping result, returning 24134 1727096410.03928: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-00000000001c] 24134 1727096410.03930: sending task result for task 0afff68d-5257-1673-d3fc-00000000001c 24134 1727096410.04205: done sending task result for task 0afff68d-5257-1673-d3fc-00000000001c 24134 1727096410.04208: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 24134 1727096410.04283: no more pending results, returning what we have 24134 1727096410.04286: results queue empty 24134 1727096410.04287: checking for any_errors_fatal 24134 1727096410.04291: done checking for any_errors_fatal 24134 1727096410.04292: checking for max_fail_percentage 24134 1727096410.04294: done checking for max_fail_percentage 24134 1727096410.04294: checking to see if all hosts have failed and the running result is not ok 24134 1727096410.04295: done checking to see if all hosts have failed 24134 1727096410.04296: getting the remaining hosts for this loop 24134 1727096410.04297: done getting the remaining hosts for this loop 24134 1727096410.04300: getting the next task for host managed_node1 24134 1727096410.04305: done getting next task for host managed_node1 24134 1727096410.04309: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24134 1727096410.04312: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096410.04326: getting variables 24134 1727096410.04327: in VariableManager get_vars() 24134 1727096410.04362: Calling all_inventory to load vars for managed_node1 24134 1727096410.04365: Calling groups_inventory to load vars for managed_node1 24134 1727096410.04370: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096410.04379: Calling all_plugins_play to load vars for managed_node1 24134 1727096410.04382: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096410.04385: Calling groups_plugins_play to load vars for managed_node1 24134 1727096410.06951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096410.10580: done with get_vars() 24134 1727096410.10602: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24134 1727096410.10790: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:00:10 -0400 (0:00:00.165) 0:00:14.321 ****** 24134 1727096410.10824: entering _queue_task() for managed_node1/yum 24134 1727096410.10826: Creating lock for yum 24134 1727096410.11504: worker is 1 (out of 1 available) 24134 1727096410.11518: exiting _queue_task() for managed_node1/yum 24134 1727096410.11645: done queuing things up, now waiting for results queue to drain 24134 1727096410.11647: waiting for pending results... 
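The DNF package-check task above skipped on `__network_wireless_connections_defined or __network_team_connections_defined`, two role defaults that each apply the same selectattr pipeline to `network_connections` with a different type regex. A hedged sketch of how both resolve to False in this run; the connection data and the wireless regex are illustrative assumptions:

```python
import re

def typed_connection_present(connections, type_regex):
    # Generic form of the role-default expressions: keep entries whose
    # "type" is defined and matches the regex, then test for a non-empty result.
    return any("type" in c and re.match(type_regex, str(c["type"])) for c in connections)

connections = [{"name": "veth0", "type": "ethernet"}]  # illustrative task vars
wireless_defined = typed_connection_present(connections, r"^wireless$")
team_defined = typed_connection_present(connections, r"^team$")
print(wireless_defined or team_defined)  # False -> package-check task skips
```

Because an ethernet-only connection list satisfies neither regex, the `when` clause is False and Ansible records the skip with `"skip_reason": "Conditional result was False"`, exactly as shown in the log.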
24134 1727096410.11976: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24134 1727096410.12924: in run() - task 0afff68d-5257-1673-d3fc-00000000001d 24134 1727096410.12962: variable 'ansible_search_path' from source: unknown 24134 1727096410.12966: variable 'ansible_search_path' from source: unknown 24134 1727096410.12977: calling self._execute() 24134 1727096410.13071: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096410.13075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096410.13078: variable 'omit' from source: magic vars 24134 1727096410.13711: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.13714: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096410.14014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096410.18744: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096410.18933: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096410.18968: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096410.19230: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096410.19234: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096410.19382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.19412: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.19437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.19594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.19607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.19973: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.19977: Evaluated conditional (ansible_distribution_major_version | int < 8): False 24134 1727096410.19979: when evaluation is False, skipping this task 24134 1727096410.19982: _execute() done 24134 1727096410.19984: dumping result to json 24134 1727096410.19986: done dumping result, returning 24134 1727096410.19989: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-00000000001d] 24134 1727096410.19992: sending task result for task 0afff68d-5257-1673-d3fc-00000000001d 24134 1727096410.20056: done sending task result for task 0afff68d-5257-1673-d3fc-00000000001d 24134 1727096410.20059: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 24134 1727096410.20114: no more pending results, returning 
what we have 24134 1727096410.20117: results queue empty 24134 1727096410.20118: checking for any_errors_fatal 24134 1727096410.20123: done checking for any_errors_fatal 24134 1727096410.20124: checking for max_fail_percentage 24134 1727096410.20126: done checking for max_fail_percentage 24134 1727096410.20127: checking to see if all hosts have failed and the running result is not ok 24134 1727096410.20127: done checking to see if all hosts have failed 24134 1727096410.20128: getting the remaining hosts for this loop 24134 1727096410.20130: done getting the remaining hosts for this loop 24134 1727096410.20133: getting the next task for host managed_node1 24134 1727096410.20139: done getting next task for host managed_node1 24134 1727096410.20143: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24134 1727096410.20148: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096410.20161: getting variables 24134 1727096410.20163: in VariableManager get_vars() 24134 1727096410.20208: Calling all_inventory to load vars for managed_node1 24134 1727096410.20211: Calling groups_inventory to load vars for managed_node1 24134 1727096410.20214: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096410.20225: Calling all_plugins_play to load vars for managed_node1 24134 1727096410.20228: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096410.20231: Calling groups_plugins_play to load vars for managed_node1 24134 1727096410.22887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096410.26295: done with get_vars() 24134 1727096410.26439: done getting variables 24134 1727096410.26502: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:00:10 -0400 (0:00:00.158) 0:00:14.479 ****** 24134 1727096410.26650: entering _queue_task() for managed_node1/fail 24134 1727096410.27295: worker is 1 (out of 1 available) 24134 1727096410.27306: exiting _queue_task() for managed_node1/fail 24134 1727096410.27318: done queuing things up, now waiting for results queue to drain 24134 1727096410.27319: waiting for pending results... 
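The YUM variant of the package check is additionally version-gated: `ansible.builtin.yum` is redirected to `ansible.builtin.dnf` (as the log notes), and the task's own `when` clause only passes on EL7 and older. A sketch of that gate; the major version shown is a stand-in, not read from this run:

```python
def yum_task_applies(ansible_distribution_major_version):
    # Mirrors the task's when clause from the log:
    #   ansible_distribution_major_version | int < 8
    return int(ansible_distribution_major_version) < 8

# Any EL8-or-later major version makes the conditional False, so the task skips:
print(yum_task_applies("9"))  # False
```

This is why the run shows two package-check tasks back to back: the DNF one skips on the wireless/team defaults, while the YUM one skips on the distribution version before those defaults are even consulted.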
24134 1727096410.27998: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24134 1727096410.28044: in run() - task 0afff68d-5257-1673-d3fc-00000000001e 24134 1727096410.28279: variable 'ansible_search_path' from source: unknown 24134 1727096410.28282: variable 'ansible_search_path' from source: unknown 24134 1727096410.28285: calling self._execute() 24134 1727096410.28497: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096410.28501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096410.28504: variable 'omit' from source: magic vars 24134 1727096410.29261: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.29282: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096410.29515: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096410.29953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096410.33546: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096410.33636: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096410.33688: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096410.33732: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096410.33771: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096410.33915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24134 1727096410.33919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.33938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.33990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.34011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.34070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.34105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.34140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.34242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.34247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.34258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.34289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.34324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.34374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.34394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.34591: variable 'network_connections' from source: task vars 24134 1727096410.34609: variable 'interface' from source: set_fact 24134 1727096410.34838: variable 'interface' from source: set_fact 24134 1727096410.34853: variable 'interface' from source: set_fact 24134 1727096410.34927: variable 'interface' from source: set_fact 24134 1727096410.35066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096410.35191: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096410.35236: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096410.35288: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096410.35324: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096410.35372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096410.35411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096410.35448: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.35481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096410.35551: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096410.35827: variable 'network_connections' from source: task vars 24134 1727096410.35830: variable 'interface' from source: set_fact 24134 1727096410.35891: variable 'interface' from source: set_fact 24134 1727096410.35936: variable 'interface' from source: set_fact 24134 1727096410.35983: variable 'interface' from source: set_fact 24134 1727096410.36019: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24134 1727096410.36028: when evaluation is False, skipping this task 24134 1727096410.36035: _execute() done 24134 1727096410.36046: dumping result to json 24134 1727096410.36053: done dumping result, returning 24134 1727096410.36090: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-00000000001e] 24134 1727096410.36100: sending task result for task 0afff68d-5257-1673-d3fc-00000000001e skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24134 1727096410.36255: no more pending results, returning what we have 24134 1727096410.36259: results queue empty 24134 1727096410.36260: checking for any_errors_fatal 24134 1727096410.36266: done checking for any_errors_fatal 24134 1727096410.36269: checking for max_fail_percentage 24134 1727096410.36271: done checking for max_fail_percentage 24134 1727096410.36272: checking to see if all hosts have failed and the running result is not ok 24134 1727096410.36273: done checking to see if all hosts have failed 24134 1727096410.36273: getting the remaining hosts for this loop 24134 1727096410.36275: done getting the remaining hosts for this loop 24134 1727096410.36279: getting the next task for host managed_node1 24134 1727096410.36285: done getting next task for host managed_node1 24134 1727096410.36289: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 24134 1727096410.36292: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096410.36305: getting variables 24134 1727096410.36307: in VariableManager get_vars() 24134 1727096410.36350: Calling all_inventory to load vars for managed_node1 24134 1727096410.36353: Calling groups_inventory to load vars for managed_node1 24134 1727096410.36355: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096410.36366: Calling all_plugins_play to load vars for managed_node1 24134 1727096410.36588: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096410.36593: Calling groups_plugins_play to load vars for managed_node1 24134 1727096410.37203: done sending task result for task 0afff68d-5257-1673-d3fc-00000000001e 24134 1727096410.37207: WORKER PROCESS EXITING 24134 1727096410.39115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096410.45827: done with get_vars() 24134 1727096410.45854: done getting variables 24134 1727096410.45906: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:00:10 -0400 (0:00:00.192) 0:00:14.672 ****** 24134 1727096410.45938: entering _queue_task() for managed_node1/package 24134 1727096410.46848: worker is 1 (out of 1 available) 24134 1727096410.46860: exiting _queue_task() for managed_node1/package 24134 1727096410.46874: done queuing things up, now waiting for results queue to drain 24134 1727096410.46875: waiting for pending results... 
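
[editor's note] The skip recorded just above shows the executor evaluating a compound `when:` condition and emitting `skip_reason: "Conditional result was False"`. As a rough sketch only: the task name and the gating expression below are copied from the log's task title and `false_condition` field, but the module and its arguments are assumptions, not the role's verbatim source.

```yaml
# Hypothetical reconstruction of the skipped task; the real body in
# fedora.linux_system_roles.network may differ. Only the name and the
# when: expression are taken from the log.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.pause:
    prompt: "NetworkManager must be restarted to apply wireless/team settings. Continue?"
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because both role defaults evaluated to False on managed_node1, the executor skips the task without running the action and returns the `skipping:` result shown above.
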
24134 1727096410.47262: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 24134 1727096410.47827: in run() - task 0afff68d-5257-1673-d3fc-00000000001f 24134 1727096410.47831: variable 'ansible_search_path' from source: unknown 24134 1727096410.47834: variable 'ansible_search_path' from source: unknown 24134 1727096410.47837: calling self._execute() 24134 1727096410.47943: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096410.47956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096410.47972: variable 'omit' from source: magic vars 24134 1727096410.48674: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.48703: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096410.49070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096410.49635: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096410.49775: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096410.49976: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096410.50005: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096410.50245: variable 'network_packages' from source: role '' defaults 24134 1727096410.50462: variable '__network_provider_setup' from source: role '' defaults 24134 1727096410.50530: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096410.50652: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096410.50656: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096410.50835: variable 
'__network_packages_default_nm' from source: role '' defaults 24134 1727096410.51037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096410.53197: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096410.53282: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096410.53350: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096410.53394: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096410.53426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096410.53520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.53565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.53600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.53648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.53679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 
1727096410.53729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.53772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.53800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.53885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.53892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.54127: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24134 1727096410.54254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.54287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.54371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.54379: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.54391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.54493: variable 'ansible_python' from source: facts 24134 1727096410.54523: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24134 1727096410.54619: variable '__network_wpa_supplicant_required' from source: role '' defaults 24134 1727096410.54718: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24134 1727096410.54852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.54890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.54972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.54976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.54993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.55047: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.55091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.55121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.55172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.55251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.55348: variable 'network_connections' from source: task vars 24134 1727096410.55363: variable 'interface' from source: set_fact 24134 1727096410.55469: variable 'interface' from source: set_fact 24134 1727096410.55484: variable 'interface' from source: set_fact 24134 1727096410.55585: variable 'interface' from source: set_fact 24134 1727096410.55656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096410.55693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096410.55728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.55792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096410.55809: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096410.56095: variable 'network_connections' from source: task vars 24134 1727096410.56107: variable 'interface' from source: set_fact 24134 1727096410.56214: variable 'interface' from source: set_fact 24134 1727096410.56234: variable 'interface' from source: set_fact 24134 1727096410.56443: variable 'interface' from source: set_fact 24134 1727096410.56446: variable '__network_packages_default_wireless' from source: role '' defaults 24134 1727096410.56485: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096410.56820: variable 'network_connections' from source: task vars 24134 1727096410.56830: variable 'interface' from source: set_fact 24134 1727096410.56902: variable 'interface' from source: set_fact 24134 1727096410.56915: variable 'interface' from source: set_fact 24134 1727096410.56983: variable 'interface' from source: set_fact 24134 1727096410.57020: variable '__network_packages_default_team' from source: role '' defaults 24134 1727096410.57110: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096410.57401: variable 'network_connections' from source: task vars 24134 1727096410.57410: variable 'interface' from source: set_fact 24134 1727096410.57479: variable 'interface' from source: set_fact 24134 1727096410.57538: variable 'interface' from source: set_fact 24134 1727096410.57550: variable 'interface' from source: set_fact 24134 1727096410.57614: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 
1727096410.57686: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096410.57698: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096410.57765: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096410.58000: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24134 1727096410.58478: variable 'network_connections' from source: task vars 24134 1727096410.58489: variable 'interface' from source: set_fact 24134 1727096410.58773: variable 'interface' from source: set_fact 24134 1727096410.58776: variable 'interface' from source: set_fact 24134 1727096410.58778: variable 'interface' from source: set_fact 24134 1727096410.58780: variable 'ansible_distribution' from source: facts 24134 1727096410.58782: variable '__network_rh_distros' from source: role '' defaults 24134 1727096410.58784: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.58786: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24134 1727096410.58857: variable 'ansible_distribution' from source: facts 24134 1727096410.58869: variable '__network_rh_distros' from source: role '' defaults 24134 1727096410.58882: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.58907: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24134 1727096410.59082: variable 'ansible_distribution' from source: facts 24134 1727096410.59092: variable '__network_rh_distros' from source: role '' defaults 24134 1727096410.59102: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.59149: variable 'network_provider' from source: set_fact 24134 1727096410.59173: variable 'ansible_facts' from source: unknown 24134 1727096410.59928: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 24134 1727096410.59936: when evaluation is False, skipping this task 24134 1727096410.59943: _execute() done 24134 1727096410.59951: dumping result to json 24134 1727096410.59959: done dumping result, returning 24134 1727096410.59975: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-1673-d3fc-00000000001f] 24134 1727096410.59996: sending task result for task 0afff68d-5257-1673-d3fc-00000000001f skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24134 1727096410.60260: no more pending results, returning what we have 24134 1727096410.60264: results queue empty 24134 1727096410.60265: checking for any_errors_fatal 24134 1727096410.60277: done checking for any_errors_fatal 24134 1727096410.60278: checking for max_fail_percentage 24134 1727096410.60281: done checking for max_fail_percentage 24134 1727096410.60281: checking to see if all hosts have failed and the running result is not ok 24134 1727096410.60282: done checking to see if all hosts have failed 24134 1727096410.60283: getting the remaining hosts for this loop 24134 1727096410.60285: done getting the remaining hosts for this loop 24134 1727096410.60288: getting the next task for host managed_node1 24134 1727096410.60295: done getting next task for host managed_node1 24134 1727096410.60299: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24134 1727096410.60302: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096410.60324: getting variables 24134 1727096410.60326: in VariableManager get_vars() 24134 1727096410.60366: Calling all_inventory to load vars for managed_node1 24134 1727096410.60433: done sending task result for task 0afff68d-5257-1673-d3fc-00000000001f 24134 1727096410.60445: WORKER PROCESS EXITING 24134 1727096410.60441: Calling groups_inventory to load vars for managed_node1 24134 1727096410.60450: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096410.60461: Calling all_plugins_play to load vars for managed_node1 24134 1727096410.60464: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096410.60470: Calling groups_plugins_play to load vars for managed_node1 24134 1727096410.61960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096410.63528: done with get_vars() 24134 1727096410.63554: done getting variables 24134 1727096410.63619: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:00:10 -0400 (0:00:00.177) 0:00:14.849 ****** 24134 1727096410.63652: 
entering _queue_task() for managed_node1/package 24134 1727096410.63988: worker is 1 (out of 1 available) 24134 1727096410.64001: exiting _queue_task() for managed_node1/package 24134 1727096410.64016: done queuing things up, now waiting for results queue to drain 24134 1727096410.64017: waiting for pending results... 24134 1727096410.64323: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24134 1727096410.64482: in run() - task 0afff68d-5257-1673-d3fc-000000000020 24134 1727096410.64503: variable 'ansible_search_path' from source: unknown 24134 1727096410.64584: variable 'ansible_search_path' from source: unknown 24134 1727096410.64588: calling self._execute() 24134 1727096410.64652: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096410.64666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096410.64689: variable 'omit' from source: magic vars 24134 1727096410.65096: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.65114: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096410.65249: variable 'network_state' from source: role '' defaults 24134 1727096410.65263: Evaluated conditional (network_state != {}): False 24134 1727096410.65273: when evaluation is False, skipping this task 24134 1727096410.65282: _execute() done 24134 1727096410.65289: dumping result to json 24134 1727096410.65297: done dumping result, returning 24134 1727096410.65348: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-1673-d3fc-000000000020] 24134 1727096410.65352: sending task result for task 0afff68d-5257-1673-d3fc-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 24134 1727096410.65501: no more pending results, returning what we have 24134 1727096410.65505: results queue empty 24134 1727096410.65507: checking for any_errors_fatal 24134 1727096410.65514: done checking for any_errors_fatal 24134 1727096410.65515: checking for max_fail_percentage 24134 1727096410.65517: done checking for max_fail_percentage 24134 1727096410.65519: checking to see if all hosts have failed and the running result is not ok 24134 1727096410.65520: done checking to see if all hosts have failed 24134 1727096410.65521: getting the remaining hosts for this loop 24134 1727096410.65522: done getting the remaining hosts for this loop 24134 1727096410.65526: getting the next task for host managed_node1 24134 1727096410.65533: done getting next task for host managed_node1 24134 1727096410.65537: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24134 1727096410.65540: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096410.65557: getting variables 24134 1727096410.65559: in VariableManager get_vars() 24134 1727096410.65601: Calling all_inventory to load vars for managed_node1 24134 1727096410.65604: Calling groups_inventory to load vars for managed_node1 24134 1727096410.65607: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096410.65619: Calling all_plugins_play to load vars for managed_node1 24134 1727096410.65622: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096410.65625: Calling groups_plugins_play to load vars for managed_node1 24134 1727096410.66284: done sending task result for task 0afff68d-5257-1673-d3fc-000000000020 24134 1727096410.66287: WORKER PROCESS EXITING 24134 1727096410.67376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096410.68890: done with get_vars() 24134 1727096410.68920: done getting variables 24134 1727096410.68985: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:00:10 -0400 (0:00:00.053) 0:00:14.903 ****** 24134 1727096410.69025: entering _queue_task() for managed_node1/package 24134 1727096410.69486: worker is 1 (out of 1 available) 24134 1727096410.69499: exiting _queue_task() for managed_node1/package 24134 1727096410.69511: done queuing things up, now waiting for results queue to drain 24134 1727096410.69513: waiting for pending results... 
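
[editor's note] The two `network_state`-gated install tasks in this run skip for the same reason: `network_state` comes from the role's defaults and is empty, so `network_state != {}` is False. A minimal sketch of how such a task is gated — the task name and condition are from the log, while the package list is an assumption about what the role installs:

```yaml
# Illustrative only: the when: expression matches the log's
# false_condition; the name: list is assumed, not verbatim role source.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```
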
24134 1727096410.69729: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24134 1727096410.69887: in run() - task 0afff68d-5257-1673-d3fc-000000000021 24134 1727096410.69914: variable 'ansible_search_path' from source: unknown 24134 1727096410.69924: variable 'ansible_search_path' from source: unknown 24134 1727096410.69974: calling self._execute() 24134 1727096410.70079: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096410.70092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096410.70106: variable 'omit' from source: magic vars 24134 1727096410.70524: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.70541: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096410.70679: variable 'network_state' from source: role '' defaults 24134 1727096410.70696: Evaluated conditional (network_state != {}): False 24134 1727096410.70704: when evaluation is False, skipping this task 24134 1727096410.70769: _execute() done 24134 1727096410.70772: dumping result to json 24134 1727096410.70775: done dumping result, returning 24134 1727096410.70778: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-1673-d3fc-000000000021] 24134 1727096410.70780: sending task result for task 0afff68d-5257-1673-d3fc-000000000021 24134 1727096410.70984: done sending task result for task 0afff68d-5257-1673-d3fc-000000000021 24134 1727096410.70988: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096410.71039: no more pending results, returning what we have 24134 1727096410.71044: results queue empty 24134 1727096410.71046: checking for 
any_errors_fatal 24134 1727096410.71054: done checking for any_errors_fatal 24134 1727096410.71055: checking for max_fail_percentage 24134 1727096410.71057: done checking for max_fail_percentage 24134 1727096410.71058: checking to see if all hosts have failed and the running result is not ok 24134 1727096410.71059: done checking to see if all hosts have failed 24134 1727096410.71060: getting the remaining hosts for this loop 24134 1727096410.71061: done getting the remaining hosts for this loop 24134 1727096410.71065: getting the next task for host managed_node1 24134 1727096410.71076: done getting next task for host managed_node1 24134 1727096410.71081: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24134 1727096410.71084: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096410.71191: getting variables 24134 1727096410.71198: in VariableManager get_vars() 24134 1727096410.71240: Calling all_inventory to load vars for managed_node1 24134 1727096410.71243: Calling groups_inventory to load vars for managed_node1 24134 1727096410.71245: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096410.71256: Calling all_plugins_play to load vars for managed_node1 24134 1727096410.71259: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096410.71263: Calling groups_plugins_play to load vars for managed_node1 24134 1727096410.72697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096410.74304: done with get_vars() 24134 1727096410.74330: done getting variables 24134 1727096410.74428: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:00:10 -0400 (0:00:00.054) 0:00:14.958 ****** 24134 1727096410.74466: entering _queue_task() for managed_node1/service 24134 1727096410.74469: Creating lock for service 24134 1727096410.74999: worker is 1 (out of 1 available) 24134 1727096410.75010: exiting _queue_task() for managed_node1/service 24134 1727096410.75021: done queuing things up, now waiting for results queue to drain 24134 1727096410.75022: waiting for pending results... 
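
[editor's note] The task queued next uses the `service` action; the log's `Creating lock for service` line appears because this is the first task in the run to load that action plugin. A hedged sketch of the restart task follows — the title and the wireless/team gating condition are taken from the log, but the service arguments are an assumption:

```yaml
# Sketch based on the task title and the conditional seen earlier in
# this log; not the verbatim role source.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```
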
24134 1727096410.75128: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24134 1727096410.75286: in run() - task 0afff68d-5257-1673-d3fc-000000000022 24134 1727096410.75360: variable 'ansible_search_path' from source: unknown 24134 1727096410.75364: variable 'ansible_search_path' from source: unknown 24134 1727096410.75366: calling self._execute() 24134 1727096410.75473: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096410.75487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096410.75502: variable 'omit' from source: magic vars 24134 1727096410.75913: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.75932: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096410.76064: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096410.76337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096410.78912: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096410.78991: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096410.79045: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096410.79093: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096410.79126: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096410.79218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 24134 1727096410.79253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.79287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.79338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.79358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.79423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.79474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.79478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.79514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.79536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.79579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.79641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.79644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.79689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.79708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.79885: variable 'network_connections' from source: task vars 24134 1727096410.79902: variable 'interface' from source: set_fact 24134 1727096410.80075: variable 'interface' from source: set_fact 24134 1727096410.80078: variable 'interface' from source: set_fact 24134 1727096410.80080: variable 'interface' from source: set_fact 24134 1727096410.80134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096410.80317: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096410.80358: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096410.80398: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096410.80449: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096410.80500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096410.80534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096410.80632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.80634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096410.80653: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096410.80926: variable 'network_connections' from source: task vars 24134 1727096410.80936: variable 'interface' from source: set_fact 24134 1727096410.81016: variable 'interface' from source: set_fact 24134 1727096410.81028: variable 'interface' from source: set_fact 24134 1727096410.81102: variable 'interface' from source: set_fact 24134 1727096410.81138: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24134 1727096410.81147: when evaluation is False, skipping this task 24134 1727096410.81176: _execute() done 24134 1727096410.81184: dumping result to json 24134 1727096410.81187: done dumping result, returning 24134 1727096410.81189: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-000000000022] 24134 1727096410.81274: sending task result for task 0afff68d-5257-1673-d3fc-000000000022 24134 1727096410.81579: done sending task result for task 0afff68d-5257-1673-d3fc-000000000022 24134 1727096410.81582: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24134 1727096410.81627: no more pending results, returning what we have 24134 1727096410.81631: results queue empty 24134 1727096410.81632: checking for any_errors_fatal 24134 1727096410.81638: done checking for any_errors_fatal 24134 1727096410.81639: checking for max_fail_percentage 24134 1727096410.81641: done checking for max_fail_percentage 24134 1727096410.81642: checking to see if all hosts have failed and the running result is not ok 24134 1727096410.81642: done checking to see if all hosts have failed 24134 1727096410.81643: getting the remaining hosts for this loop 24134 1727096410.81645: done getting the remaining hosts for this loop 24134 1727096410.81649: getting the next task for host managed_node1 24134 1727096410.81656: done getting next task for host managed_node1 24134 1727096410.81660: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24134 1727096410.81663: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 24134 1727096410.81682: getting variables 24134 1727096410.81684: in VariableManager get_vars() 24134 1727096410.81730: Calling all_inventory to load vars for managed_node1 24134 1727096410.81733: Calling groups_inventory to load vars for managed_node1 24134 1727096410.81735: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096410.81744: Calling all_plugins_play to load vars for managed_node1 24134 1727096410.81746: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096410.81749: Calling groups_plugins_play to load vars for managed_node1 24134 1727096410.83411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096410.85015: done with get_vars() 24134 1727096410.85042: done getting variables 24134 1727096410.85112: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:00:10 -0400 (0:00:00.106) 0:00:15.064 ****** 24134 1727096410.85145: entering _queue_task() for managed_node1/service 24134 1727096410.85489: worker is 1 (out of 1 available) 24134 1727096410.85616: exiting _queue_task() for managed_node1/service 24134 1727096410.85627: done queuing things up, now waiting for results queue to drain 24134 1727096410.85629: waiting for pending results... 
24134 1727096410.85813: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24134 1727096410.85978: in run() - task 0afff68d-5257-1673-d3fc-000000000023 24134 1727096410.85999: variable 'ansible_search_path' from source: unknown 24134 1727096410.86008: variable 'ansible_search_path' from source: unknown 24134 1727096410.86050: calling self._execute() 24134 1727096410.86146: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096410.86164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096410.86185: variable 'omit' from source: magic vars 24134 1727096410.86581: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.86601: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096410.86787: variable 'network_provider' from source: set_fact 24134 1727096410.86798: variable 'network_state' from source: role '' defaults 24134 1727096410.86819: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24134 1727096410.86920: variable 'omit' from source: magic vars 24134 1727096410.86923: variable 'omit' from source: magic vars 24134 1727096410.86940: variable 'network_service_name' from source: role '' defaults 24134 1727096410.87024: variable 'network_service_name' from source: role '' defaults 24134 1727096410.87145: variable '__network_provider_setup' from source: role '' defaults 24134 1727096410.87162: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096410.87231: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096410.87251: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096410.87361: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096410.87777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 24134 1727096410.90977: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096410.91022: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096410.91261: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096410.91264: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096410.91267: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096410.91440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.91550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.91590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.91634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.91709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.91909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24134 1727096410.91912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.91942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.92031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.92110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.92650: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24134 1727096410.92903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.93000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.93029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.93122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.93211: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.93428: variable 'ansible_python' from source: facts 24134 1727096410.93432: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24134 1727096410.93778: variable '__network_wpa_supplicant_required' from source: role '' defaults 24134 1727096410.93810: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24134 1727096410.94061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.94146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.94201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.94289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.94335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.94390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096410.94436: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096410.94464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.94511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096410.94538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096410.94697: variable 'network_connections' from source: task vars 24134 1727096410.94710: variable 'interface' from source: set_fact 24134 1727096410.94796: variable 'interface' from source: set_fact 24134 1727096410.94810: variable 'interface' from source: set_fact 24134 1727096410.94893: variable 'interface' from source: set_fact 24134 1727096410.95007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096410.95214: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096410.95263: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096410.95315: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096410.95356: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096410.95433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096410.95535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096410.95573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096410.95674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096410.95711: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096410.95989: variable 'network_connections' from source: task vars 24134 1727096410.96072: variable 'interface' from source: set_fact 24134 1727096410.96086: variable 'interface' from source: set_fact 24134 1727096410.96101: variable 'interface' from source: set_fact 24134 1727096410.96182: variable 'interface' from source: set_fact 24134 1727096410.96233: variable '__network_packages_default_wireless' from source: role '' defaults 24134 1727096410.96326: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096410.96642: variable 'network_connections' from source: task vars 24134 1727096410.96654: variable 'interface' from source: set_fact 24134 1727096410.96738: variable 'interface' from source: set_fact 24134 1727096410.96750: variable 'interface' from source: set_fact 24134 1727096410.96830: variable 'interface' from source: set_fact 24134 1727096410.96878: variable '__network_packages_default_team' from source: role '' defaults 24134 1727096410.96952: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096410.97243: variable 
'network_connections' from source: task vars 24134 1727096410.97252: variable 'interface' from source: set_fact 24134 1727096410.97334: variable 'interface' from source: set_fact 24134 1727096410.97347: variable 'interface' from source: set_fact 24134 1727096410.97430: variable 'interface' from source: set_fact 24134 1727096410.97506: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096410.97595: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096410.97598: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096410.97657: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096410.98179: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24134 1727096410.99077: variable 'network_connections' from source: task vars 24134 1727096410.99201: variable 'interface' from source: set_fact 24134 1727096410.99341: variable 'interface' from source: set_fact 24134 1727096410.99353: variable 'interface' from source: set_fact 24134 1727096410.99467: variable 'interface' from source: set_fact 24134 1727096410.99528: variable 'ansible_distribution' from source: facts 24134 1727096410.99629: variable '__network_rh_distros' from source: role '' defaults 24134 1727096410.99731: variable 'ansible_distribution_major_version' from source: facts 24134 1727096410.99735: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24134 1727096411.00003: variable 'ansible_distribution' from source: facts 24134 1727096411.00013: variable '__network_rh_distros' from source: role '' defaults 24134 1727096411.00076: variable 'ansible_distribution_major_version' from source: facts 24134 1727096411.00095: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24134 1727096411.00481: variable 'ansible_distribution' from source: 
facts 24134 1727096411.00491: variable '__network_rh_distros' from source: role '' defaults 24134 1727096411.00505: variable 'ansible_distribution_major_version' from source: facts 24134 1727096411.00540: variable 'network_provider' from source: set_fact 24134 1727096411.00723: variable 'omit' from source: magic vars 24134 1727096411.00725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096411.00728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096411.00730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096411.00830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096411.00844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096411.00876: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096411.00939: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096411.00946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096411.01174: Set connection var ansible_shell_executable to /bin/sh 24134 1727096411.01178: Set connection var ansible_pipelining to False 24134 1727096411.01180: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096411.01197: Set connection var ansible_timeout to 10 24134 1727096411.01204: Set connection var ansible_connection to ssh 24134 1727096411.01210: Set connection var ansible_shell_type to sh 24134 1727096411.01238: variable 'ansible_shell_executable' from source: unknown 24134 1727096411.01266: variable 'ansible_connection' from source: unknown 24134 1727096411.01478: variable 'ansible_module_compression' from source: unknown 24134 1727096411.01482: 
variable 'ansible_shell_type' from source: unknown 24134 1727096411.01484: variable 'ansible_shell_executable' from source: unknown 24134 1727096411.01487: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096411.01493: variable 'ansible_pipelining' from source: unknown 24134 1727096411.01495: variable 'ansible_timeout' from source: unknown 24134 1727096411.01498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096411.01603: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096411.01802: variable 'omit' from source: magic vars 24134 1727096411.01806: starting attempt loop 24134 1727096411.01808: running the handler 24134 1727096411.01810: variable 'ansible_facts' from source: unknown 24134 1727096411.03679: _low_level_execute_command(): starting 24134 1727096411.03906: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096411.05079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096411.05330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096411.05435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096411.07129: stdout chunk (state=3): >>>/root <<< 24134 1727096411.07226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096411.07310: stderr chunk (state=3): >>><<< 24134 1727096411.07313: stdout chunk (state=3): >>><<< 24134 1727096411.07331: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 24134 1727096411.07490: _low_level_execute_command(): starting 24134 1727096411.07500: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113 `" && echo ansible-tmp-1727096411.0739756-24917-139760176451113="` echo /root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113 `" ) && sleep 0' 24134 1727096411.08483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096411.08754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096411.08757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096411.08760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096411.08762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096411.08903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096411.09020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096411.11029: stdout 
chunk (state=3): >>>ansible-tmp-1727096411.0739756-24917-139760176451113=/root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113 <<< 24134 1727096411.11132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096411.11166: stderr chunk (state=3): >>><<< 24134 1727096411.11181: stdout chunk (state=3): >>><<< 24134 1727096411.11578: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096411.0739756-24917-139760176451113=/root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096411.11582: variable 'ansible_module_compression' from source: unknown 24134 1727096411.11585: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 24134 1727096411.11588: ANSIBALLZ: Acquiring lock 24134 1727096411.11589: ANSIBALLZ: Lock acquired: 
140085163806880 24134 1727096411.11591: ANSIBALLZ: Creating module 24134 1727096411.45030: ANSIBALLZ: Writing module into payload 24134 1727096411.45184: ANSIBALLZ: Writing module 24134 1727096411.45214: ANSIBALLZ: Renaming module 24134 1727096411.45220: ANSIBALLZ: Done creating module 24134 1727096411.45241: variable 'ansible_facts' from source: unknown 24134 1727096411.45393: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/AnsiballZ_systemd.py 24134 1727096411.45594: Sending initial data 24134 1727096411.45597: Sent initial data (156 bytes) 24134 1727096411.46145: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096411.46156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096411.46171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096411.46187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096411.46200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096411.46207: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096411.46217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096411.46308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096411.46315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096411.46418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096411.48132: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24134 1727096411.48145: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 24134 1727096411.48155: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 24134 1727096411.48175: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096411.48259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096411.48357: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmphgc0a7iu /root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/AnsiballZ_systemd.py <<< 24134 1727096411.48360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/AnsiballZ_systemd.py" <<< 24134 1727096411.48426: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmphgc0a7iu" to remote "/root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/AnsiballZ_systemd.py" <<< 24134 1727096411.50182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096411.50185: stdout chunk (state=3): >>><<< 24134 1727096411.50188: stderr chunk (state=3): >>><<< 24134 1727096411.50190: done transferring module to remote 24134 1727096411.50192: _low_level_execute_command(): starting 24134 1727096411.50194: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/ /root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/AnsiballZ_systemd.py && sleep 0' 24134 1727096411.50747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096411.50790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096411.50803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 
1727096411.50890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096411.50915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096411.50934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096411.50955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096411.51076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096411.53321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096411.53325: stdout chunk (state=3): >>><<< 24134 1727096411.53327: stderr chunk (state=3): >>><<< 24134 1727096411.53329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096411.53331: _low_level_execute_command(): starting 24134 1727096411.53333: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/AnsiballZ_systemd.py && sleep 0' 24134 1727096411.54580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096411.54727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096411.54839: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24134 1727096411.84529: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10674176", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314122752", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "1097853000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 24134 1727096411.84599: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", 
"DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": 
"no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24134 1727096411.86677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096411.86680: stdout chunk (state=3): >>><<< 24134 1727096411.86682: stderr chunk (state=3): >>><<< 24134 1727096411.86685: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10674176", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314122752", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "1097853000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", 
"IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096411.87083: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096411.87091: _low_level_execute_command(): starting 24134 1727096411.87096: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096411.0739756-24917-139760176451113/ > /dev/null 2>&1 && sleep 0' 24134 1727096411.88239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096411.88258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096411.88281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096411.88447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096411.88592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096411.88689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096411.90682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096411.90685: stdout chunk (state=3): >>><<< 24134 1727096411.90687: stderr chunk (state=3): >>><<< 24134 1727096411.90690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096411.90692: handler run complete 24134 1727096411.90736: attempt loop complete, returning result 24134 1727096411.90739: _execute() done 24134 1727096411.90742: dumping result to json 24134 1727096411.90885: done dumping result, returning 24134 1727096411.90895: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-1673-d3fc-000000000023] 24134 1727096411.90900: sending task result for task 0afff68d-5257-1673-d3fc-000000000023 24134 1727096411.91318: done sending task result for task 0afff68d-5257-1673-d3fc-000000000023 24134 1727096411.91323: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096411.91379: no more pending results, returning what we have 24134 1727096411.91383: results queue empty 24134 1727096411.91384: checking for any_errors_fatal 24134 1727096411.91390: done checking for any_errors_fatal 24134 1727096411.91391: checking for max_fail_percentage 24134 1727096411.91393: done checking for max_fail_percentage 24134 1727096411.91394: checking to see if all hosts have failed and the running result is not ok 24134 1727096411.91395: done checking to see if all hosts have failed 24134 1727096411.91395: getting the remaining hosts for this loop 24134 1727096411.91397: done getting the remaining hosts for this loop 24134 1727096411.91401: getting the next task for host managed_node1 24134 1727096411.91413: done getting next task for host 
managed_node1 24134 1727096411.91417: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24134 1727096411.91420: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096411.91430: getting variables 24134 1727096411.91432: in VariableManager get_vars() 24134 1727096411.91464: Calling all_inventory to load vars for managed_node1 24134 1727096411.91466: Calling groups_inventory to load vars for managed_node1 24134 1727096411.91473: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096411.91483: Calling all_plugins_play to load vars for managed_node1 24134 1727096411.91486: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096411.91490: Calling groups_plugins_play to load vars for managed_node1 24134 1727096411.93114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096411.94732: done with get_vars() 24134 1727096411.94754: done getting variables 24134 1727096411.94814: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:00:11 -0400 (0:00:01.097) 0:00:16.161 ****** 24134 1727096411.94849: entering _queue_task() for managed_node1/service 24134 1727096411.95371: worker is 1 (out of 1 available) 24134 1727096411.95385: exiting _queue_task() for managed_node1/service 24134 1727096411.95397: done queuing things up, now waiting for results queue to drain 24134 1727096411.95398: waiting for pending results... 24134 1727096411.95694: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24134 1727096411.95699: in run() - task 0afff68d-5257-1673-d3fc-000000000024 24134 1727096411.95702: variable 'ansible_search_path' from source: unknown 24134 1727096411.95706: variable 'ansible_search_path' from source: unknown 24134 1727096411.95708: calling self._execute() 24134 1727096411.95773: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096411.95789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096411.95804: variable 'omit' from source: magic vars 24134 1727096411.96172: variable 'ansible_distribution_major_version' from source: facts 24134 1727096411.96226: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096411.96316: variable 'network_provider' from source: set_fact 24134 1727096411.96327: Evaluated conditional (network_provider == "nm"): True 24134 1727096411.96429: variable '__network_wpa_supplicant_required' from source: role '' defaults 24134 1727096411.96520: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24134 1727096411.96696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096411.98729: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 
1727096411.98772: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096411.98811: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096411.98853: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096411.98889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096411.99054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096411.99059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096411.99061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096411.99093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096411.99111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096411.99160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096411.99192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096411.99219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096411.99258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096411.99283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096411.99325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096411.99352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096411.99490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096411.99493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096411.99496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096411.99585: variable 'network_connections' from source: task vars 
24134 1727096411.99608: variable 'interface' from source: set_fact 24134 1727096411.99686: variable 'interface' from source: set_fact 24134 1727096411.99702: variable 'interface' from source: set_fact 24134 1727096411.99765: variable 'interface' from source: set_fact 24134 1727096411.99845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096412.00023: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096412.00070: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096412.00106: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096412.00145: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096412.00196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096412.00252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096412.00255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096412.00288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096412.00340: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096412.00616: variable 'network_connections' from source: task vars 24134 1727096412.00684: variable 'interface' from 
source: set_fact 24134 1727096412.00698: variable 'interface' from source: set_fact 24134 1727096412.00709: variable 'interface' from source: set_fact 24134 1727096412.00772: variable 'interface' from source: set_fact 24134 1727096412.00824: Evaluated conditional (__network_wpa_supplicant_required): False 24134 1727096412.00832: when evaluation is False, skipping this task 24134 1727096412.00839: _execute() done 24134 1727096412.00854: dumping result to json 24134 1727096412.00861: done dumping result, returning 24134 1727096412.00877: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-1673-d3fc-000000000024] 24134 1727096412.00901: sending task result for task 0afff68d-5257-1673-d3fc-000000000024 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24134 1727096412.01061: no more pending results, returning what we have 24134 1727096412.01065: results queue empty 24134 1727096412.01065: checking for any_errors_fatal 24134 1727096412.01091: done checking for any_errors_fatal 24134 1727096412.01092: checking for max_fail_percentage 24134 1727096412.01094: done checking for max_fail_percentage 24134 1727096412.01095: checking to see if all hosts have failed and the running result is not ok 24134 1727096412.01096: done checking to see if all hosts have failed 24134 1727096412.01097: getting the remaining hosts for this loop 24134 1727096412.01098: done getting the remaining hosts for this loop 24134 1727096412.01103: getting the next task for host managed_node1 24134 1727096412.01110: done getting next task for host managed_node1 24134 1727096412.01114: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24134 1727096412.01117: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096412.01132: getting variables 24134 1727096412.01133: in VariableManager get_vars() 24134 1727096412.01477: Calling all_inventory to load vars for managed_node1 24134 1727096412.01481: Calling groups_inventory to load vars for managed_node1 24134 1727096412.01483: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096412.01493: Calling all_plugins_play to load vars for managed_node1 24134 1727096412.01496: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096412.01499: Calling groups_plugins_play to load vars for managed_node1 24134 1727096412.02376: done sending task result for task 0afff68d-5257-1673-d3fc-000000000024 24134 1727096412.02380: WORKER PROCESS EXITING 24134 1727096412.03684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096412.05340: done with get_vars() 24134 1727096412.05365: done getting variables 24134 1727096412.05684: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 09:00:12 -0400 (0:00:00.108) 0:00:16.270 
****** 24134 1727096412.05717: entering _queue_task() for managed_node1/service 24134 1727096412.06046: worker is 1 (out of 1 available) 24134 1727096412.06057: exiting _queue_task() for managed_node1/service 24134 1727096412.06273: done queuing things up, now waiting for results queue to drain 24134 1727096412.06275: waiting for pending results... 24134 1727096412.06390: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 24134 1727096412.06530: in run() - task 0afff68d-5257-1673-d3fc-000000000025 24134 1727096412.06551: variable 'ansible_search_path' from source: unknown 24134 1727096412.06559: variable 'ansible_search_path' from source: unknown 24134 1727096412.06600: calling self._execute() 24134 1727096412.06693: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096412.06707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096412.06728: variable 'omit' from source: magic vars 24134 1727096412.07419: variable 'ansible_distribution_major_version' from source: facts 24134 1727096412.07488: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096412.07729: variable 'network_provider' from source: set_fact 24134 1727096412.07741: Evaluated conditional (network_provider == "initscripts"): False 24134 1727096412.07936: when evaluation is False, skipping this task 24134 1727096412.07939: _execute() done 24134 1727096412.07942: dumping result to json 24134 1727096412.07944: done dumping result, returning 24134 1727096412.07947: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-1673-d3fc-000000000025] 24134 1727096412.07949: sending task result for task 0afff68d-5257-1673-d3fc-000000000025 24134 1727096412.08019: done sending task result for task 0afff68d-5257-1673-d3fc-000000000025 24134 1727096412.08023: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096412.08088: no more pending results, returning what we have 24134 1727096412.08092: results queue empty 24134 1727096412.08093: checking for any_errors_fatal 24134 1727096412.08103: done checking for any_errors_fatal 24134 1727096412.08104: checking for max_fail_percentage 24134 1727096412.08106: done checking for max_fail_percentage 24134 1727096412.08107: checking to see if all hosts have failed and the running result is not ok 24134 1727096412.08108: done checking to see if all hosts have failed 24134 1727096412.08109: getting the remaining hosts for this loop 24134 1727096412.08111: done getting the remaining hosts for this loop 24134 1727096412.08115: getting the next task for host managed_node1 24134 1727096412.08122: done getting next task for host managed_node1 24134 1727096412.08127: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24134 1727096412.08131: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096412.08151: getting variables 24134 1727096412.08152: in VariableManager get_vars() 24134 1727096412.08197: Calling all_inventory to load vars for managed_node1 24134 1727096412.08201: Calling groups_inventory to load vars for managed_node1 24134 1727096412.08204: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096412.08215: Calling all_plugins_play to load vars for managed_node1 24134 1727096412.08219: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096412.08222: Calling groups_plugins_play to load vars for managed_node1 24134 1727096412.09899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096412.11405: done with get_vars() 24134 1727096412.11429: done getting variables 24134 1727096412.11493: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:00:12 -0400 (0:00:00.058) 0:00:16.328 ****** 24134 1727096412.11528: entering _queue_task() for managed_node1/copy 24134 1727096412.11859: worker is 1 (out of 1 available) 24134 1727096412.11875: exiting _queue_task() for managed_node1/copy 24134 1727096412.11887: done queuing things up, now waiting for results queue to drain 24134 1727096412.11889: waiting for pending results... 
24134 1727096412.12291: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24134 1727096412.12312: in run() - task 0afff68d-5257-1673-d3fc-000000000026 24134 1727096412.12334: variable 'ansible_search_path' from source: unknown 24134 1727096412.12342: variable 'ansible_search_path' from source: unknown 24134 1727096412.12384: calling self._execute() 24134 1727096412.12485: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096412.12502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096412.12516: variable 'omit' from source: magic vars 24134 1727096412.12897: variable 'ansible_distribution_major_version' from source: facts 24134 1727096412.12914: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096412.13029: variable 'network_provider' from source: set_fact 24134 1727096412.13044: Evaluated conditional (network_provider == "initscripts"): False 24134 1727096412.13050: when evaluation is False, skipping this task 24134 1727096412.13056: _execute() done 24134 1727096412.13062: dumping result to json 24134 1727096412.13070: done dumping result, returning 24134 1727096412.13081: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-1673-d3fc-000000000026] 24134 1727096412.13091: sending task result for task 0afff68d-5257-1673-d3fc-000000000026 skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24134 1727096412.13362: no more pending results, returning what we have 24134 1727096412.13365: results queue empty 24134 1727096412.13366: checking for any_errors_fatal 24134 1727096412.13377: done checking for any_errors_fatal 24134 1727096412.13378: checking for max_fail_percentage 24134 
1727096412.13380: done checking for max_fail_percentage 24134 1727096412.13381: checking to see if all hosts have failed and the running result is not ok 24134 1727096412.13382: done checking to see if all hosts have failed 24134 1727096412.13383: getting the remaining hosts for this loop 24134 1727096412.13384: done getting the remaining hosts for this loop 24134 1727096412.13388: getting the next task for host managed_node1 24134 1727096412.13395: done getting next task for host managed_node1 24134 1727096412.13398: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24134 1727096412.13402: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096412.13420: getting variables 24134 1727096412.13421: in VariableManager get_vars() 24134 1727096412.13460: Calling all_inventory to load vars for managed_node1 24134 1727096412.13463: Calling groups_inventory to load vars for managed_node1 24134 1727096412.13466: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096412.13628: done sending task result for task 0afff68d-5257-1673-d3fc-000000000026 24134 1727096412.13632: WORKER PROCESS EXITING 24134 1727096412.13641: Calling all_plugins_play to load vars for managed_node1 24134 1727096412.13644: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096412.13648: Calling groups_plugins_play to load vars for managed_node1 24134 1727096412.15665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096412.18390: done with get_vars() 24134 1727096412.18419: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:00:12 -0400 (0:00:00.069) 0:00:16.398 ****** 24134 1727096412.18506: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24134 1727096412.18508: Creating lock for fedora.linux_system_roles.network_connections 24134 1727096412.18850: worker is 1 (out of 1 available) 24134 1727096412.18863: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24134 1727096412.19079: done queuing things up, now waiting for results queue to drain 24134 1727096412.19081: waiting for pending results... 
24134 1727096412.19154: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24134 1727096412.19291: in run() - task 0afff68d-5257-1673-d3fc-000000000027 24134 1727096412.19317: variable 'ansible_search_path' from source: unknown 24134 1727096412.19324: variable 'ansible_search_path' from source: unknown 24134 1727096412.19362: calling self._execute() 24134 1727096412.19454: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096412.19465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096412.19482: variable 'omit' from source: magic vars 24134 1727096412.19868: variable 'ansible_distribution_major_version' from source: facts 24134 1727096412.19887: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096412.19898: variable 'omit' from source: magic vars 24134 1727096412.19961: variable 'omit' from source: magic vars 24134 1727096412.20124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096412.22500: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096412.22671: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096412.22675: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096412.22677: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096412.22680: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096412.22756: variable 'network_provider' from source: set_fact 24134 1727096412.22900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096412.22938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096412.22966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096412.23018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096412.23036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096412.23111: variable 'omit' from source: magic vars 24134 1727096412.23233: variable 'omit' from source: magic vars 24134 1727096412.23341: variable 'network_connections' from source: task vars 24134 1727096412.23451: variable 'interface' from source: set_fact 24134 1727096412.23454: variable 'interface' from source: set_fact 24134 1727096412.23456: variable 'interface' from source: set_fact 24134 1727096412.23495: variable 'interface' from source: set_fact 24134 1727096412.23661: variable 'omit' from source: magic vars 24134 1727096412.23680: variable '__lsr_ansible_managed' from source: task vars 24134 1727096412.23740: variable '__lsr_ansible_managed' from source: task vars 24134 1727096412.24002: Loaded config def from plugin (lookup/template) 24134 1727096412.24012: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24134 1727096412.24041: File lookup term: get_ansible_managed.j2 24134 
1727096412.24050: variable 'ansible_search_path' from source: unknown 24134 1727096412.24058: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24134 1727096412.24077: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24134 1727096412.24208: variable 'ansible_search_path' from source: unknown 24134 1727096412.34496: variable 'ansible_managed' from source: unknown 24134 1727096412.34807: variable 'omit' from source: magic vars 24134 1727096412.35001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096412.35005: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096412.35007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096412.35010: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096412.35012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096412.35132: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096412.35143: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096412.35152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096412.35374: Set connection var ansible_shell_executable to /bin/sh 24134 1727096412.35388: Set connection var ansible_pipelining to False 24134 1727096412.35400: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096412.35416: Set connection var ansible_timeout to 10 24134 1727096412.35423: Set connection var ansible_connection to ssh 24134 1727096412.35434: Set connection var ansible_shell_type to sh 24134 1727096412.35675: variable 'ansible_shell_executable' from source: unknown 24134 1727096412.35678: variable 'ansible_connection' from source: unknown 24134 1727096412.35680: variable 'ansible_module_compression' from source: unknown 24134 1727096412.35683: variable 'ansible_shell_type' from source: unknown 24134 1727096412.35686: variable 'ansible_shell_executable' from source: unknown 24134 1727096412.35688: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096412.35690: variable 'ansible_pipelining' from source: unknown 24134 1727096412.35692: variable 'ansible_timeout' from source: unknown 24134 1727096412.35694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096412.35838: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096412.35863: variable 'omit' from source: magic vars 24134 1727096412.35901: starting attempt loop 24134 1727096412.35911: running the handler 24134 1727096412.35932: _low_level_execute_command(): starting 24134 1727096412.36010: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096412.37423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096412.37439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096412.37531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096412.37694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096412.37707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096412.37717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096412.37853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096412.39573: stdout 
chunk (state=3): >>>/root <<< 24134 1727096412.39723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096412.39726: stderr chunk (state=3): >>><<< 24134 1727096412.39728: stdout chunk (state=3): >>><<< 24134 1727096412.39746: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096412.39893: _low_level_execute_command(): starting 24134 1727096412.39898: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865 `" && echo ansible-tmp-1727096412.3980813-24984-234493144644865="` echo /root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865 `" ) && sleep 0' 24134 1727096412.40972: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096412.40988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096412.41005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096412.41025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096412.41042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096412.41181: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096412.41226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096412.41358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096412.41441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096412.41590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096412.43608: stdout chunk (state=3): >>>ansible-tmp-1727096412.3980813-24984-234493144644865=/root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865 <<< 24134 1727096412.43755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096412.43765: stdout chunk (state=3): >>><<< 24134 1727096412.43790: stderr chunk 
(state=3): >>><<< 24134 1727096412.43813: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096412.3980813-24984-234493144644865=/root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096412.43865: variable 'ansible_module_compression' from source: unknown 24134 1727096412.43924: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 24134 1727096412.43931: ANSIBALLZ: Acquiring lock 24134 1727096412.43939: ANSIBALLZ: Lock acquired: 140085157962640 24134 1727096412.43947: ANSIBALLZ: Creating module 24134 1727096412.79960: ANSIBALLZ: Writing module into payload 24134 1727096412.80536: ANSIBALLZ: Writing module 24134 1727096412.80569: ANSIBALLZ: Renaming module 24134 1727096412.80620: ANSIBALLZ: Done creating module 24134 1727096412.80743: variable 
'ansible_facts' from source: unknown 24134 1727096412.80892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/AnsiballZ_network_connections.py 24134 1727096412.81634: Sending initial data 24134 1727096412.81642: Sent initial data (168 bytes) 24134 1727096412.82696: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096412.82711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096412.82727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096412.82905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096412.83030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096412.83142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096412.84901: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096412.85096: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096412.85182: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmptqo66qjn /root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/AnsiballZ_network_connections.py <<< 24134 1727096412.85322: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/AnsiballZ_network_connections.py" <<< 24134 1727096412.85338: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmptqo66qjn" to remote "/root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/AnsiballZ_network_connections.py" <<< 24134 1727096412.88091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096412.88124: stderr chunk (state=3): >>><<< 24134 1727096412.88142: stdout chunk (state=3): >>><<< 24134 1727096412.88173: done transferring module to remote 24134 1727096412.88219: _low_level_execute_command(): starting 24134 1727096412.88254: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/ /root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/AnsiballZ_network_connections.py && sleep 0' 24134 1727096412.89557: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096412.89561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096412.89564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096412.89566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096412.89588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096412.89613: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096412.89726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096412.89809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096412.89852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096412.89946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096412.91993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096412.92019: stderr 
chunk (state=3): >>><<< 24134 1727096412.92022: stdout chunk (state=3): >>><<< 24134 1727096412.92117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096412.92120: _low_level_execute_command(): starting 24134 1727096412.92125: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/AnsiballZ_network_connections.py && sleep 0' 24134 1727096412.93219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096412.93241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096412.93258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096412.93279: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096412.93422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096412.93425: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096412.93519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096412.93536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096412.93552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096412.93907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096413.20852: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 7ebb5ff9-a77a-410c-88b2-c781d382a6fc\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24134 1727096413.22778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096413.23078: stdout chunk (state=3): >>><<< 24134 1727096413.23082: stderr chunk (state=3): >>><<< 24134 1727096413.23085: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 7ebb5ff9-a77a-410c-88b2-c781d382a6fc\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096413.23088: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'type': 'ethernet', 'ip': {'ipv6_disabled': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096413.23090: _low_level_execute_command(): starting 24134 1727096413.23093: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096412.3980813-24984-234493144644865/ > /dev/null 2>&1 && sleep 0' 24134 1727096413.23810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096413.23822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096413.23837: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096413.23885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096413.23989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096413.24017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096413.24036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096413.24052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096413.24143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096413.26290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096413.26294: stdout chunk (state=3): >>><<< 24134 1727096413.26305: stderr chunk (state=3): >>><<< 24134 1727096413.26580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096413.26584: handler run complete 24134 1727096413.26587: attempt loop complete, returning result 24134 1727096413.26589: _execute() done 24134 1727096413.26596: dumping result to json 24134 1727096413.26598: done dumping result, returning 24134 1727096413.26600: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-1673-d3fc-000000000027] 24134 1727096413.26602: sending task result for task 0afff68d-5257-1673-d3fc-000000000027 24134 1727096413.26796: done sending task result for task 0afff68d-5257-1673-d3fc-000000000027 24134 1727096413.26800: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 
7ebb5ff9-a77a-410c-88b2-c781d382a6fc 24134 1727096413.26917: no more pending results, returning what we have 24134 1727096413.26921: results queue empty 24134 1727096413.26922: checking for any_errors_fatal 24134 1727096413.26930: done checking for any_errors_fatal 24134 1727096413.26930: checking for max_fail_percentage 24134 1727096413.26932: done checking for max_fail_percentage 24134 1727096413.26933: checking to see if all hosts have failed and the running result is not ok 24134 1727096413.26934: done checking to see if all hosts have failed 24134 1727096413.26935: getting the remaining hosts for this loop 24134 1727096413.26936: done getting the remaining hosts for this loop 24134 1727096413.26940: getting the next task for host managed_node1 24134 1727096413.26946: done getting next task for host managed_node1 24134 1727096413.26950: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24134 1727096413.26953: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096413.26965: getting variables 24134 1727096413.26966: in VariableManager get_vars() 24134 1727096413.27373: Calling all_inventory to load vars for managed_node1 24134 1727096413.27376: Calling groups_inventory to load vars for managed_node1 24134 1727096413.27379: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096413.27389: Calling all_plugins_play to load vars for managed_node1 24134 1727096413.27392: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096413.27395: Calling groups_plugins_play to load vars for managed_node1 24134 1727096413.29251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096413.31814: done with get_vars() 24134 1727096413.31914: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:00:13 -0400 (0:00:01.136) 0:00:17.534 ****** 24134 1727096413.32133: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24134 1727096413.32135: Creating lock for fedora.linux_system_roles.network_state 24134 1727096413.32992: worker is 1 (out of 1 available) 24134 1727096413.33009: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24134 1727096413.33026: done queuing things up, now waiting for results queue to drain 24134 1727096413.33031: waiting for pending results... 
24134 1727096413.33289: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 24134 1727096413.33488: in run() - task 0afff68d-5257-1673-d3fc-000000000028 24134 1727096413.33493: variable 'ansible_search_path' from source: unknown 24134 1727096413.33496: variable 'ansible_search_path' from source: unknown 24134 1727096413.33499: calling self._execute() 24134 1727096413.33524: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.33531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096413.33540: variable 'omit' from source: magic vars 24134 1727096413.34072: variable 'ansible_distribution_major_version' from source: facts 24134 1727096413.34095: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096413.34287: variable 'network_state' from source: role '' defaults 24134 1727096413.34291: Evaluated conditional (network_state != {}): False 24134 1727096413.34293: when evaluation is False, skipping this task 24134 1727096413.34298: _execute() done 24134 1727096413.34301: dumping result to json 24134 1727096413.34303: done dumping result, returning 24134 1727096413.34306: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-1673-d3fc-000000000028] 24134 1727096413.34309: sending task result for task 0afff68d-5257-1673-d3fc-000000000028 24134 1727096413.34463: done sending task result for task 0afff68d-5257-1673-d3fc-000000000028 24134 1727096413.34466: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096413.34555: no more pending results, returning what we have 24134 1727096413.34561: results queue empty 24134 1727096413.34562: checking for any_errors_fatal 24134 1727096413.34579: done checking for any_errors_fatal 
24134 1727096413.34580: checking for max_fail_percentage 24134 1727096413.34582: done checking for max_fail_percentage 24134 1727096413.34583: checking to see if all hosts have failed and the running result is not ok 24134 1727096413.34585: done checking to see if all hosts have failed 24134 1727096413.34586: getting the remaining hosts for this loop 24134 1727096413.34587: done getting the remaining hosts for this loop 24134 1727096413.34593: getting the next task for host managed_node1 24134 1727096413.34601: done getting next task for host managed_node1 24134 1727096413.34605: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24134 1727096413.34608: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096413.34622: getting variables 24134 1727096413.34623: in VariableManager get_vars() 24134 1727096413.34656: Calling all_inventory to load vars for managed_node1 24134 1727096413.34658: Calling groups_inventory to load vars for managed_node1 24134 1727096413.34660: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096413.34690: Calling all_plugins_play to load vars for managed_node1 24134 1727096413.34693: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096413.34697: Calling groups_plugins_play to load vars for managed_node1 24134 1727096413.36196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096413.37872: done with get_vars() 24134 1727096413.37895: done getting variables 24134 1727096413.37961: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:00:13 -0400 (0:00:00.058) 0:00:17.593 ****** 24134 1727096413.37998: entering _queue_task() for managed_node1/debug 24134 1727096413.38416: worker is 1 (out of 1 available) 24134 1727096413.38438: exiting _queue_task() for managed_node1/debug 24134 1727096413.38450: done queuing things up, now waiting for results queue to drain 24134 1727096413.38451: waiting for pending results... 
24134 1727096413.38988: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24134 1727096413.39133: in run() - task 0afff68d-5257-1673-d3fc-000000000029 24134 1727096413.39145: variable 'ansible_search_path' from source: unknown 24134 1727096413.39157: variable 'ansible_search_path' from source: unknown 24134 1727096413.39272: calling self._execute() 24134 1727096413.39416: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.39452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096413.39559: variable 'omit' from source: magic vars 24134 1727096413.40150: variable 'ansible_distribution_major_version' from source: facts 24134 1727096413.40166: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096413.40182: variable 'omit' from source: magic vars 24134 1727096413.40275: variable 'omit' from source: magic vars 24134 1727096413.40340: variable 'omit' from source: magic vars 24134 1727096413.40454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096413.40552: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096413.40555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096413.40580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096413.40598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096413.40634: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096413.40642: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.40664: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 24134 1727096413.40862: Set connection var ansible_shell_executable to /bin/sh 24134 1727096413.40931: Set connection var ansible_pipelining to False 24134 1727096413.40936: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096413.40938: Set connection var ansible_timeout to 10 24134 1727096413.40943: Set connection var ansible_connection to ssh 24134 1727096413.40948: Set connection var ansible_shell_type to sh 24134 1727096413.41009: variable 'ansible_shell_executable' from source: unknown 24134 1727096413.41019: variable 'ansible_connection' from source: unknown 24134 1727096413.41028: variable 'ansible_module_compression' from source: unknown 24134 1727096413.41035: variable 'ansible_shell_type' from source: unknown 24134 1727096413.41087: variable 'ansible_shell_executable' from source: unknown 24134 1727096413.41090: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.41092: variable 'ansible_pipelining' from source: unknown 24134 1727096413.41094: variable 'ansible_timeout' from source: unknown 24134 1727096413.41096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096413.41230: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096413.41259: variable 'omit' from source: magic vars 24134 1727096413.41274: starting attempt loop 24134 1727096413.41282: running the handler 24134 1727096413.41475: variable '__network_connections_result' from source: set_fact 24134 1727096413.41483: handler run complete 24134 1727096413.41505: attempt loop complete, returning result 24134 1727096413.41512: _execute() done 24134 1727096413.41627: dumping result to json 24134 1727096413.41631: 
done dumping result, returning 24134 1727096413.41634: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-1673-d3fc-000000000029] 24134 1727096413.41636: sending task result for task 0afff68d-5257-1673-d3fc-000000000029 24134 1727096413.41708: done sending task result for task 0afff68d-5257-1673-d3fc-000000000029 24134 1727096413.41711: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 7ebb5ff9-a77a-410c-88b2-c781d382a6fc" ] } 24134 1727096413.41777: no more pending results, returning what we have 24134 1727096413.41781: results queue empty 24134 1727096413.41781: checking for any_errors_fatal 24134 1727096413.41789: done checking for any_errors_fatal 24134 1727096413.41789: checking for max_fail_percentage 24134 1727096413.41791: done checking for max_fail_percentage 24134 1727096413.41792: checking to see if all hosts have failed and the running result is not ok 24134 1727096413.41793: done checking to see if all hosts have failed 24134 1727096413.41793: getting the remaining hosts for this loop 24134 1727096413.41795: done getting the remaining hosts for this loop 24134 1727096413.41799: getting the next task for host managed_node1 24134 1727096413.41806: done getting next task for host managed_node1 24134 1727096413.41809: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24134 1727096413.41812: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096413.41824: getting variables 24134 1727096413.41826: in VariableManager get_vars() 24134 1727096413.41862: Calling all_inventory to load vars for managed_node1 24134 1727096413.41865: Calling groups_inventory to load vars for managed_node1 24134 1727096413.41869: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096413.41879: Calling all_plugins_play to load vars for managed_node1 24134 1727096413.41881: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096413.41884: Calling groups_plugins_play to load vars for managed_node1 24134 1727096413.43499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096413.45474: done with get_vars() 24134 1727096413.45516: done getting variables 24134 1727096413.45588: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:00:13 -0400 (0:00:00.076) 0:00:17.669 ****** 24134 1727096413.45650: entering _queue_task() for managed_node1/debug 24134 1727096413.46202: worker is 1 (out of 1 available) 24134 1727096413.46220: exiting _queue_task() for managed_node1/debug 24134 1727096413.46231: done queuing things up, now waiting for results queue to drain 24134 1727096413.46233: waiting for pending results... 
24134 1727096413.46629: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24134 1727096413.46843: in run() - task 0afff68d-5257-1673-d3fc-00000000002a 24134 1727096413.46848: variable 'ansible_search_path' from source: unknown 24134 1727096413.46852: variable 'ansible_search_path' from source: unknown 24134 1727096413.46864: calling self._execute() 24134 1727096413.47026: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.47059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096413.47161: variable 'omit' from source: magic vars 24134 1727096413.47635: variable 'ansible_distribution_major_version' from source: facts 24134 1727096413.47661: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096413.47721: variable 'omit' from source: magic vars 24134 1727096413.47839: variable 'omit' from source: magic vars 24134 1727096413.47900: variable 'omit' from source: magic vars 24134 1727096413.47961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096413.48013: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096413.48053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096413.48085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096413.48103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096413.48154: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096413.48162: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.48177: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 24134 1727096413.48483: Set connection var ansible_shell_executable to /bin/sh 24134 1727096413.48486: Set connection var ansible_pipelining to False 24134 1727096413.48488: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096413.48490: Set connection var ansible_timeout to 10 24134 1727096413.48495: Set connection var ansible_connection to ssh 24134 1727096413.48498: Set connection var ansible_shell_type to sh 24134 1727096413.48505: variable 'ansible_shell_executable' from source: unknown 24134 1727096413.48523: variable 'ansible_connection' from source: unknown 24134 1727096413.48531: variable 'ansible_module_compression' from source: unknown 24134 1727096413.48538: variable 'ansible_shell_type' from source: unknown 24134 1727096413.48544: variable 'ansible_shell_executable' from source: unknown 24134 1727096413.48551: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.48562: variable 'ansible_pipelining' from source: unknown 24134 1727096413.48629: variable 'ansible_timeout' from source: unknown 24134 1727096413.48632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096413.48977: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096413.48981: variable 'omit' from source: magic vars 24134 1727096413.48983: starting attempt loop 24134 1727096413.48986: running the handler 24134 1727096413.49095: variable '__network_connections_result' from source: set_fact 24134 1727096413.49227: variable '__network_connections_result' from source: set_fact 24134 1727096413.49349: handler run complete 24134 1727096413.49385: attempt loop complete, returning result 24134 1727096413.49401: 
_execute() done 24134 1727096413.49574: dumping result to json 24134 1727096413.49578: done dumping result, returning 24134 1727096413.49580: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-1673-d3fc-00000000002a] 24134 1727096413.49583: sending task result for task 0afff68d-5257-1673-d3fc-00000000002a 24134 1727096413.49656: done sending task result for task 0afff68d-5257-1673-d3fc-00000000002a 24134 1727096413.49659: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 7ebb5ff9-a77a-410c-88b2-c781d382a6fc\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 7ebb5ff9-a77a-410c-88b2-c781d382a6fc" ] } } 24134 1727096413.49747: no more pending results, returning what we have 24134 1727096413.49751: results queue empty 24134 1727096413.49752: checking for any_errors_fatal 24134 1727096413.49759: done checking for any_errors_fatal 24134 1727096413.49760: checking for max_fail_percentage 24134 1727096413.49762: done checking for max_fail_percentage 24134 1727096413.49763: checking to see if all hosts have failed and the running result is not ok 24134 1727096413.49764: done checking to see if all hosts have failed 24134 1727096413.49765: getting the remaining hosts for this loop 24134 1727096413.49766: done getting the remaining hosts for this loop 24134 1727096413.49774: getting the next task for host managed_node1 24134 1727096413.49782: done 
getting next task for host managed_node1 24134 1727096413.49786: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24134 1727096413.49789: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096413.49800: getting variables 24134 1727096413.49802: in VariableManager get_vars() 24134 1727096413.49841: Calling all_inventory to load vars for managed_node1 24134 1727096413.49845: Calling groups_inventory to load vars for managed_node1 24134 1727096413.49848: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096413.49858: Calling all_plugins_play to load vars for managed_node1 24134 1727096413.49861: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096413.49865: Calling groups_plugins_play to load vars for managed_node1 24134 1727096413.52622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096413.56440: done with get_vars() 24134 1727096413.56471: done getting variables 24134 1727096413.56541: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:00:13 -0400 (0:00:00.109) 0:00:17.779 ****** 24134 1727096413.56782: entering _queue_task() for managed_node1/debug 24134 1727096413.58102: worker is 1 (out of 1 available) 24134 1727096413.58115: exiting _queue_task() for managed_node1/debug 24134 1727096413.58131: done queuing things up, now waiting for results queue to drain 24134 1727096413.58133: waiting for pending results... 24134 1727096413.58587: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24134 1727096413.58732: in run() - task 0afff68d-5257-1673-d3fc-00000000002b 24134 1727096413.58815: variable 'ansible_search_path' from source: unknown 24134 1727096413.58881: variable 'ansible_search_path' from source: unknown 24134 1727096413.58932: calling self._execute() 24134 1727096413.59227: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.59239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096413.59253: variable 'omit' from source: magic vars 24134 1727096413.60078: variable 'ansible_distribution_major_version' from source: facts 24134 1727096413.60108: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096413.60392: variable 'network_state' from source: role '' defaults 24134 1727096413.60408: Evaluated conditional (network_state != {}): False 24134 1727096413.60416: when evaluation is False, skipping this task 24134 1727096413.60430: _execute() done 24134 1727096413.60438: dumping result to json 24134 1727096413.60446: done dumping result, returning 24134 1727096413.60576: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-1673-d3fc-00000000002b] 24134 1727096413.60579: sending task result for task 
0afff68d-5257-1673-d3fc-00000000002b 24134 1727096413.60886: done sending task result for task 0afff68d-5257-1673-d3fc-00000000002b 24134 1727096413.60889: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 24134 1727096413.60939: no more pending results, returning what we have 24134 1727096413.60942: results queue empty 24134 1727096413.60945: checking for any_errors_fatal 24134 1727096413.60955: done checking for any_errors_fatal 24134 1727096413.60956: checking for max_fail_percentage 24134 1727096413.60958: done checking for max_fail_percentage 24134 1727096413.60959: checking to see if all hosts have failed and the running result is not ok 24134 1727096413.60959: done checking to see if all hosts have failed 24134 1727096413.60960: getting the remaining hosts for this loop 24134 1727096413.60962: done getting the remaining hosts for this loop 24134 1727096413.60965: getting the next task for host managed_node1 24134 1727096413.60976: done getting next task for host managed_node1 24134 1727096413.60981: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24134 1727096413.60984: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096413.60999: getting variables 24134 1727096413.61000: in VariableManager get_vars() 24134 1727096413.61040: Calling all_inventory to load vars for managed_node1 24134 1727096413.61045: Calling groups_inventory to load vars for managed_node1 24134 1727096413.61048: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096413.61061: Calling all_plugins_play to load vars for managed_node1 24134 1727096413.61064: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096413.61067: Calling groups_plugins_play to load vars for managed_node1 24134 1727096413.64646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096413.69293: done with get_vars() 24134 1727096413.69327: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:00:13 -0400 (0:00:00.127) 0:00:17.908 ****** 24134 1727096413.69520: entering _queue_task() for managed_node1/ping 24134 1727096413.69522: Creating lock for ping 24134 1727096413.70245: worker is 1 (out of 1 available) 24134 1727096413.70258: exiting _queue_task() for managed_node1/ping 24134 1727096413.70383: done queuing things up, now waiting for results queue to drain 24134 1727096413.70385: waiting for pending results... 
24134 1727096413.70825: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 24134 1727096413.71131: in run() - task 0afff68d-5257-1673-d3fc-00000000002c 24134 1727096413.71145: variable 'ansible_search_path' from source: unknown 24134 1727096413.71148: variable 'ansible_search_path' from source: unknown 24134 1727096413.71187: calling self._execute() 24134 1727096413.71685: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.71689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096413.71692: variable 'omit' from source: magic vars 24134 1727096413.72566: variable 'ansible_distribution_major_version' from source: facts 24134 1727096413.72581: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096413.72587: variable 'omit' from source: magic vars 24134 1727096413.73074: variable 'omit' from source: magic vars 24134 1727096413.73078: variable 'omit' from source: magic vars 24134 1727096413.73080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096413.73108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096413.73127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096413.73141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096413.73153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096413.73183: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096413.73186: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.73279: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 24134 1727096413.73474: Set connection var ansible_shell_executable to /bin/sh 24134 1727096413.73478: Set connection var ansible_pipelining to False 24134 1727096413.73480: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096413.73482: Set connection var ansible_timeout to 10 24134 1727096413.73485: Set connection var ansible_connection to ssh 24134 1727096413.73487: Set connection var ansible_shell_type to sh 24134 1727096413.73489: variable 'ansible_shell_executable' from source: unknown 24134 1727096413.73492: variable 'ansible_connection' from source: unknown 24134 1727096413.73494: variable 'ansible_module_compression' from source: unknown 24134 1727096413.73496: variable 'ansible_shell_type' from source: unknown 24134 1727096413.73498: variable 'ansible_shell_executable' from source: unknown 24134 1727096413.73500: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096413.73502: variable 'ansible_pipelining' from source: unknown 24134 1727096413.73504: variable 'ansible_timeout' from source: unknown 24134 1727096413.73506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096413.73981: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096413.74072: variable 'omit' from source: magic vars 24134 1727096413.74076: starting attempt loop 24134 1727096413.74078: running the handler 24134 1727096413.74092: _low_level_execute_command(): starting 24134 1727096413.74105: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096413.75435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 
1727096413.75439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096413.75595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096413.75651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096413.75808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096413.75866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096413.77597: stdout chunk (state=3): >>>/root <<< 24134 1727096413.77749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096413.77754: stdout chunk (state=3): >>><<< 24134 1727096413.77763: stderr chunk (state=3): >>><<< 24134 1727096413.77791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096413.77805: _low_level_execute_command(): starting 24134 1727096413.77811: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595 `" && echo ansible-tmp-1727096413.7779133-25035-50930645814595="` echo /root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595 `" ) && sleep 0' 24134 1727096413.78552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096413.78558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096413.78641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096413.78682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096413.78731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096413.78825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096413.80764: stdout chunk (state=3): >>>ansible-tmp-1727096413.7779133-25035-50930645814595=/root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595 <<< 24134 1727096413.80901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096413.80991: stderr chunk (state=3): >>><<< 24134 1727096413.80994: stdout chunk (state=3): >>><<< 24134 1727096413.81277: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096413.7779133-25035-50930645814595=/root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096413.81280: variable 'ansible_module_compression' from source: unknown 24134 1727096413.81283: ANSIBALLZ: Using lock for ping 24134 1727096413.81285: ANSIBALLZ: Acquiring lock 24134 1727096413.81287: ANSIBALLZ: Lock acquired: 140085157957792 24134 1727096413.81289: ANSIBALLZ: Creating module 24134 1727096414.07220: ANSIBALLZ: Writing module into payload 24134 1727096414.07288: ANSIBALLZ: Writing module 24134 1727096414.07309: ANSIBALLZ: Renaming module 24134 1727096414.07315: ANSIBALLZ: Done creating module 24134 1727096414.07346: variable 'ansible_facts' from source: unknown 24134 1727096414.07422: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/AnsiballZ_ping.py 24134 1727096414.07574: Sending initial data 24134 1727096414.07577: Sent initial data (152 bytes) 24134 1727096414.08213: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096414.08219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096414.08323: stderr chunk (state=3): >>>debug2: match found <<< 24134 1727096414.08326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.08328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096414.08330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096414.08549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.08625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.10318: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096414.10379: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096414.10476: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpndxd684n /root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/AnsiballZ_ping.py <<< 24134 1727096414.10480: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/AnsiballZ_ping.py" <<< 24134 1727096414.10573: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpndxd684n" to remote "/root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/AnsiballZ_ping.py" <<< 24134 1727096414.11434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096414.11438: stdout chunk (state=3): >>><<< 24134 1727096414.11441: stderr chunk (state=3): >>><<< 24134 1727096414.11491: done transferring module to remote 24134 1727096414.11507: _low_level_execute_command(): starting 24134 1727096414.11537: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/ /root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/AnsiballZ_ping.py && sleep 0' 24134 1727096414.12453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.12471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096414.12483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096414.12693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.12917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.14621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096414.14651: stderr chunk (state=3): >>><<< 24134 1727096414.14654: stdout chunk (state=3): >>><<< 24134 1727096414.14689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096414.14693: _low_level_execute_command(): starting 24134 1727096414.14695: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/AnsiballZ_ping.py && sleep 0' 24134 1727096414.15998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096414.16002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096414.16088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096414.16098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24134 1727096414.16106: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096414.16188: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096414.16191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.16356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.16572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.31716: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24134 1727096414.33131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096414.33135: stderr chunk (state=3): >>><<< 24134 1727096414.33146: stdout chunk (state=3): >>><<< 24134 1727096414.33252: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096414.33258: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096414.33261: _low_level_execute_command(): starting 24134 1727096414.33263: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096413.7779133-25035-50930645814595/ > /dev/null 2>&1 && sleep 0' 24134 1727096414.34214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096414.34337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096414.34380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096414.34413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096414.34499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096414.34502: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096414.34560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.34563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096414.34565: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 
1727096414.34784: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24134 1727096414.34787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096414.34789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096414.34792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096414.34794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096414.34796: stderr chunk (state=3): >>>debug2: match found <<< 24134 1727096414.34798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.34801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096414.35087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.35150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.37063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096414.37069: stdout chunk (state=3): >>><<< 24134 1727096414.37079: stderr chunk (state=3): >>><<< 24134 1727096414.37095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096414.37102: handler run complete 24134 1727096414.37273: attempt loop complete, returning result 24134 1727096414.37276: _execute() done 24134 1727096414.37278: dumping result to json 24134 1727096414.37280: done dumping result, returning 24134 1727096414.37282: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-1673-d3fc-00000000002c] 24134 1727096414.37283: sending task result for task 0afff68d-5257-1673-d3fc-00000000002c 24134 1727096414.37339: done sending task result for task 0afff68d-5257-1673-d3fc-00000000002c 24134 1727096414.37341: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 24134 1727096414.37401: no more pending results, returning what we have 24134 1727096414.37404: results queue empty 24134 1727096414.37405: checking for any_errors_fatal 24134 1727096414.37409: done checking for any_errors_fatal 24134 1727096414.37410: checking for max_fail_percentage 24134 1727096414.37411: done checking for max_fail_percentage 24134 1727096414.37412: checking to see if all hosts have failed and the running result is not ok 24134 1727096414.37413: done checking to see if all hosts have failed 24134 1727096414.37413: getting the remaining hosts for this loop 24134 1727096414.37415: done getting the remaining hosts for this loop 24134 
1727096414.37418: getting the next task for host managed_node1 24134 1727096414.37426: done getting next task for host managed_node1 24134 1727096414.37428: ^ task is: TASK: meta (role_complete) 24134 1727096414.37431: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096414.37440: getting variables 24134 1727096414.37442: in VariableManager get_vars() 24134 1727096414.37682: Calling all_inventory to load vars for managed_node1 24134 1727096414.37694: Calling groups_inventory to load vars for managed_node1 24134 1727096414.37698: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096414.37706: Calling all_plugins_play to load vars for managed_node1 24134 1727096414.37709: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096414.37712: Calling groups_plugins_play to load vars for managed_node1 24134 1727096414.39410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096414.42176: done with get_vars() 24134 1727096414.42383: done getting variables 24134 1727096414.42587: done queuing things up, now waiting for results queue to drain 24134 1727096414.42589: results queue empty 24134 1727096414.42590: checking for any_errors_fatal 24134 1727096414.42592: done checking for any_errors_fatal 24134 1727096414.42593: checking for max_fail_percentage 24134 1727096414.42594: done checking for max_fail_percentage 24134 1727096414.42595: 
checking to see if all hosts have failed and the running result is not ok 24134 1727096414.42595: done checking to see if all hosts have failed 24134 1727096414.42596: getting the remaining hosts for this loop 24134 1727096414.42597: done getting the remaining hosts for this loop 24134 1727096414.42600: getting the next task for host managed_node1 24134 1727096414.42603: done getting next task for host managed_node1 24134 1727096414.42606: ^ task is: TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 24134 1727096414.42607: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096414.42610: getting variables 24134 1727096414.42611: in VariableManager get_vars() 24134 1727096414.42757: Calling all_inventory to load vars for managed_node1 24134 1727096414.42760: Calling groups_inventory to load vars for managed_node1 24134 1727096414.42762: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096414.42769: Calling all_plugins_play to load vars for managed_node1 24134 1727096414.42774: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096414.42777: Calling groups_plugins_play to load vars for managed_node1 24134 1727096414.44774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096414.46154: done with get_vars() 24134 1727096414.46195: done getting variables 24134 1727096414.46246: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:41 Monday 23 September 2024 09:00:14 -0400 (0:00:00.767) 0:00:18.676 ****** 24134 1727096414.46292: entering _queue_task() for managed_node1/assert 24134 1727096414.46785: worker is 1 (out of 1 available) 24134 1727096414.46797: exiting _queue_task() for managed_node1/assert 24134 1727096414.46808: done queuing things up, now waiting for results queue to drain 24134 1727096414.46809: waiting for pending results... 24134 1727096414.47117: running TaskExecutor() for managed_node1/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 24134 1727096414.47265: in run() - task 0afff68d-5257-1673-d3fc-00000000005c 24134 1727096414.47276: variable 'ansible_search_path' from source: unknown 24134 1727096414.47281: calling self._execute() 24134 1727096414.47453: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096414.47456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096414.47459: variable 'omit' from source: magic vars 24134 1727096414.47988: variable 'ansible_distribution_major_version' from source: facts 24134 1727096414.47991: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096414.48077: variable '__network_connections_result' from source: set_fact 24134 1727096414.48081: Evaluated conditional (__network_connections_result.failed): False 24134 1727096414.48084: when evaluation is False, skipping this task 24134 1727096414.48087: _execute() done 24134 1727096414.48089: dumping result to json 24134 1727096414.48091: done dumping result, returning 24134 1727096414.48095: done running TaskExecutor() for 
managed_node1/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it [0afff68d-5257-1673-d3fc-00000000005c] 24134 1727096414.48101: sending task result for task 0afff68d-5257-1673-d3fc-00000000005c 24134 1727096414.48427: done sending task result for task 0afff68d-5257-1673-d3fc-00000000005c 24134 1727096414.48429: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 24134 1727096414.48514: no more pending results, returning what we have 24134 1727096414.48517: results queue empty 24134 1727096414.48517: checking for any_errors_fatal 24134 1727096414.48519: done checking for any_errors_fatal 24134 1727096414.48519: checking for max_fail_percentage 24134 1727096414.48521: done checking for max_fail_percentage 24134 1727096414.48522: checking to see if all hosts have failed and the running result is not ok 24134 1727096414.48523: done checking to see if all hosts have failed 24134 1727096414.48523: getting the remaining hosts for this loop 24134 1727096414.48525: done getting the remaining hosts for this loop 24134 1727096414.48528: getting the next task for host managed_node1 24134 1727096414.48532: done getting next task for host managed_node1 24134 1727096414.48535: ^ task is: TASK: Verify nmcli connection ipv6.method 24134 1727096414.48537: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096414.48540: getting variables 24134 1727096414.48542: in VariableManager get_vars() 24134 1727096414.48583: Calling all_inventory to load vars for managed_node1 24134 1727096414.48627: Calling groups_inventory to load vars for managed_node1 24134 1727096414.48630: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096414.48639: Calling all_plugins_play to load vars for managed_node1 24134 1727096414.48642: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096414.48646: Calling groups_plugins_play to load vars for managed_node1 24134 1727096414.50063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096414.51527: done with get_vars() 24134 1727096414.51555: done getting variables 24134 1727096414.51651: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Verify nmcli connection ipv6.method] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:48 Monday 23 September 2024 09:00:14 -0400 (0:00:00.054) 0:00:18.731 ****** 24134 1727096414.51791: entering _queue_task() for managed_node1/shell 24134 1727096414.51802: Creating lock for shell 24134 1727096414.52312: worker is 1 (out of 1 available) 24134 1727096414.52326: exiting _queue_task() for managed_node1/shell 24134 1727096414.52343: done queuing things up, now waiting for results queue to drain 24134 1727096414.52345: waiting for pending results... 
24134 1727096414.52546: running TaskExecutor() for managed_node1/TASK: Verify nmcli connection ipv6.method 24134 1727096414.52607: in run() - task 0afff68d-5257-1673-d3fc-00000000005d 24134 1727096414.52617: variable 'ansible_search_path' from source: unknown 24134 1727096414.52655: calling self._execute() 24134 1727096414.52746: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096414.52749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096414.52757: variable 'omit' from source: magic vars 24134 1727096414.53107: variable 'ansible_distribution_major_version' from source: facts 24134 1727096414.53119: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096414.53270: variable '__network_connections_result' from source: set_fact 24134 1727096414.53275: Evaluated conditional (not __network_connections_result.failed): True 24134 1727096414.53297: variable 'omit' from source: magic vars 24134 1727096414.53317: variable 'omit' from source: magic vars 24134 1727096414.53396: variable 'interface' from source: set_fact 24134 1727096414.53432: variable 'omit' from source: magic vars 24134 1727096414.53465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096414.53539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096414.53543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096414.53548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096414.53551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096414.53660: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096414.53664: 
variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096414.53666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096414.53723: Set connection var ansible_shell_executable to /bin/sh 24134 1727096414.53726: Set connection var ansible_pipelining to False 24134 1727096414.53728: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096414.53730: Set connection var ansible_timeout to 10 24134 1727096414.53732: Set connection var ansible_connection to ssh 24134 1727096414.53734: Set connection var ansible_shell_type to sh 24134 1727096414.53983: variable 'ansible_shell_executable' from source: unknown 24134 1727096414.53987: variable 'ansible_connection' from source: unknown 24134 1727096414.53989: variable 'ansible_module_compression' from source: unknown 24134 1727096414.53991: variable 'ansible_shell_type' from source: unknown 24134 1727096414.53993: variable 'ansible_shell_executable' from source: unknown 24134 1727096414.53995: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096414.53997: variable 'ansible_pipelining' from source: unknown 24134 1727096414.53999: variable 'ansible_timeout' from source: unknown 24134 1727096414.54001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096414.54004: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096414.54007: variable 'omit' from source: magic vars 24134 1727096414.54009: starting attempt loop 24134 1727096414.54011: running the handler 24134 1727096414.54014: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096414.54016: _low_level_execute_command(): starting 24134 1727096414.54018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096414.54673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096414.54680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096414.54707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.54711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096414.54713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.54763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.54841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.56596: stdout chunk (state=3): >>>/root <<< 24134 1727096414.56687: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 24134 1727096414.56728: stderr chunk (state=3): >>><<< 24134 1727096414.56730: stdout chunk (state=3): >>><<< 24134 1727096414.56744: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096414.56758: _low_level_execute_command(): starting 24134 1727096414.56774: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130 `" && echo ansible-tmp-1727096414.567493-25070-130018188139130="` echo /root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130 `" ) && sleep 0' 24134 1727096414.57215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096414.57228: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096414.57230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24134 1727096414.57233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096414.57237: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.57288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096414.57291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096414.57292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.57359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.59376: stdout chunk (state=3): >>>ansible-tmp-1727096414.567493-25070-130018188139130=/root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130 <<< 24134 1727096414.59484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096414.59515: stderr chunk (state=3): >>><<< 24134 1727096414.59518: stdout chunk (state=3): >>><<< 24134 1727096414.59538: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096414.567493-25070-130018188139130=/root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096414.59576: variable 'ansible_module_compression' from source: unknown 24134 1727096414.59625: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096414.59655: variable 'ansible_facts' from source: unknown 24134 1727096414.59711: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/AnsiballZ_command.py 24134 1727096414.59873: Sending initial data 24134 1727096414.59876: Sent initial data (155 bytes) 24134 1727096414.60404: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096414.60407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.60410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096414.60414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096414.60417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.60475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096414.60501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096414.60504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.60563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.62203: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096414.62274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096414.62342: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpl58f1_1e /root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/AnsiballZ_command.py <<< 24134 1727096414.62349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/AnsiballZ_command.py" <<< 24134 1727096414.62411: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpl58f1_1e" to remote "/root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/AnsiballZ_command.py" <<< 24134 1727096414.63163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096414.63213: stderr chunk (state=3): >>><<< 24134 1727096414.63217: stdout chunk (state=3): >>><<< 24134 1727096414.63238: done transferring module to remote 24134 1727096414.63248: _low_level_execute_command(): starting 24134 1727096414.63252: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/ /root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/AnsiballZ_command.py && sleep 0' 24134 1727096414.63777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096414.63781: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.63832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.63874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096414.63881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096414.63884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.63948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.65820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096414.65860: stderr chunk (state=3): >>><<< 24134 1727096414.65863: stdout chunk (state=3): >>><<< 24134 1727096414.65890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096414.65893: _low_level_execute_command(): starting 24134 1727096414.65896: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/AnsiballZ_command.py && sleep 0' 24134 1727096414.66513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096414.66516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096414.66519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096414.66521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.66523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096414.66546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.66622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.83909: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-23 09:00:14.819207", "end": "2024-09-23 09:00:14.837422", "delta": "0:00:00.018215", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096414.85545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096414.85593: stderr chunk (state=3): >>><<< 24134 1727096414.85617: stdout chunk (state=3): >>><<< 24134 1727096414.85641: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-23 09:00:14.819207", "end": "2024-09-23 09:00:14.837422", "delta": "0:00:00.018215", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096414.85696: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096414.85703: _low_level_execute_command(): starting 24134 1727096414.85744: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096414.567493-25070-130018188139130/ > /dev/null 2>&1 && sleep 0' 24134 1727096414.86797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096414.86801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096414.86805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.86808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096414.86810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096414.86844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096414.86849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096414.86891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096414.87065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096414.88865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096414.88889: stderr chunk (state=3): >>><<< 24134 1727096414.88892: stdout chunk (state=3): >>><<< 24134 1727096414.88908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096414.88914: handler run complete 24134 1727096414.88939: Evaluated conditional (False): False 24134 1727096414.88950: attempt loop complete, returning result 24134 1727096414.88953: _execute() done 24134 1727096414.88956: dumping result to json 24134 1727096414.88958: done dumping result, returning 24134 1727096414.88976: done running TaskExecutor() for managed_node1/TASK: Verify nmcli connection ipv6.method [0afff68d-5257-1673-d3fc-00000000005d] 24134 1727096414.88981: sending task result for task 0afff68d-5257-1673-d3fc-00000000005d 24134 1727096414.89073: done sending task result for task 0afff68d-5257-1673-d3fc-00000000005d 24134 1727096414.89076: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "delta": "0:00:00.018215", "end": "2024-09-23 09:00:14.837422", "rc": 0, "start": "2024-09-23 09:00:14.819207" } STDOUT: ipv6.method: disabled STDERR: + nmcli connection show ethtest0 + grep ipv6.method 24134 1727096414.89144: no more pending results, returning what we have 24134 1727096414.89147: results queue empty 24134 1727096414.89148: checking for any_errors_fatal 24134 1727096414.89156: done checking for any_errors_fatal 24134 1727096414.89157: checking for max_fail_percentage 24134 1727096414.89159: done checking for max_fail_percentage 24134 1727096414.89160: checking to see if all hosts have failed and the running result is not ok 24134 1727096414.89160: done checking to see if all hosts have failed 24134 1727096414.89161: getting the remaining hosts for this loop 24134 1727096414.89163: done getting the remaining hosts for this loop 24134 
1727096414.89166: getting the next task for host managed_node1 24134 1727096414.89177: done getting next task for host managed_node1 24134 1727096414.89180: ^ task is: TASK: Assert that ipv6.method disabled is configured correctly 24134 1727096414.89182: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096414.89187: getting variables 24134 1727096414.89193: in VariableManager get_vars() 24134 1727096414.89237: Calling all_inventory to load vars for managed_node1 24134 1727096414.89240: Calling groups_inventory to load vars for managed_node1 24134 1727096414.89243: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096414.89253: Calling all_plugins_play to load vars for managed_node1 24134 1727096414.89255: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096414.89257: Calling groups_plugins_play to load vars for managed_node1 24134 1727096414.91928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096414.95826: done with get_vars() 24134 1727096414.95861: done getting variables 24134 1727096414.96450: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that ipv6.method disabled is configured correctly] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:57 Monday 23 September 2024 09:00:14 -0400 (0:00:00.447) 0:00:19.178 ****** 24134 1727096414.96481: 
entering _queue_task() for managed_node1/assert 24134 1727096414.97558: worker is 1 (out of 1 available) 24134 1727096414.97674: exiting _queue_task() for managed_node1/assert 24134 1727096414.97734: done queuing things up, now waiting for results queue to drain 24134 1727096414.97735: waiting for pending results... 24134 1727096414.98239: running TaskExecutor() for managed_node1/TASK: Assert that ipv6.method disabled is configured correctly 24134 1727096414.98289: in run() - task 0afff68d-5257-1673-d3fc-00000000005e 24134 1727096414.98305: variable 'ansible_search_path' from source: unknown 24134 1727096414.98340: calling self._execute() 24134 1727096414.98664: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096414.98670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096414.98673: variable 'omit' from source: magic vars 24134 1727096414.99430: variable 'ansible_distribution_major_version' from source: facts 24134 1727096414.99437: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096414.99649: variable '__network_connections_result' from source: set_fact 24134 1727096414.99707: Evaluated conditional (not __network_connections_result.failed): True 24134 1727096414.99714: variable 'omit' from source: magic vars 24134 1727096414.99734: variable 'omit' from source: magic vars 24134 1727096414.99771: variable 'omit' from source: magic vars 24134 1727096414.99961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096414.99998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096415.00019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096415.00152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 
1727096415.00165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096415.00198: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096415.00201: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096415.00204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096415.00427: Set connection var ansible_shell_executable to /bin/sh 24134 1727096415.00431: Set connection var ansible_pipelining to False 24134 1727096415.00438: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096415.00448: Set connection var ansible_timeout to 10 24134 1727096415.00451: Set connection var ansible_connection to ssh 24134 1727096415.00453: Set connection var ansible_shell_type to sh 24134 1727096415.00717: variable 'ansible_shell_executable' from source: unknown 24134 1727096415.00721: variable 'ansible_connection' from source: unknown 24134 1727096415.00723: variable 'ansible_module_compression' from source: unknown 24134 1727096415.00726: variable 'ansible_shell_type' from source: unknown 24134 1727096415.00727: variable 'ansible_shell_executable' from source: unknown 24134 1727096415.00731: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096415.00733: variable 'ansible_pipelining' from source: unknown 24134 1727096415.00735: variable 'ansible_timeout' from source: unknown 24134 1727096415.00737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096415.00861: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096415.00876: variable 'omit' from source: 
magic vars 24134 1727096415.00883: starting attempt loop 24134 1727096415.00886: running the handler 24134 1727096415.01275: variable 'ipv6_method' from source: set_fact 24134 1727096415.01280: Evaluated conditional ('disabled' in ipv6_method.stdout): True 24134 1727096415.01283: handler run complete 24134 1727096415.01285: attempt loop complete, returning result 24134 1727096415.01287: _execute() done 24134 1727096415.01289: dumping result to json 24134 1727096415.01291: done dumping result, returning 24134 1727096415.01293: done running TaskExecutor() for managed_node1/TASK: Assert that ipv6.method disabled is configured correctly [0afff68d-5257-1673-d3fc-00000000005e] 24134 1727096415.01383: sending task result for task 0afff68d-5257-1673-d3fc-00000000005e 24134 1727096415.01542: done sending task result for task 0afff68d-5257-1673-d3fc-00000000005e 24134 1727096415.01546: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 24134 1727096415.01598: no more pending results, returning what we have 24134 1727096415.01602: results queue empty 24134 1727096415.01603: checking for any_errors_fatal 24134 1727096415.01610: done checking for any_errors_fatal 24134 1727096415.01611: checking for max_fail_percentage 24134 1727096415.01612: done checking for max_fail_percentage 24134 1727096415.01613: checking to see if all hosts have failed and the running result is not ok 24134 1727096415.01614: done checking to see if all hosts have failed 24134 1727096415.01615: getting the remaining hosts for this loop 24134 1727096415.01616: done getting the remaining hosts for this loop 24134 1727096415.01620: getting the next task for host managed_node1 24134 1727096415.01626: done getting next task for host managed_node1 24134 1727096415.01628: ^ task is: TASK: Set the connection_failed flag 24134 1727096415.01630: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096415.01633: getting variables 24134 1727096415.01634: in VariableManager get_vars() 24134 1727096415.01674: Calling all_inventory to load vars for managed_node1 24134 1727096415.01676: Calling groups_inventory to load vars for managed_node1 24134 1727096415.01679: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096415.01689: Calling all_plugins_play to load vars for managed_node1 24134 1727096415.01691: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096415.01694: Calling groups_plugins_play to load vars for managed_node1 24134 1727096415.05708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096415.09143: done with get_vars() 24134 1727096415.09284: done getting variables 24134 1727096415.09345: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set the connection_failed flag] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:64 Monday 23 September 2024 09:00:15 -0400 (0:00:00.129) 0:00:19.308 ****** 24134 1727096415.09481: entering _queue_task() for managed_node1/set_fact 24134 1727096415.10291: worker is 1 (out of 1 available) 24134 1727096415.10304: exiting _queue_task() for managed_node1/set_fact 24134 1727096415.10316: done queuing things up, now waiting for results queue to drain 24134 1727096415.10317: waiting for pending results... 
24134 1727096415.10890: running TaskExecutor() for managed_node1/TASK: Set the connection_failed flag 24134 1727096415.11039: in run() - task 0afff68d-5257-1673-d3fc-00000000005f 24134 1727096415.11053: variable 'ansible_search_path' from source: unknown 24134 1727096415.11096: calling self._execute() 24134 1727096415.11355: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096415.11361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096415.11374: variable 'omit' from source: magic vars 24134 1727096415.12188: variable 'ansible_distribution_major_version' from source: facts 24134 1727096415.12191: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096415.12592: variable '__network_connections_result' from source: set_fact 24134 1727096415.12612: Evaluated conditional (__network_connections_result.failed): False 24134 1727096415.12622: when evaluation is False, skipping this task 24134 1727096415.12626: _execute() done 24134 1727096415.12628: dumping result to json 24134 1727096415.12631: done dumping result, returning 24134 1727096415.12633: done running TaskExecutor() for managed_node1/TASK: Set the connection_failed flag [0afff68d-5257-1673-d3fc-00000000005f] 24134 1727096415.12635: sending task result for task 0afff68d-5257-1673-d3fc-00000000005f 24134 1727096415.12906: done sending task result for task 0afff68d-5257-1673-d3fc-00000000005f 24134 1727096415.12910: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 24134 1727096415.12966: no more pending results, returning what we have 24134 1727096415.12972: results queue empty 24134 1727096415.12974: checking for any_errors_fatal 24134 1727096415.12984: done checking for any_errors_fatal 24134 1727096415.12984: checking for max_fail_percentage 24134 1727096415.12987: done checking for 
max_fail_percentage 24134 1727096415.12987: checking to see if all hosts have failed and the running result is not ok 24134 1727096415.12988: done checking to see if all hosts have failed 24134 1727096415.12989: getting the remaining hosts for this loop 24134 1727096415.12991: done getting the remaining hosts for this loop 24134 1727096415.12995: getting the next task for host managed_node1 24134 1727096415.13002: done getting next task for host managed_node1 24134 1727096415.13005: ^ task is: TASK: meta (flush_handlers) 24134 1727096415.13008: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096415.13013: getting variables 24134 1727096415.13015: in VariableManager get_vars() 24134 1727096415.13057: Calling all_inventory to load vars for managed_node1 24134 1727096415.13060: Calling groups_inventory to load vars for managed_node1 24134 1727096415.13063: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096415.13315: Calling all_plugins_play to load vars for managed_node1 24134 1727096415.13320: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096415.13323: Calling groups_plugins_play to load vars for managed_node1 24134 1727096415.25154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096415.28375: done with get_vars() 24134 1727096415.28402: done getting variables 24134 1727096415.28460: in VariableManager get_vars() 24134 1727096415.28603: Calling all_inventory to load vars for managed_node1 24134 1727096415.28606: Calling groups_inventory to load vars for managed_node1 24134 1727096415.28608: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096415.28613: Calling 
all_plugins_play to load vars for managed_node1 24134 1727096415.28615: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096415.28618: Calling groups_plugins_play to load vars for managed_node1 24134 1727096415.30319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096415.31936: done with get_vars() 24134 1727096415.31965: done queuing things up, now waiting for results queue to drain 24134 1727096415.31970: results queue empty 24134 1727096415.31971: checking for any_errors_fatal 24134 1727096415.31973: done checking for any_errors_fatal 24134 1727096415.31974: checking for max_fail_percentage 24134 1727096415.31975: done checking for max_fail_percentage 24134 1727096415.31976: checking to see if all hosts have failed and the running result is not ok 24134 1727096415.31976: done checking to see if all hosts have failed 24134 1727096415.31977: getting the remaining hosts for this loop 24134 1727096415.31978: done getting the remaining hosts for this loop 24134 1727096415.31981: getting the next task for host managed_node1 24134 1727096415.31984: done getting next task for host managed_node1 24134 1727096415.31985: ^ task is: TASK: meta (flush_handlers) 24134 1727096415.31986: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096415.31989: getting variables 24134 1727096415.31990: in VariableManager get_vars() 24134 1727096415.31999: Calling all_inventory to load vars for managed_node1 24134 1727096415.32001: Calling groups_inventory to load vars for managed_node1 24134 1727096415.32003: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096415.32007: Calling all_plugins_play to load vars for managed_node1 24134 1727096415.32009: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096415.32012: Calling groups_plugins_play to load vars for managed_node1 24134 1727096415.34934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096415.37594: done with get_vars() 24134 1727096415.37613: done getting variables 24134 1727096415.37659: in VariableManager get_vars() 24134 1727096415.37673: Calling all_inventory to load vars for managed_node1 24134 1727096415.37675: Calling groups_inventory to load vars for managed_node1 24134 1727096415.37676: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096415.37679: Calling all_plugins_play to load vars for managed_node1 24134 1727096415.37681: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096415.37683: Calling groups_plugins_play to load vars for managed_node1 24134 1727096415.38347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096415.40710: done with get_vars() 24134 1727096415.40735: done queuing things up, now waiting for results queue to drain 24134 1727096415.40737: results queue empty 24134 1727096415.40738: checking for any_errors_fatal 24134 1727096415.40739: done checking for any_errors_fatal 24134 1727096415.40740: checking for max_fail_percentage 24134 1727096415.40741: done checking for max_fail_percentage 24134 1727096415.40742: checking to see if all hosts have failed and the running result is not 
ok 24134 1727096415.40742: done checking to see if all hosts have failed 24134 1727096415.40743: getting the remaining hosts for this loop 24134 1727096415.40744: done getting the remaining hosts for this loop 24134 1727096415.40746: getting the next task for host managed_node1 24134 1727096415.40753: done getting next task for host managed_node1 24134 1727096415.40754: ^ task is: None 24134 1727096415.40755: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096415.40756: done queuing things up, now waiting for results queue to drain 24134 1727096415.40757: results queue empty 24134 1727096415.40758: checking for any_errors_fatal 24134 1727096415.40758: done checking for any_errors_fatal 24134 1727096415.40759: checking for max_fail_percentage 24134 1727096415.40760: done checking for max_fail_percentage 24134 1727096415.40760: checking to see if all hosts have failed and the running result is not ok 24134 1727096415.40761: done checking to see if all hosts have failed 24134 1727096415.40762: getting the next task for host managed_node1 24134 1727096415.40764: done getting next task for host managed_node1 24134 1727096415.40765: ^ task is: None 24134 1727096415.40766: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096415.40958: in VariableManager get_vars() 24134 1727096415.40991: done with get_vars() 24134 1727096415.40997: in VariableManager get_vars() 24134 1727096415.41010: done with get_vars() 24134 1727096415.41015: variable 'omit' from source: magic vars 24134 1727096415.41150: variable 'profile' from source: play vars 24134 1727096415.41260: in VariableManager get_vars() 24134 1727096415.41273: done with get_vars() 24134 1727096415.41287: variable 'omit' from source: magic vars 24134 1727096415.41329: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 24134 1727096415.41743: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24134 1727096415.41760: getting the remaining hosts for this loop 24134 1727096415.41761: done getting the remaining hosts for this loop 24134 1727096415.41763: getting the next task for host managed_node1 24134 1727096415.41813: done getting next task for host managed_node1 24134 1727096415.41815: ^ task is: TASK: Gathering Facts 24134 1727096415.41816: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096415.41818: getting variables 24134 1727096415.41818: in VariableManager get_vars() 24134 1727096415.41826: Calling all_inventory to load vars for managed_node1 24134 1727096415.41827: Calling groups_inventory to load vars for managed_node1 24134 1727096415.41828: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096415.41833: Calling all_plugins_play to load vars for managed_node1 24134 1727096415.41834: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096415.41836: Calling groups_plugins_play to load vars for managed_node1 24134 1727096415.42487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096415.44089: done with get_vars() 24134 1727096415.44115: done getting variables 24134 1727096415.44158: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Monday 23 September 2024 09:00:15 -0400 (0:00:00.347) 0:00:19.655 ****** 24134 1727096415.44190: entering _queue_task() for managed_node1/gather_facts 24134 1727096415.44599: worker is 1 (out of 1 available) 24134 1727096415.44613: exiting _queue_task() for managed_node1/gather_facts 24134 1727096415.44625: done queuing things up, now waiting for results queue to drain 24134 1727096415.44627: waiting for pending results... 
24134 1727096415.44953: running TaskExecutor() for managed_node1/TASK: Gathering Facts 24134 1727096415.44957: in run() - task 0afff68d-5257-1673-d3fc-000000000454 24134 1727096415.44974: variable 'ansible_search_path' from source: unknown 24134 1727096415.45013: calling self._execute() 24134 1727096415.45117: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096415.45129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096415.45145: variable 'omit' from source: magic vars 24134 1727096415.45580: variable 'ansible_distribution_major_version' from source: facts 24134 1727096415.45606: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096415.45617: variable 'omit' from source: magic vars 24134 1727096415.45645: variable 'omit' from source: magic vars 24134 1727096415.45685: variable 'omit' from source: magic vars 24134 1727096415.45734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096415.45775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096415.45806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096415.45840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096415.45858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096415.45892: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096415.45917: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096415.45922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096415.46029: Set connection var ansible_shell_executable to /bin/sh 24134 1727096415.46133: Set 
connection var ansible_pipelining to False 24134 1727096415.46136: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096415.46139: Set connection var ansible_timeout to 10 24134 1727096415.46142: Set connection var ansible_connection to ssh 24134 1727096415.46144: Set connection var ansible_shell_type to sh 24134 1727096415.46146: variable 'ansible_shell_executable' from source: unknown 24134 1727096415.46148: variable 'ansible_connection' from source: unknown 24134 1727096415.46149: variable 'ansible_module_compression' from source: unknown 24134 1727096415.46152: variable 'ansible_shell_type' from source: unknown 24134 1727096415.46154: variable 'ansible_shell_executable' from source: unknown 24134 1727096415.46155: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096415.46157: variable 'ansible_pipelining' from source: unknown 24134 1727096415.46159: variable 'ansible_timeout' from source: unknown 24134 1727096415.46161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096415.46322: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096415.46340: variable 'omit' from source: magic vars 24134 1727096415.46355: starting attempt loop 24134 1727096415.46362: running the handler 24134 1727096415.46388: variable 'ansible_facts' from source: unknown 24134 1727096415.46457: _low_level_execute_command(): starting 24134 1727096415.46460: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096415.47235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096415.47293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096415.47344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096415.47414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096415.49303: stdout chunk (state=3): >>>/root <<< 24134 1727096415.49335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096415.49338: stdout chunk (state=3): >>><<< 24134 1727096415.49343: stderr chunk (state=3): >>><<< 24134 1727096415.49499: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096415.49505: _low_level_execute_command(): starting 24134 1727096415.49508: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497 `" && echo ansible-tmp-1727096415.4938424-25112-51039790780497="` echo /root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497 `" ) && sleep 0' 24134 1727096415.50122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096415.50137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096415.50156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096415.50386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096415.50390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096415.50452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096415.50472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096415.50495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096415.50624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096415.52596: stdout chunk (state=3): >>>ansible-tmp-1727096415.4938424-25112-51039790780497=/root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497 <<< 24134 1727096415.52723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096415.52743: stderr chunk (state=3): >>><<< 24134 1727096415.52766: stdout chunk (state=3): >>><<< 24134 1727096415.52811: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096415.4938424-25112-51039790780497=/root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096415.52978: variable 'ansible_module_compression' from source: unknown 24134 1727096415.52986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24134 1727096415.53032: variable 'ansible_facts' from source: unknown 24134 1727096415.53507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/AnsiballZ_setup.py 24134 1727096415.53788: Sending initial data 24134 1727096415.53909: Sent initial data (153 bytes) 24134 1727096415.54748: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096415.54785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096415.54870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096415.54920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096415.54988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096415.55030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096415.55173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096415.56837: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096415.56897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096415.56964: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpgjtjc_8q /root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/AnsiballZ_setup.py <<< 24134 1727096415.56967: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/AnsiballZ_setup.py" <<< 24134 1727096415.57020: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpgjtjc_8q" to remote "/root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/AnsiballZ_setup.py" <<< 24134 1727096415.58184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096415.58254: stderr chunk (state=3): >>><<< 24134 1727096415.58259: stdout chunk (state=3): >>><<< 24134 1727096415.58274: done transferring module to remote 24134 1727096415.58288: _low_level_execute_command(): starting 24134 1727096415.58349: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/ /root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/AnsiballZ_setup.py && sleep 0' 24134 1727096415.58975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096415.58978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096415.58980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 24134 1727096415.58983: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096415.58992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096415.59050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096415.59072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096415.59150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096415.61031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096415.61075: stderr chunk (state=3): >>><<< 24134 1727096415.61084: stdout chunk (state=3): >>><<< 24134 1727096415.61125: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096415.61129: _low_level_execute_command(): starting 24134 1727096415.61131: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/AnsiballZ_setup.py && sleep 0' 24134 1727096415.61871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096415.61875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096415.61917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096415.62008: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24134 1727096416.27933: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "15", "epoch": "1727096415", "epoch_int": "1727096415", "date": "2024-09-23", "time": "09:00:15", "iso8601_micro": "2024-09-23T13:00:15.891928Z", "iso8601": "2024-09-23T13:00:15Z", "iso8601_basic": "20240923T090015891928", "iso8601_basic_short": "20240923T090015", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": 
true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e12<<< 24134 1727096416.27970: stdout chunk (state=3): >>>0", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2975, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 556, "free": 2975}, "nocache": {"free": 3313, "used": 218}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", 
"ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 569, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795123200, "block_size": 4096, "block_total": 65519099, "block_available": 63914825, "block_used": 1604274, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["peerethtest0", "lo", "eth0", "ethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address"<<< 24134 1727096416.27987: stdout chunk (state=3): >>>: "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on 
[fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "42:d5:59:35:93:c8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40d5:59ff:fe35:93c8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", 
"tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on"<<< 24134 1727096416.27997: stdout chunk (state=3): >>>, "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": 
{"device": "peerethtest0", "macaddress": "72:05:50:f0:ff:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7005:50ff:fef0:ffb8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::40d5:59ff:fe35:93c8", "fe80::10ff:acff:fe3f:90f5", "fe80::7005:50ff:fef0:ffb8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::40d5:59ff:fe35:93c8", "fe80::7005:50ff:fef0:ffb8"]}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.783203125, "5m": 0.4912109375, "15m": 0.24560546875}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24134 1727096416.30136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096416.30140: stdout chunk (state=3): >>><<< 24134 1727096416.30142: stderr chunk (state=3): >>><<< 24134 1727096416.30201: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "15", "epoch": "1727096415", "epoch_int": "1727096415", "date": "2024-09-23", "time": "09:00:15", "iso8601_micro": "2024-09-23T13:00:15.891928Z", "iso8601": "2024-09-23T13:00:15Z", "iso8601_basic": "20240923T090015891928", "iso8601_basic_short": "20240923T090015", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2975, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 556, "free": 2975}, "nocache": {"free": 3313, "used": 218}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": 
"ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 569, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795123200, "block_size": 4096, "block_total": 65519099, "block_available": 63914825, "block_used": 1604274, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": 
"targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["peerethtest0", "lo", "eth0", "ethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "42:d5:59:35:93:c8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40d5:59ff:fe35:93c8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": 
"on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "72:05:50:f0:ff:b8", "mtu": 1500, "active": true, "type": "ether", 
"speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7005:50ff:fef0:ffb8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::40d5:59ff:fe35:93c8", "fe80::10ff:acff:fe3f:90f5", "fe80::7005:50ff:fef0:ffb8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::40d5:59ff:fe35:93c8", "fe80::7005:50ff:fef0:ffb8"]}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.783203125, "5m": 0.4912109375, "15m": 0.24560546875}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096416.30707: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096416.30711: _low_level_execute_command(): starting 24134 1727096416.30713: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096415.4938424-25112-51039790780497/ > /dev/null 2>&1 && sleep 0' 24134 1727096416.31319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096416.31385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096416.31392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096416.31423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096416.31523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096416.33416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096416.33420: stdout chunk (state=3): >>><<< 24134 1727096416.33425: stderr chunk (state=3): >>><<< 24134 1727096416.33440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096416.33449: handler run complete 24134 1727096416.33547: variable 'ansible_facts' from source: unknown 24134 1727096416.33623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.33839: variable 'ansible_facts' from source: unknown 24134 1727096416.33908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.34174: attempt loop complete, returning result 24134 1727096416.34177: _execute() done 24134 1727096416.34180: dumping result to json 24134 1727096416.34182: done dumping result, returning 24134 1727096416.34184: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-1673-d3fc-000000000454] 24134 1727096416.34186: sending task result for task 0afff68d-5257-1673-d3fc-000000000454 ok: [managed_node1] 24134 1727096416.35123: no more pending results, returning what we have 24134 1727096416.35126: results queue empty 24134 1727096416.35127: checking for any_errors_fatal 24134 1727096416.35128: done checking for any_errors_fatal 24134 1727096416.35129: checking for max_fail_percentage 24134 1727096416.35131: done checking for max_fail_percentage 24134 1727096416.35131: checking to see if all hosts have failed and the running result is not ok 24134 1727096416.35132: done checking to see if all hosts have failed 24134 1727096416.35133: getting the remaining hosts for this loop 24134 1727096416.35134: done getting the 
remaining hosts for this loop 24134 1727096416.35137: getting the next task for host managed_node1 24134 1727096416.35141: done getting next task for host managed_node1 24134 1727096416.35143: ^ task is: TASK: meta (flush_handlers) 24134 1727096416.35145: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096416.35149: getting variables 24134 1727096416.35150: in VariableManager get_vars() 24134 1727096416.35188: Calling all_inventory to load vars for managed_node1 24134 1727096416.35191: Calling groups_inventory to load vars for managed_node1 24134 1727096416.35195: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096416.35208: Calling all_plugins_play to load vars for managed_node1 24134 1727096416.35211: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096416.35214: Calling groups_plugins_play to load vars for managed_node1 24134 1727096416.35234: done sending task result for task 0afff68d-5257-1673-d3fc-000000000454 24134 1727096416.35238: WORKER PROCESS EXITING 24134 1727096416.36084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.36985: done with get_vars() 24134 1727096416.37004: done getting variables 24134 1727096416.37057: in VariableManager get_vars() 24134 1727096416.37066: Calling all_inventory to load vars for managed_node1 24134 1727096416.37071: Calling groups_inventory to load vars for managed_node1 24134 1727096416.37073: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096416.37076: Calling all_plugins_play to load vars for managed_node1 24134 1727096416.37078: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096416.37080: 
Calling groups_plugins_play to load vars for managed_node1 24134 1727096416.37751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.38743: done with get_vars() 24134 1727096416.38762: done queuing things up, now waiting for results queue to drain 24134 1727096416.38763: results queue empty 24134 1727096416.38764: checking for any_errors_fatal 24134 1727096416.38766: done checking for any_errors_fatal 24134 1727096416.38771: checking for max_fail_percentage 24134 1727096416.38772: done checking for max_fail_percentage 24134 1727096416.38772: checking to see if all hosts have failed and the running result is not ok 24134 1727096416.38773: done checking to see if all hosts have failed 24134 1727096416.38779: getting the remaining hosts for this loop 24134 1727096416.38780: done getting the remaining hosts for this loop 24134 1727096416.38782: getting the next task for host managed_node1 24134 1727096416.38786: done getting next task for host managed_node1 24134 1727096416.38788: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24134 1727096416.38789: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096416.38797: getting variables 24134 1727096416.38797: in VariableManager get_vars() 24134 1727096416.38807: Calling all_inventory to load vars for managed_node1 24134 1727096416.38809: Calling groups_inventory to load vars for managed_node1 24134 1727096416.38810: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096416.38813: Calling all_plugins_play to load vars for managed_node1 24134 1727096416.38815: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096416.38816: Calling groups_plugins_play to load vars for managed_node1 24134 1727096416.39481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.40356: done with get_vars() 24134 1727096416.40377: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:00:16 -0400 (0:00:00.962) 0:00:20.617 ****** 24134 1727096416.40436: entering _queue_task() for managed_node1/include_tasks 24134 1727096416.40715: worker is 1 (out of 1 available) 24134 1727096416.40729: exiting _queue_task() for managed_node1/include_tasks 24134 1727096416.40742: done queuing things up, now waiting for results queue to drain 24134 1727096416.40743: waiting for pending results... 
24134 1727096416.40918: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24134 1727096416.40993: in run() - task 0afff68d-5257-1673-d3fc-000000000067 24134 1727096416.41007: variable 'ansible_search_path' from source: unknown 24134 1727096416.41010: variable 'ansible_search_path' from source: unknown 24134 1727096416.41038: calling self._execute() 24134 1727096416.41113: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096416.41117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096416.41125: variable 'omit' from source: magic vars 24134 1727096416.41412: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.41419: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.41496: variable 'connection_failed' from source: set_fact 24134 1727096416.41500: Evaluated conditional (not connection_failed): True 24134 1727096416.41577: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.41581: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.41650: variable 'connection_failed' from source: set_fact 24134 1727096416.41653: Evaluated conditional (not connection_failed): True 24134 1727096416.41659: _execute() done 24134 1727096416.41662: dumping result to json 24134 1727096416.41665: done dumping result, returning 24134 1727096416.41676: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-1673-d3fc-000000000067] 24134 1727096416.41681: sending task result for task 0afff68d-5257-1673-d3fc-000000000067 24134 1727096416.41762: done sending task result for task 0afff68d-5257-1673-d3fc-000000000067 24134 1727096416.41765: WORKER PROCESS EXITING 24134 1727096416.41804: no more pending results, returning what we have 24134 
1727096416.41809: in VariableManager get_vars() 24134 1727096416.41849: Calling all_inventory to load vars for managed_node1 24134 1727096416.41853: Calling groups_inventory to load vars for managed_node1 24134 1727096416.41855: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096416.41866: Calling all_plugins_play to load vars for managed_node1 24134 1727096416.41873: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096416.41876: Calling groups_plugins_play to load vars for managed_node1 24134 1727096416.42810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.43724: done with get_vars() 24134 1727096416.43739: variable 'ansible_search_path' from source: unknown 24134 1727096416.43740: variable 'ansible_search_path' from source: unknown 24134 1727096416.43761: we have included files to process 24134 1727096416.43762: generating all_blocks data 24134 1727096416.43763: done generating all_blocks data 24134 1727096416.43763: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24134 1727096416.43764: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24134 1727096416.43765: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24134 1727096416.44151: done processing included file 24134 1727096416.44154: iterating over new_blocks loaded from include file 24134 1727096416.44155: in VariableManager get_vars() 24134 1727096416.44171: done with get_vars() 24134 1727096416.44172: filtering new block on tags 24134 1727096416.44183: done filtering new block on tags 24134 1727096416.44185: in VariableManager get_vars() 24134 1727096416.44195: done with get_vars() 24134 1727096416.44196: filtering new block on tags 24134 1727096416.44206: done 
filtering new block on tags 24134 1727096416.44208: in VariableManager get_vars() 24134 1727096416.44218: done with get_vars() 24134 1727096416.44219: filtering new block on tags 24134 1727096416.44228: done filtering new block on tags 24134 1727096416.44229: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 24134 1727096416.44232: extending task lists for all hosts with included blocks 24134 1727096416.44439: done extending task lists 24134 1727096416.44440: done processing included files 24134 1727096416.44440: results queue empty 24134 1727096416.44441: checking for any_errors_fatal 24134 1727096416.44442: done checking for any_errors_fatal 24134 1727096416.44442: checking for max_fail_percentage 24134 1727096416.44443: done checking for max_fail_percentage 24134 1727096416.44443: checking to see if all hosts have failed and the running result is not ok 24134 1727096416.44444: done checking to see if all hosts have failed 24134 1727096416.44444: getting the remaining hosts for this loop 24134 1727096416.44446: done getting the remaining hosts for this loop 24134 1727096416.44447: getting the next task for host managed_node1 24134 1727096416.44450: done getting next task for host managed_node1 24134 1727096416.44451: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24134 1727096416.44453: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 24134 1727096416.44459: getting variables 24134 1727096416.44460: in VariableManager get_vars() 24134 1727096416.44472: Calling all_inventory to load vars for managed_node1 24134 1727096416.44474: Calling groups_inventory to load vars for managed_node1 24134 1727096416.44475: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096416.44480: Calling all_plugins_play to load vars for managed_node1 24134 1727096416.44482: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096416.44485: Calling groups_plugins_play to load vars for managed_node1 24134 1727096416.45227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.46655: done with get_vars() 24134 1727096416.46677: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:00:16 -0400 (0:00:00.062) 0:00:20.680 ****** 24134 1727096416.46736: entering _queue_task() for managed_node1/setup 24134 1727096416.47000: worker is 1 (out of 1 available) 24134 1727096416.47013: exiting _queue_task() for managed_node1/setup 24134 1727096416.47025: done queuing things up, now waiting for results queue to drain 24134 1727096416.47026: waiting for pending results... 
24134 1727096416.47204: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24134 1727096416.47295: in run() - task 0afff68d-5257-1673-d3fc-000000000495 24134 1727096416.47307: variable 'ansible_search_path' from source: unknown 24134 1727096416.47311: variable 'ansible_search_path' from source: unknown 24134 1727096416.47339: calling self._execute() 24134 1727096416.47411: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096416.47415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096416.47424: variable 'omit' from source: magic vars 24134 1727096416.47700: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.47708: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.47784: variable 'connection_failed' from source: set_fact 24134 1727096416.47788: Evaluated conditional (not connection_failed): True 24134 1727096416.47863: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.47869: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.47935: variable 'connection_failed' from source: set_fact 24134 1727096416.47939: Evaluated conditional (not connection_failed): True 24134 1727096416.48174: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.48178: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.48180: variable 'connection_failed' from source: set_fact 24134 1727096416.48183: Evaluated conditional (not connection_failed): True 24134 1727096416.48348: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.48359: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.48457: variable 'connection_failed' from source: set_fact 24134 1727096416.48472: Evaluated conditional 
(not connection_failed): True 24134 1727096416.48659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096416.50816: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096416.50886: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096416.50926: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096416.50962: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096416.50998: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096416.51080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096416.51116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096416.51143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096416.51197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096416.51217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096416.51273: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096416.51302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096416.51330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096416.51474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096416.51478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096416.51550: variable '__network_required_facts' from source: role '' defaults 24134 1727096416.51563: variable 'ansible_facts' from source: unknown 24134 1727096416.52283: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24134 1727096416.52293: when evaluation is False, skipping this task 24134 1727096416.52301: _execute() done 24134 1727096416.52308: dumping result to json 24134 1727096416.52315: done dumping result, returning 24134 1727096416.52325: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-1673-d3fc-000000000495] 24134 1727096416.52332: sending task result for task 0afff68d-5257-1673-d3fc-000000000495 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for 
this result", "changed": false } 24134 1727096416.52461: no more pending results, returning what we have 24134 1727096416.52465: results queue empty 24134 1727096416.52466: checking for any_errors_fatal 24134 1727096416.52470: done checking for any_errors_fatal 24134 1727096416.52470: checking for max_fail_percentage 24134 1727096416.52472: done checking for max_fail_percentage 24134 1727096416.52473: checking to see if all hosts have failed and the running result is not ok 24134 1727096416.52473: done checking to see if all hosts have failed 24134 1727096416.52474: getting the remaining hosts for this loop 24134 1727096416.52476: done getting the remaining hosts for this loop 24134 1727096416.52479: getting the next task for host managed_node1 24134 1727096416.52487: done getting next task for host managed_node1 24134 1727096416.52491: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24134 1727096416.52572: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096416.52591: getting variables 24134 1727096416.52593: in VariableManager get_vars() 24134 1727096416.52637: Calling all_inventory to load vars for managed_node1 24134 1727096416.52640: Calling groups_inventory to load vars for managed_node1 24134 1727096416.52642: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096416.52658: Calling all_plugins_play to load vars for managed_node1 24134 1727096416.52661: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096416.52664: Calling groups_plugins_play to load vars for managed_node1 24134 1727096416.53237: done sending task result for task 0afff68d-5257-1673-d3fc-000000000495 24134 1727096416.53241: WORKER PROCESS EXITING 24134 1727096416.54414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.55728: done with get_vars() 24134 1727096416.55744: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:00:16 -0400 (0:00:00.090) 0:00:20.771 ****** 24134 1727096416.55818: entering _queue_task() for managed_node1/stat 24134 1727096416.56065: worker is 1 (out of 1 available) 24134 1727096416.56083: exiting _queue_task() for managed_node1/stat 24134 1727096416.56095: done queuing things up, now waiting for results queue to drain 24134 1727096416.56097: waiting for pending results... 
24134 1727096416.56265: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 24134 1727096416.56363: in run() - task 0afff68d-5257-1673-d3fc-000000000497 24134 1727096416.56382: variable 'ansible_search_path' from source: unknown 24134 1727096416.56385: variable 'ansible_search_path' from source: unknown 24134 1727096416.56411: calling self._execute() 24134 1727096416.56483: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096416.56487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096416.56497: variable 'omit' from source: magic vars 24134 1727096416.56771: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.56777: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.56851: variable 'connection_failed' from source: set_fact 24134 1727096416.56854: Evaluated conditional (not connection_failed): True 24134 1727096416.56931: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.56935: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.57005: variable 'connection_failed' from source: set_fact 24134 1727096416.57008: Evaluated conditional (not connection_failed): True 24134 1727096416.57274: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.57278: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.57280: variable 'connection_failed' from source: set_fact 24134 1727096416.57283: Evaluated conditional (not connection_failed): True 24134 1727096416.57327: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.57341: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.57441: variable 'connection_failed' from source: set_fact 24134 1727096416.57456: Evaluated conditional (not 
connection_failed): True 24134 1727096416.57617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096416.57890: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096416.57940: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096416.57981: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096416.58024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096416.58165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096416.58187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096416.58213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096416.58236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096416.58303: variable '__network_is_ostree' from source: set_fact 24134 1727096416.58307: Evaluated conditional (not __network_is_ostree is defined): False 24134 1727096416.58311: when evaluation is False, skipping this task 24134 1727096416.58320: _execute() done 24134 1727096416.58325: dumping result to json 24134 1727096416.58327: done dumping result, returning 24134 1727096416.58335: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-1673-d3fc-000000000497] 24134 1727096416.58340: sending task result for task 0afff68d-5257-1673-d3fc-000000000497 24134 1727096416.58418: done sending task result for task 0afff68d-5257-1673-d3fc-000000000497 24134 1727096416.58423: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24134 1727096416.58486: no more pending results, returning what we have 24134 1727096416.58489: results queue empty 24134 1727096416.58490: checking for any_errors_fatal 24134 1727096416.58498: done checking for any_errors_fatal 24134 1727096416.58499: checking for max_fail_percentage 24134 1727096416.58500: done checking for max_fail_percentage 24134 1727096416.58501: checking to see if all hosts have failed and the running result is not ok 24134 1727096416.58502: done checking to see if all hosts have failed 24134 1727096416.58502: getting the remaining hosts for this loop 24134 1727096416.58503: done getting the remaining hosts for this loop 24134 1727096416.58507: getting the next task for host managed_node1 24134 1727096416.58513: done getting next task for host managed_node1 24134 1727096416.58517: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24134 1727096416.58519: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096416.58534: getting variables 24134 1727096416.58536: in VariableManager get_vars() 24134 1727096416.58574: Calling all_inventory to load vars for managed_node1 24134 1727096416.58576: Calling groups_inventory to load vars for managed_node1 24134 1727096416.58578: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096416.58586: Calling all_plugins_play to load vars for managed_node1 24134 1727096416.58589: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096416.58591: Calling groups_plugins_play to load vars for managed_node1 24134 1727096416.59389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.60585: done with get_vars() 24134 1727096416.60606: done getting variables 24134 1727096416.60666: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:00:16 -0400 (0:00:00.048) 0:00:20.820 ****** 24134 1727096416.60703: entering _queue_task() for managed_node1/set_fact 24134 1727096416.61011: worker is 1 (out of 1 available) 24134 1727096416.61022: exiting _queue_task() for managed_node1/set_fact 24134 1727096416.61035: done queuing things up, now waiting for results queue to drain 24134 1727096416.61036: waiting for pending results... 
24134 1727096416.61388: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24134 1727096416.61454: in run() - task 0afff68d-5257-1673-d3fc-000000000498 24134 1727096416.61483: variable 'ansible_search_path' from source: unknown 24134 1727096416.61493: variable 'ansible_search_path' from source: unknown 24134 1727096416.61528: calling self._execute() 24134 1727096416.61618: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096416.61623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096416.61625: variable 'omit' from source: magic vars 24134 1727096416.61889: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.61898: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.61982: variable 'connection_failed' from source: set_fact 24134 1727096416.61985: Evaluated conditional (not connection_failed): True 24134 1727096416.62057: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.62061: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.62129: variable 'connection_failed' from source: set_fact 24134 1727096416.62133: Evaluated conditional (not connection_failed): True 24134 1727096416.62210: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.62213: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.62282: variable 'connection_failed' from source: set_fact 24134 1727096416.62285: Evaluated conditional (not connection_failed): True 24134 1727096416.62354: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.62358: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.62425: variable 'connection_failed' from source: set_fact 24134 1727096416.62430: Evaluated conditional (not 
connection_failed): True 24134 1727096416.62538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096416.62736: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096416.62770: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096416.62804: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096416.62824: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096416.62923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096416.62939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096416.62957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096416.62979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096416.63039: variable '__network_is_ostree' from source: set_fact 24134 1727096416.63046: Evaluated conditional (not __network_is_ostree is defined): False 24134 1727096416.63049: when evaluation is False, skipping this task 24134 1727096416.63102: _execute() done 24134 1727096416.63106: dumping result to json 24134 1727096416.63109: done dumping result, returning 24134 1727096416.63117: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-1673-d3fc-000000000498] 24134 1727096416.63122: sending task result for task 0afff68d-5257-1673-d3fc-000000000498 24134 1727096416.63181: done sending task result for task 0afff68d-5257-1673-d3fc-000000000498 24134 1727096416.63183: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24134 1727096416.63231: no more pending results, returning what we have 24134 1727096416.63234: results queue empty 24134 1727096416.63235: checking for any_errors_fatal 24134 1727096416.63241: done checking for any_errors_fatal 24134 1727096416.63241: checking for max_fail_percentage 24134 1727096416.63243: done checking for max_fail_percentage 24134 1727096416.63243: checking to see if all hosts have failed and the running result is not ok 24134 1727096416.63244: done checking to see if all hosts have failed 24134 1727096416.63245: getting the remaining hosts for this loop 24134 1727096416.63246: done getting the remaining hosts for this loop 24134 1727096416.63250: getting the next task for host managed_node1 24134 1727096416.63258: done getting next task for host managed_node1 24134 1727096416.63262: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24134 1727096416.63264: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096416.63279: getting variables 24134 1727096416.63280: in VariableManager get_vars() 24134 1727096416.63318: Calling all_inventory to load vars for managed_node1 24134 1727096416.63321: Calling groups_inventory to load vars for managed_node1 24134 1727096416.63323: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096416.63331: Calling all_plugins_play to load vars for managed_node1 24134 1727096416.63333: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096416.63336: Calling groups_plugins_play to load vars for managed_node1 24134 1727096416.64685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096416.65805: done with get_vars() 24134 1727096416.65821: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:00:16 -0400 (0:00:00.051) 0:00:20.872 ****** 24134 1727096416.65896: entering _queue_task() for managed_node1/service_facts 24134 1727096416.66135: worker is 1 (out of 1 available) 24134 1727096416.66148: exiting _queue_task() for managed_node1/service_facts 24134 1727096416.66161: done queuing things up, now waiting for results queue to drain 24134 1727096416.66162: waiting for pending results... 
24134 1727096416.66339: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 24134 1727096416.66426: in run() - task 0afff68d-5257-1673-d3fc-00000000049a 24134 1727096416.66438: variable 'ansible_search_path' from source: unknown 24134 1727096416.66441: variable 'ansible_search_path' from source: unknown 24134 1727096416.66471: calling self._execute() 24134 1727096416.66540: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096416.66544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096416.66553: variable 'omit' from source: magic vars 24134 1727096416.66821: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.66831: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.66912: variable 'connection_failed' from source: set_fact 24134 1727096416.66917: Evaluated conditional (not connection_failed): True 24134 1727096416.67174: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.67177: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.67180: variable 'connection_failed' from source: set_fact 24134 1727096416.67182: Evaluated conditional (not connection_failed): True 24134 1727096416.67249: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.67261: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.67381: variable 'connection_failed' from source: set_fact 24134 1727096416.67392: Evaluated conditional (not connection_failed): True 24134 1727096416.67520: variable 'ansible_distribution_major_version' from source: facts 24134 1727096416.67533: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096416.67641: variable 'connection_failed' from source: set_fact 24134 1727096416.67652: Evaluated conditional (not 
connection_failed): True 24134 1727096416.67663: variable 'omit' from source: magic vars 24134 1727096416.67740: variable 'omit' from source: magic vars 24134 1727096416.67780: variable 'omit' from source: magic vars 24134 1727096416.67824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096416.67880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096416.67907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096416.67933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096416.67982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096416.68013: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096416.68016: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096416.68019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096416.68099: Set connection var ansible_shell_executable to /bin/sh 24134 1727096416.68102: Set connection var ansible_pipelining to False 24134 1727096416.68105: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096416.68113: Set connection var ansible_timeout to 10 24134 1727096416.68115: Set connection var ansible_connection to ssh 24134 1727096416.68117: Set connection var ansible_shell_type to sh 24134 1727096416.68135: variable 'ansible_shell_executable' from source: unknown 24134 1727096416.68137: variable 'ansible_connection' from source: unknown 24134 1727096416.68140: variable 'ansible_module_compression' from source: unknown 24134 1727096416.68142: variable 'ansible_shell_type' from source: unknown 24134 1727096416.68145: variable 
'ansible_shell_executable' from source: unknown 24134 1727096416.68147: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096416.68149: variable 'ansible_pipelining' from source: unknown 24134 1727096416.68153: variable 'ansible_timeout' from source: unknown 24134 1727096416.68156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096416.68311: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096416.68320: variable 'omit' from source: magic vars 24134 1727096416.68323: starting attempt loop 24134 1727096416.68329: running the handler 24134 1727096416.68340: _low_level_execute_command(): starting 24134 1727096416.68347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096416.68837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096416.68842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096416.68845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096416.68902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096416.68909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096416.68911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096416.68986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096416.70686: stdout chunk (state=3): >>>/root <<< 24134 1727096416.70787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096416.70819: stderr chunk (state=3): >>><<< 24134 1727096416.70822: stdout chunk (state=3): >>><<< 24134 1727096416.70840: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 24134 1727096416.70853: _low_level_execute_command(): starting 24134 1727096416.70858: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439 `" && echo ansible-tmp-1727096416.708403-25176-224511446231439="` echo /root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439 `" ) && sleep 0' 24134 1727096416.71302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096416.71305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096416.71307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096416.71310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096416.71312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096416.71314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096416.71355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096416.71364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 
1727096416.71443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096416.73384: stdout chunk (state=3): >>>ansible-tmp-1727096416.708403-25176-224511446231439=/root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439 <<< 24134 1727096416.73489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096416.73518: stderr chunk (state=3): >>><<< 24134 1727096416.73521: stdout chunk (state=3): >>><<< 24134 1727096416.73534: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096416.708403-25176-224511446231439=/root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096416.73576: variable 'ansible_module_compression' from source: unknown 24134 1727096416.73617: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 24134 1727096416.73649: variable 'ansible_facts' from source: unknown 24134 1727096416.73703: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/AnsiballZ_service_facts.py 24134 1727096416.73805: Sending initial data 24134 1727096416.73808: Sent initial data (161 bytes) 24134 1727096416.74260: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096416.74264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096416.74266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096416.74272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096416.74275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096416.74316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096416.74320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096416.74394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 24134 1727096416.75981: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 24134 1727096416.75984: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096416.76041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096416.76116: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpdi3cnavm /root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/AnsiballZ_service_facts.py <<< 24134 1727096416.76119: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/AnsiballZ_service_facts.py" <<< 24134 1727096416.76178: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpdi3cnavm" to remote "/root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/AnsiballZ_service_facts.py" <<< 24134 1727096416.76182: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/AnsiballZ_service_facts.py" <<< 24134 1727096416.76830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096416.76870: stderr 
chunk (state=3): >>><<< 24134 1727096416.76880: stdout chunk (state=3): >>><<< 24134 1727096416.76923: done transferring module to remote 24134 1727096416.76933: _low_level_execute_command(): starting 24134 1727096416.76938: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/ /root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/AnsiballZ_service_facts.py && sleep 0' 24134 1727096416.77505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096416.77509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096416.77589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096416.77630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096416.77650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096416.77683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096416.77776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 
1727096416.79656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096416.79679: stderr chunk (state=3): >>><<< 24134 1727096416.79691: stdout chunk (state=3): >>><<< 24134 1727096416.79725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096416.79819: _low_level_execute_command(): starting 24134 1727096416.79822: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/AnsiballZ_service_facts.py && sleep 0' 24134 1727096416.80410: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096416.80426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096416.80441: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 24134 1727096416.80529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096416.80563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096416.80586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096416.80610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096416.80701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096418.39131: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": 
"cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 24134 1727096418.39152: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": 
{"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 24134 1727096418.39167: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 24134 1727096418.39180: stdout chunk (state=3): >>>us": "static", "source": 
"systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", 
"state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 24134 1727096418.39192: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", 
"state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 24134 1727096418.39204: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24134 1727096418.40879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096418.40883: stdout chunk (state=3): >>><<< 24134 1727096418.40886: stderr chunk (state=3): >>><<< 24134 1727096418.40893: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096418.42031: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096418.42049: _low_level_execute_command(): starting 24134 1727096418.42059: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096416.708403-25176-224511446231439/ > /dev/null 2>&1 && sleep 0' 24134 1727096418.42787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096418.42791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096418.42856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096418.42886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096418.42907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096418.43085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096418.44854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096418.44895: stderr chunk (state=3): >>><<< 24134 1727096418.44899: stdout chunk (state=3): >>><<< 24134 1727096418.44910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096418.44918: handler run complete 24134 
1727096418.45029: variable 'ansible_facts' from source: unknown 24134 1727096418.45135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096418.45675: variable 'ansible_facts' from source: unknown 24134 1727096418.45711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096418.45911: attempt loop complete, returning result 24134 1727096418.45922: _execute() done 24134 1727096418.45930: dumping result to json 24134 1727096418.45999: done dumping result, returning 24134 1727096418.46012: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-1673-d3fc-00000000049a] 24134 1727096418.46020: sending task result for task 0afff68d-5257-1673-d3fc-00000000049a 24134 1727096418.47474: done sending task result for task 0afff68d-5257-1673-d3fc-00000000049a 24134 1727096418.47478: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096418.47595: no more pending results, returning what we have 24134 1727096418.47598: results queue empty 24134 1727096418.47599: checking for any_errors_fatal 24134 1727096418.47603: done checking for any_errors_fatal 24134 1727096418.47604: checking for max_fail_percentage 24134 1727096418.47606: done checking for max_fail_percentage 24134 1727096418.47606: checking to see if all hosts have failed and the running result is not ok 24134 1727096418.47607: done checking to see if all hosts have failed 24134 1727096418.47608: getting the remaining hosts for this loop 24134 1727096418.47609: done getting the remaining hosts for this loop 24134 1727096418.47613: getting the next task for host managed_node1 24134 1727096418.47618: done getting next task for host managed_node1 24134 1727096418.47739: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 24134 1727096418.47743: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096418.47753: getting variables 24134 1727096418.47755: in VariableManager get_vars() 24134 1727096418.47786: Calling all_inventory to load vars for managed_node1 24134 1727096418.47788: Calling groups_inventory to load vars for managed_node1 24134 1727096418.47791: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096418.47799: Calling all_plugins_play to load vars for managed_node1 24134 1727096418.47802: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096418.47805: Calling groups_plugins_play to load vars for managed_node1 24134 1727096418.49778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096418.53489: done with get_vars() 24134 1727096418.53521: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 09:00:18 -0400 (0:00:01.877) 0:00:22.749 ****** 24134 1727096418.53613: entering _queue_task() for managed_node1/package_facts 24134 1727096418.54389: worker is 1 (out of 1 available) 24134 1727096418.54404: exiting _queue_task() for managed_node1/package_facts 24134 1727096418.54419: 
done queuing things up, now waiting for results queue to drain 24134 1727096418.54420: waiting for pending results... 24134 1727096418.55076: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 24134 1727096418.55272: in run() - task 0afff68d-5257-1673-d3fc-00000000049b 24134 1727096418.55281: variable 'ansible_search_path' from source: unknown 24134 1727096418.55289: variable 'ansible_search_path' from source: unknown 24134 1727096418.55329: calling self._execute() 24134 1727096418.55462: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096418.55604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096418.55621: variable 'omit' from source: magic vars 24134 1727096418.56355: variable 'ansible_distribution_major_version' from source: facts 24134 1727096418.56419: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096418.56645: variable 'connection_failed' from source: set_fact 24134 1727096418.56657: Evaluated conditional (not connection_failed): True 24134 1727096418.56974: variable 'ansible_distribution_major_version' from source: facts 24134 1727096418.56978: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096418.57278: variable 'connection_failed' from source: set_fact 24134 1727096418.57282: Evaluated conditional (not connection_failed): True 24134 1727096418.57393: variable 'ansible_distribution_major_version' from source: facts 24134 1727096418.57404: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096418.57640: variable 'connection_failed' from source: set_fact 24134 1727096418.57652: Evaluated conditional (not connection_failed): True 24134 1727096418.57833: variable 'ansible_distribution_major_version' from source: facts 24134 1727096418.57939: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 
1727096418.58056: variable 'connection_failed' from source: set_fact 24134 1727096418.58069: Evaluated conditional (not connection_failed): True 24134 1727096418.58257: variable 'omit' from source: magic vars 24134 1727096418.58261: variable 'omit' from source: magic vars 24134 1727096418.58389: variable 'omit' from source: magic vars 24134 1727096418.58435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096418.58515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096418.58603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096418.58624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096418.58709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096418.58746: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096418.58754: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096418.58761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096418.59032: Set connection var ansible_shell_executable to /bin/sh 24134 1727096418.59036: Set connection var ansible_pipelining to False 24134 1727096418.59038: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096418.59040: Set connection var ansible_timeout to 10 24134 1727096418.59042: Set connection var ansible_connection to ssh 24134 1727096418.59045: Set connection var ansible_shell_type to sh 24134 1727096418.59144: variable 'ansible_shell_executable' from source: unknown 24134 1727096418.59153: variable 'ansible_connection' from source: unknown 24134 1727096418.59161: variable 'ansible_module_compression' from source: unknown 24134 
1727096418.59167: variable 'ansible_shell_type' from source: unknown 24134 1727096418.59176: variable 'ansible_shell_executable' from source: unknown 24134 1727096418.59182: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096418.59189: variable 'ansible_pipelining' from source: unknown 24134 1727096418.59195: variable 'ansible_timeout' from source: unknown 24134 1727096418.59202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096418.59406: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096418.59430: variable 'omit' from source: magic vars 24134 1727096418.59440: starting attempt loop 24134 1727096418.59449: running the handler 24134 1727096418.59476: _low_level_execute_command(): starting 24134 1727096418.59490: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096418.60223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096418.60293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096418.60308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096418.60351: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096418.60411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096418.60423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096418.60560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096418.60577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096418.62285: stdout chunk (state=3): >>>/root <<< 24134 1727096418.62464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096418.62479: stdout chunk (state=3): >>><<< 24134 1727096418.62490: stderr chunk (state=3): >>><<< 24134 1727096418.62519: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096418.62573: _low_level_execute_command(): starting 24134 1727096418.62609: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249 `" && echo ansible-tmp-1727096418.6255689-25253-139548126432249="` echo /root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249 `" ) && sleep 0' 24134 1727096418.63355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096418.63376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096418.63413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096418.63510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 24134 1727096418.65478: stdout chunk (state=3): >>>ansible-tmp-1727096418.6255689-25253-139548126432249=/root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249 <<< 24134 1727096418.65578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096418.65612: stderr chunk (state=3): >>><<< 24134 1727096418.65614: stdout chunk (state=3): >>><<< 24134 1727096418.65625: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096418.6255689-25253-139548126432249=/root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096418.65674: variable 'ansible_module_compression' from source: unknown 24134 1727096418.65719: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 24134 1727096418.65774: variable 'ansible_facts' from source: unknown 24134 1727096418.65976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/AnsiballZ_package_facts.py 24134 1727096418.66091: Sending initial data 24134 1727096418.66094: Sent initial data (162 bytes) 24134 1727096418.66524: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096418.66539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096418.66598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096418.66601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096418.66607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096418.66676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096418.68292: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096418.68359: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096418.68441: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmp3kr681o5 /root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/AnsiballZ_package_facts.py <<< 24134 1727096418.68445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/AnsiballZ_package_facts.py" <<< 24134 1727096418.68529: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmp3kr681o5" to remote "/root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/AnsiballZ_package_facts.py" <<< 24134 1727096418.70267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096418.70305: stderr chunk (state=3): >>><<< 24134 1727096418.70308: stdout chunk (state=3): >>><<< 24134 1727096418.70416: done transferring module to remote 24134 1727096418.70419: _low_level_execute_command(): starting 24134 1727096418.70421: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/ /root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/AnsiballZ_package_facts.py && sleep 0' 24134 1727096418.71052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096418.71081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096418.71197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096418.71239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096418.71331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096418.73232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096418.73247: stderr chunk (state=3): >>><<< 24134 1727096418.73257: stdout chunk (state=3): >>><<< 24134 1727096418.73283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096418.73377: _low_level_execute_command(): starting 24134 1727096418.73381: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/AnsiballZ_package_facts.py && sleep 0' 24134 1727096418.74488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096418.74491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096418.74493: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096418.74496: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096418.74498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096418.74567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096418.74708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096419.19133: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 24134 1727096419.19154: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 24134 1727096419.19173: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 24134 1727096419.19206: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 24134 1727096419.19215: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": 
"9.el10", "epoch": null, "arc<<< 24134 1727096419.19225: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": 
[{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": 
"0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": 
"2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": 
"rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": 
"perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": 
"x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": 
"python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", 
"version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24134 1727096419.21090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096419.21120: stderr chunk (state=3): >>><<< 24134 1727096419.21124: stdout chunk (state=3): >>><<< 24134 1727096419.21163: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": 
"dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": 
"libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": 
"diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": 
"22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096419.22457: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096419.22475: _low_level_execute_command(): starting 24134 1727096419.22479: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096418.6255689-25253-139548126432249/ > /dev/null 2>&1 && sleep 0' 24134 1727096419.22938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096419.22941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096419.22944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096419.22946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096419.23006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096419.23009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096419.23014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096419.23087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096419.24957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096419.24986: stderr chunk (state=3): >>><<< 24134 1727096419.24989: stdout chunk (state=3): >>><<< 24134 1727096419.25008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24134 1727096419.25014: handler run complete
24134 1727096419.25479: variable 'ansible_facts' from source: unknown
24134 1727096419.25745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096419.26850: variable 'ansible_facts' from source: unknown
24134 1727096419.27093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096419.27472: attempt loop complete, returning result
24134 1727096419.27485: _execute() done
24134 1727096419.27488: dumping result to json
24134 1727096419.27608: done dumping result, returning
24134 1727096419.27615: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-1673-d3fc-00000000049b]
24134 1727096419.27619: sending task result for task 0afff68d-5257-1673-d3fc-00000000049b
24134 1727096419.28865: done sending task result for task 0afff68d-5257-1673-d3fc-00000000049b
24134 1727096419.28872: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
24134 1727096419.28952: no more pending results, returning what we have
24134 1727096419.28954: results queue empty
24134 1727096419.28954: checking for any_errors_fatal
24134 1727096419.28957: done checking for any_errors_fatal
24134 1727096419.28958: checking for max_fail_percentage
24134 1727096419.28959: done checking for max_fail_percentage
24134 1727096419.28959: checking to see if all hosts have failed and the running result is not ok
24134 1727096419.28960:
done checking to see if all hosts have failed 24134 1727096419.28960: getting the remaining hosts for this loop 24134 1727096419.28961: done getting the remaining hosts for this loop 24134 1727096419.28963: getting the next task for host managed_node1 24134 1727096419.28970: done getting next task for host managed_node1 24134 1727096419.28973: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24134 1727096419.28974: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096419.28980: getting variables 24134 1727096419.28981: in VariableManager get_vars() 24134 1727096419.29007: Calling all_inventory to load vars for managed_node1 24134 1727096419.29009: Calling groups_inventory to load vars for managed_node1 24134 1727096419.29010: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096419.29017: Calling all_plugins_play to load vars for managed_node1 24134 1727096419.29019: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096419.29021: Calling groups_plugins_play to load vars for managed_node1 24134 1727096419.29741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096419.30619: done with get_vars() 24134 1727096419.30638: done getting variables 24134 1727096419.30688: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:00:19 -0400 (0:00:00.770) 0:00:23.520 ****** 24134 1727096419.30711: entering _queue_task() for managed_node1/debug 24134 1727096419.30969: worker is 1 (out of 1 available) 24134 1727096419.30982: exiting _queue_task() for managed_node1/debug 24134 1727096419.30994: done queuing things up, now waiting for results queue to drain 24134 1727096419.30995: waiting for pending results... 24134 1727096419.31175: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 24134 1727096419.31253: in run() - task 0afff68d-5257-1673-d3fc-000000000068 24134 1727096419.31266: variable 'ansible_search_path' from source: unknown 24134 1727096419.31271: variable 'ansible_search_path' from source: unknown 24134 1727096419.31302: calling self._execute() 24134 1727096419.31376: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096419.31379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.31389: variable 'omit' from source: magic vars 24134 1727096419.31665: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.31674: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.31752: variable 'connection_failed' from source: set_fact 24134 1727096419.31756: Evaluated conditional (not connection_failed): True 24134 1727096419.31835: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.31838: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.31911: variable 'connection_failed' from source: set_fact 24134 1727096419.31915: Evaluated conditional (not connection_failed): True 24134 1727096419.31924: variable 'omit' from source: magic vars 24134 1727096419.31950: variable 'omit' from source: magic vars 24134 1727096419.32022: 
variable 'network_provider' from source: set_fact 24134 1727096419.32036: variable 'omit' from source: magic vars 24134 1727096419.32071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096419.32104: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096419.32121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096419.32134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096419.32145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096419.32169: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096419.32173: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096419.32178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.32251: Set connection var ansible_shell_executable to /bin/sh 24134 1727096419.32255: Set connection var ansible_pipelining to False 24134 1727096419.32260: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096419.32269: Set connection var ansible_timeout to 10 24134 1727096419.32274: Set connection var ansible_connection to ssh 24134 1727096419.32277: Set connection var ansible_shell_type to sh 24134 1727096419.32294: variable 'ansible_shell_executable' from source: unknown 24134 1727096419.32297: variable 'ansible_connection' from source: unknown 24134 1727096419.32299: variable 'ansible_module_compression' from source: unknown 24134 1727096419.32303: variable 'ansible_shell_type' from source: unknown 24134 1727096419.32306: variable 'ansible_shell_executable' from source: unknown 24134 1727096419.32309: variable 'ansible_host' from source: host vars for 
'managed_node1' 24134 1727096419.32311: variable 'ansible_pipelining' from source: unknown 24134 1727096419.32314: variable 'ansible_timeout' from source: unknown 24134 1727096419.32316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.32419: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096419.32435: variable 'omit' from source: magic vars 24134 1727096419.32438: starting attempt loop 24134 1727096419.32440: running the handler 24134 1727096419.32476: handler run complete 24134 1727096419.32487: attempt loop complete, returning result 24134 1727096419.32490: _execute() done 24134 1727096419.32493: dumping result to json 24134 1727096419.32495: done dumping result, returning 24134 1727096419.32502: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-1673-d3fc-000000000068] 24134 1727096419.32507: sending task result for task 0afff68d-5257-1673-d3fc-000000000068 ok: [managed_node1] => {} MSG: Using network provider: nm 24134 1727096419.32648: no more pending results, returning what we have 24134 1727096419.32651: results queue empty 24134 1727096419.32652: checking for any_errors_fatal 24134 1727096419.32664: done checking for any_errors_fatal 24134 1727096419.32665: checking for max_fail_percentage 24134 1727096419.32666: done checking for max_fail_percentage 24134 1727096419.32669: checking to see if all hosts have failed and the running result is not ok 24134 1727096419.32669: done checking to see if all hosts have failed 24134 1727096419.32670: getting the remaining hosts for this loop 24134 1727096419.32672: done getting the remaining hosts for this loop 24134 1727096419.32675: 
getting the next task for host managed_node1 24134 1727096419.32682: done getting next task for host managed_node1 24134 1727096419.32685: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24134 1727096419.32687: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096419.32697: getting variables 24134 1727096419.32698: in VariableManager get_vars() 24134 1727096419.32732: Calling all_inventory to load vars for managed_node1 24134 1727096419.32735: Calling groups_inventory to load vars for managed_node1 24134 1727096419.32737: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096419.32746: Calling all_plugins_play to load vars for managed_node1 24134 1727096419.32749: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096419.32751: Calling groups_plugins_play to load vars for managed_node1 24134 1727096419.33281: done sending task result for task 0afff68d-5257-1673-d3fc-000000000068 24134 1727096419.33285: WORKER PROCESS EXITING 24134 1727096419.34084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096419.35072: done with get_vars() 24134 1727096419.35090: done getting variables 24134 1727096419.35134: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if 
using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:00:19 -0400 (0:00:00.044) 0:00:23.565 ****** 24134 1727096419.35159: entering _queue_task() for managed_node1/fail 24134 1727096419.35425: worker is 1 (out of 1 available) 24134 1727096419.35439: exiting _queue_task() for managed_node1/fail 24134 1727096419.35453: done queuing things up, now waiting for results queue to drain 24134 1727096419.35454: waiting for pending results... 24134 1727096419.35634: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24134 1727096419.35722: in run() - task 0afff68d-5257-1673-d3fc-000000000069 24134 1727096419.35776: variable 'ansible_search_path' from source: unknown 24134 1727096419.35779: variable 'ansible_search_path' from source: unknown 24134 1727096419.35800: calling self._execute() 24134 1727096419.35905: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096419.35926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.35941: variable 'omit' from source: magic vars 24134 1727096419.36329: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.36347: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.36466: variable 'connection_failed' from source: set_fact 24134 1727096419.36482: Evaluated conditional (not connection_failed): True 24134 1727096419.36601: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.36611: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.36725: variable 'connection_failed' from source: set_fact 24134 1727096419.36735: Evaluated conditional (not 
connection_failed): True
24134 1727096419.36873: variable 'network_state' from source: role '' defaults
24134 1727096419.36891: Evaluated conditional (network_state != {}): False
24134 1727096419.36898: when evaluation is False, skipping this task
24134 1727096419.36903: _execute() done
24134 1727096419.36909: dumping result to json
24134 1727096419.36976: done dumping result, returning
24134 1727096419.36981: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-1673-d3fc-000000000069]
24134 1727096419.36991: sending task result for task 0afff68d-5257-1673-d3fc-000000000069
24134 1727096419.37060: done sending task result for task 0afff68d-5257-1673-d3fc-000000000069
24134 1727096419.37062: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
24134 1727096419.37142: no more pending results, returning what we have
24134 1727096419.37146: results queue empty
24134 1727096419.37146: checking for any_errors_fatal
24134 1727096419.37155: done checking for any_errors_fatal
24134 1727096419.37155: checking for max_fail_percentage
24134 1727096419.37157: done checking for max_fail_percentage
24134 1727096419.37157: checking to see if all hosts have failed and the running result is not ok
24134 1727096419.37158: done checking to see if all hosts have failed
24134 1727096419.37158: getting the remaining hosts for this loop
24134 1727096419.37160: done getting the remaining hosts for this loop
24134 1727096419.37164: getting the next task for host managed_node1
24134 1727096419.37173: done getting next task for host managed_node1
24134 1727096419.37178: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
24134 1727096419.37180: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096419.37194: getting variables 24134 1727096419.37196: in VariableManager get_vars() 24134 1727096419.37233: Calling all_inventory to load vars for managed_node1 24134 1727096419.37235: Calling groups_inventory to load vars for managed_node1 24134 1727096419.37238: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096419.37249: Calling all_plugins_play to load vars for managed_node1 24134 1727096419.37252: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096419.37255: Calling groups_plugins_play to load vars for managed_node1 24134 1727096419.43112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096419.44505: done with get_vars() 24134 1727096419.44524: done getting variables 24134 1727096419.44563: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:00:19 -0400 (0:00:00.094) 0:00:23.659 ****** 24134 1727096419.44586: entering _queue_task() for managed_node1/fail 24134 1727096419.44851: worker is 1 (out of 1 available) 24134 1727096419.44866: exiting _queue_task() for managed_node1/fail 24134 
1727096419.44881: done queuing things up, now waiting for results queue to drain 24134 1727096419.44883: waiting for pending results... 24134 1727096419.45057: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24134 1727096419.45145: in run() - task 0afff68d-5257-1673-d3fc-00000000006a 24134 1727096419.45156: variable 'ansible_search_path' from source: unknown 24134 1727096419.45159: variable 'ansible_search_path' from source: unknown 24134 1727096419.45190: calling self._execute() 24134 1727096419.45263: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096419.45271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.45279: variable 'omit' from source: magic vars 24134 1727096419.45549: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.45566: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.45643: variable 'connection_failed' from source: set_fact 24134 1727096419.45649: Evaluated conditional (not connection_failed): True 24134 1727096419.45722: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.45726: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.45796: variable 'connection_failed' from source: set_fact 24134 1727096419.45800: Evaluated conditional (not connection_failed): True 24134 1727096419.45881: variable 'network_state' from source: role '' defaults 24134 1727096419.45885: Evaluated conditional (network_state != {}): False 24134 1727096419.45888: when evaluation is False, skipping this task 24134 1727096419.45891: _execute() done 24134 1727096419.45894: dumping result to json 24134 1727096419.45897: done dumping result, returning 24134 1727096419.45903: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-1673-d3fc-00000000006a]
24134 1727096419.45908: sending task result for task 0afff68d-5257-1673-d3fc-00000000006a
24134 1727096419.45999: done sending task result for task 0afff68d-5257-1673-d3fc-00000000006a
24134 1727096419.46002: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
24134 1727096419.46046: no more pending results, returning what we have
24134 1727096419.46050: results queue empty
24134 1727096419.46050: checking for any_errors_fatal
24134 1727096419.46062: done checking for any_errors_fatal
24134 1727096419.46063: checking for max_fail_percentage
24134 1727096419.46065: done checking for max_fail_percentage
24134 1727096419.46065: checking to see if all hosts have failed and the running result is not ok
24134 1727096419.46066: done checking to see if all hosts have failed
24134 1727096419.46070: getting the remaining hosts for this loop
24134 1727096419.46072: done getting the remaining hosts for this loop
24134 1727096419.46076: getting the next task for host managed_node1
24134 1727096419.46082: done getting next task for host managed_node1
24134 1727096419.46085: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
24134 1727096419.46087: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 24134 1727096419.46101: getting variables 24134 1727096419.46102: in VariableManager get_vars() 24134 1727096419.46138: Calling all_inventory to load vars for managed_node1 24134 1727096419.46140: Calling groups_inventory to load vars for managed_node1 24134 1727096419.46143: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096419.46152: Calling all_plugins_play to load vars for managed_node1 24134 1727096419.46154: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096419.46156: Calling groups_plugins_play to load vars for managed_node1 24134 1727096419.47681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096419.48593: done with get_vars() 24134 1727096419.48609: done getting variables 24134 1727096419.48653: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:00:19 -0400 (0:00:00.040) 0:00:23.700 ****** 24134 1727096419.48681: entering _queue_task() for managed_node1/fail 24134 1727096419.48928: worker is 1 (out of 1 available) 24134 1727096419.48940: exiting _queue_task() for managed_node1/fail 24134 1727096419.48953: done queuing things up, now waiting for results queue to drain 24134 1727096419.48954: waiting for pending results... 
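Both "Abort applying the network state configuration ..." tasks above skip with false_condition "network_state != {}": the log shows `network_state` coming from the role's defaults, where it is an empty dict in this run. As a rough illustration only (the actual tasks live in roles/network/tasks/main.yml of the fedora.linux_system_roles collection and will differ in wording), the guard pattern that produces this skip looks like:

```yaml
# Hypothetical sketch of the guard pattern seen in the log above;
# not the role's verbatim source.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  fail:
    msg: Cannot apply `network_state` with the initscripts provider
  when: network_state != {}  # defaults to {}, so this evaluates False here
```

With `network_state` left at its default of `{}`, Ansible logs "Evaluated conditional (network_state != {}): False" and emits the skipping result recorded above.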
24134 1727096419.49130: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24134 1727096419.49208: in run() - task 0afff68d-5257-1673-d3fc-00000000006b 24134 1727096419.49271: variable 'ansible_search_path' from source: unknown 24134 1727096419.49275: variable 'ansible_search_path' from source: unknown 24134 1727096419.49292: calling self._execute() 24134 1727096419.49475: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096419.49479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.49481: variable 'omit' from source: magic vars 24134 1727096419.49794: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.49810: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.49936: variable 'connection_failed' from source: set_fact 24134 1727096419.49940: Evaluated conditional (not connection_failed): True 24134 1727096419.50044: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.50048: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.50175: variable 'connection_failed' from source: set_fact 24134 1727096419.50178: Evaluated conditional (not connection_failed): True 24134 1727096419.50320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096419.52064: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096419.52122: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096419.52147: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096419.52176: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096419.52200: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096419.52256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.52281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.52302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.52327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.52338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.52412: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.52422: Evaluated conditional (ansible_distribution_major_version | int > 9): True 24134 1727096419.52499: variable 'ansible_distribution' from source: facts 24134 1727096419.52503: variable '__network_rh_distros' from source: role '' defaults 24134 1727096419.52512: Evaluated conditional (ansible_distribution in __network_rh_distros): True 24134 1727096419.52668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
24134 1727096419.52687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.52704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.52729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.52740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.52780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.52797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.52813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.52837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.52849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 24134 1727096419.52884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.52900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.52973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.52976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.53005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.53361: variable 'network_connections' from source: play vars 24134 1727096419.53383: variable 'profile' from source: play vars 24134 1727096419.53633: variable 'profile' from source: play vars 24134 1727096419.53636: variable 'interface' from source: set_fact 24134 1727096419.53639: variable 'interface' from source: set_fact 24134 1727096419.53641: variable 'network_state' from source: role '' defaults 24134 1727096419.53643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096419.53913: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096419.53964: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096419.53995: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096419.54022: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096419.54053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096419.54071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096419.54094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.54113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096419.54132: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 24134 1727096419.54136: when evaluation is False, skipping this task 24134 1727096419.54138: _execute() done 24134 1727096419.54140: dumping result to json 24134 1727096419.54142: done dumping result, returning 24134 1727096419.54150: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-1673-d3fc-00000000006b] 24134 1727096419.54156: sending task result for task 0afff68d-5257-1673-d3fc-00000000006b skipping: [managed_node1] => { "changed": false, "false_condition": 
"network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 24134 1727096419.54298: no more pending results, returning what we have 24134 1727096419.54303: results queue empty 24134 1727096419.54304: checking for any_errors_fatal 24134 1727096419.54310: done checking for any_errors_fatal 24134 1727096419.54311: checking for max_fail_percentage 24134 1727096419.54312: done checking for max_fail_percentage 24134 1727096419.54313: checking to see if all hosts have failed and the running result is not ok 24134 1727096419.54314: done checking to see if all hosts have failed 24134 1727096419.54314: getting the remaining hosts for this loop 24134 1727096419.54316: done getting the remaining hosts for this loop 24134 1727096419.54319: getting the next task for host managed_node1 24134 1727096419.54325: done getting next task for host managed_node1 24134 1727096419.54328: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24134 1727096419.54330: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096419.54342: getting variables 24134 1727096419.54344: in VariableManager get_vars() 24134 1727096419.54382: Calling all_inventory to load vars for managed_node1 24134 1727096419.54385: Calling groups_inventory to load vars for managed_node1 24134 1727096419.54387: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096419.54396: Calling all_plugins_play to load vars for managed_node1 24134 1727096419.54398: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096419.54401: Calling groups_plugins_play to load vars for managed_node1 24134 1727096419.54982: done sending task result for task 0afff68d-5257-1673-d3fc-00000000006b 24134 1727096419.54986: WORKER PROCESS EXITING 24134 1727096419.55232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096419.56138: done with get_vars() 24134 1727096419.56156: done getting variables 24134 1727096419.56203: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:00:19 -0400 (0:00:00.075) 0:00:23.775 ****** 24134 1727096419.56227: entering _queue_task() for managed_node1/dnf 24134 1727096419.56484: worker is 1 (out of 1 available) 24134 1727096419.56498: exiting _queue_task() for managed_node1/dnf 24134 1727096419.56510: done queuing things up, now waiting for results queue to drain 24134 1727096419.56512: waiting for pending results... 
24134 1727096419.56688: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24134 1727096419.56770: in run() - task 0afff68d-5257-1673-d3fc-00000000006c 24134 1727096419.56783: variable 'ansible_search_path' from source: unknown 24134 1727096419.56787: variable 'ansible_search_path' from source: unknown 24134 1727096419.56815: calling self._execute() 24134 1727096419.56892: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096419.56896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.56904: variable 'omit' from source: magic vars 24134 1727096419.57186: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.57196: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.57272: variable 'connection_failed' from source: set_fact 24134 1727096419.57282: Evaluated conditional (not connection_failed): True 24134 1727096419.57353: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.57357: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.57439: variable 'connection_failed' from source: set_fact 24134 1727096419.57454: Evaluated conditional (not connection_failed): True 24134 1727096419.57590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096419.59354: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096419.59402: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096419.59437: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096419.59466: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096419.59490: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096419.59547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.59571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.59592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.59617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.59628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.59711: variable 'ansible_distribution' from source: facts 24134 1727096419.59715: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.59727: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 24134 1727096419.59807: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096419.59895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 
1727096419.59912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.59928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.59953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.59963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.59997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.60016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.60032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.60055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.60066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 24134 1727096419.60096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.60116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.60132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.60155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.60165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.60266: variable 'network_connections' from source: play vars 24134 1727096419.60281: variable 'profile' from source: play vars 24134 1727096419.60329: variable 'profile' from source: play vars 24134 1727096419.60333: variable 'interface' from source: set_fact 24134 1727096419.60380: variable 'interface' from source: set_fact 24134 1727096419.60429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096419.60540: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096419.60572: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096419.60596: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096419.60638: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096419.60876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096419.60880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096419.60883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.60885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096419.60888: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096419.61037: variable 'network_connections' from source: play vars 24134 1727096419.61040: variable 'profile' from source: play vars 24134 1727096419.61090: variable 'profile' from source: play vars 24134 1727096419.61093: variable 'interface' from source: set_fact 24134 1727096419.61134: variable 'interface' from source: set_fact 24134 1727096419.61153: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24134 1727096419.61157: when evaluation is False, skipping this task 24134 1727096419.61159: _execute() done 24134 1727096419.61162: dumping result to json 24134 1727096419.61164: done dumping result, returning 24134 1727096419.61182: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-00000000006c] 24134 1727096419.61185: sending task result for task 0afff68d-5257-1673-d3fc-00000000006c 24134 1727096419.61270: done sending task result for task 0afff68d-5257-1673-d3fc-00000000006c 24134 1727096419.61273: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24134 1727096419.61332: no more pending results, returning what we have 24134 1727096419.61335: results queue empty 24134 1727096419.61336: checking for any_errors_fatal 24134 1727096419.61344: done checking for any_errors_fatal 24134 1727096419.61344: checking for max_fail_percentage 24134 1727096419.61346: done checking for max_fail_percentage 24134 1727096419.61347: checking to see if all hosts have failed and the running result is not ok 24134 1727096419.61348: done checking to see if all hosts have failed 24134 1727096419.61348: getting the remaining hosts for this loop 24134 1727096419.61350: done getting the remaining hosts for this loop 24134 1727096419.61353: getting the next task for host managed_node1 24134 1727096419.61358: done getting next task for host managed_node1 24134 1727096419.61362: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24134 1727096419.61364: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096419.61379: getting variables 24134 1727096419.61381: in VariableManager get_vars() 24134 1727096419.61420: Calling all_inventory to load vars for managed_node1 24134 1727096419.61422: Calling groups_inventory to load vars for managed_node1 24134 1727096419.61425: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096419.61434: Calling all_plugins_play to load vars for managed_node1 24134 1727096419.61436: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096419.61438: Calling groups_plugins_play to load vars for managed_node1 24134 1727096419.62432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096419.63903: done with get_vars() 24134 1727096419.63929: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24134 1727096419.64014: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:00:19 -0400 (0:00:00.078) 0:00:23.853 ****** 24134 1727096419.64043: entering _queue_task() for managed_node1/yum 24134 1727096419.64586: worker is 1 (out of 1 available) 24134 1727096419.64596: exiting _queue_task() for managed_node1/yum 24134 1727096419.64607: done queuing things up, now waiting for results queue to drain 24134 1727096419.64608: waiting for pending results... 
24134 1727096419.64745: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24134 1727096419.64947: in run() - task 0afff68d-5257-1673-d3fc-00000000006d 24134 1727096419.64951: variable 'ansible_search_path' from source: unknown 24134 1727096419.64954: variable 'ansible_search_path' from source: unknown 24134 1727096419.64957: calling self._execute() 24134 1727096419.65062: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096419.65078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.65093: variable 'omit' from source: magic vars 24134 1727096419.65506: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.65524: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.65643: variable 'connection_failed' from source: set_fact 24134 1727096419.65713: Evaluated conditional (not connection_failed): True 24134 1727096419.65779: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.65789: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.65898: variable 'connection_failed' from source: set_fact 24134 1727096419.65908: Evaluated conditional (not connection_failed): True 24134 1727096419.66106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096419.68306: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096419.68360: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096419.68390: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096419.68416: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096419.68440: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096419.68503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.68523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.68544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.68571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.68586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.68656: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.68671: Evaluated conditional (ansible_distribution_major_version | int < 8): False 24134 1727096419.68675: when evaluation is False, skipping this task 24134 1727096419.68679: _execute() done 24134 1727096419.68682: dumping result to json 24134 1727096419.68685: done dumping result, returning 24134 1727096419.68693: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 
[0afff68d-5257-1673-d3fc-00000000006d] 24134 1727096419.68698: sending task result for task 0afff68d-5257-1673-d3fc-00000000006d 24134 1727096419.68789: done sending task result for task 0afff68d-5257-1673-d3fc-00000000006d 24134 1727096419.68792: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 24134 1727096419.68844: no more pending results, returning what we have 24134 1727096419.68847: results queue empty 24134 1727096419.68848: checking for any_errors_fatal 24134 1727096419.68852: done checking for any_errors_fatal 24134 1727096419.68852: checking for max_fail_percentage 24134 1727096419.68854: done checking for max_fail_percentage 24134 1727096419.68855: checking to see if all hosts have failed and the running result is not ok 24134 1727096419.68856: done checking to see if all hosts have failed 24134 1727096419.68856: getting the remaining hosts for this loop 24134 1727096419.68858: done getting the remaining hosts for this loop 24134 1727096419.68861: getting the next task for host managed_node1 24134 1727096419.68869: done getting next task for host managed_node1 24134 1727096419.68873: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24134 1727096419.68875: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096419.68887: getting variables 24134 1727096419.68888: in VariableManager get_vars() 24134 1727096419.68928: Calling all_inventory to load vars for managed_node1 24134 1727096419.68931: Calling groups_inventory to load vars for managed_node1 24134 1727096419.68933: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096419.68942: Calling all_plugins_play to load vars for managed_node1 24134 1727096419.68944: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096419.68947: Calling groups_plugins_play to load vars for managed_node1 24134 1727096419.70291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096419.71978: done with get_vars() 24134 1727096419.72007: done getting variables 24134 1727096419.72066: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:00:19 -0400 (0:00:00.080) 0:00:23.934 ****** 24134 1727096419.72103: entering _queue_task() for managed_node1/fail 24134 1727096419.72491: worker is 1 (out of 1 available) 24134 1727096419.72503: exiting _queue_task() for managed_node1/fail 24134 1727096419.72515: done queuing things up, now waiting for results queue to drain 24134 1727096419.72517: waiting for pending results... 
24134 1727096419.73034: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24134 1727096419.73250: in run() - task 0afff68d-5257-1673-d3fc-00000000006e 24134 1727096419.73253: variable 'ansible_search_path' from source: unknown 24134 1727096419.73341: variable 'ansible_search_path' from source: unknown 24134 1727096419.73346: calling self._execute() 24134 1727096419.73428: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096419.73440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.73459: variable 'omit' from source: magic vars 24134 1727096419.73854: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.73873: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.74000: variable 'connection_failed' from source: set_fact 24134 1727096419.74010: Evaluated conditional (not connection_failed): True 24134 1727096419.74121: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.74132: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.74232: variable 'connection_failed' from source: set_fact 24134 1727096419.74242: Evaluated conditional (not connection_failed): True 24134 1727096419.74357: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096419.74558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096419.77113: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096419.77198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096419.77236: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096419.77281: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096419.77312: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096419.77470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.77474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.77477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.77495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.77513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.77560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.77590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.77613: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.77652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.77670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.77718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.77744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.77771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.77813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.77828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.77995: variable 'network_connections' from source: play vars 24134 1727096419.78073: variable 'profile' from source: play vars 24134 1727096419.78102: variable 'profile' from source: play vars 24134 1727096419.78116: variable 
'interface' from source: set_fact 24134 1727096419.78184: variable 'interface' from source: set_fact 24134 1727096419.78258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096419.78432: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096419.78492: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096419.78541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096419.78578: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096419.78776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096419.78780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096419.78782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.78784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096419.79095: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096419.79596: variable 'network_connections' from source: play vars 24134 1727096419.79623: variable 'profile' from source: play vars 24134 1727096419.79789: variable 'profile' from source: play vars 24134 1727096419.79801: variable 'interface' from source: set_fact 24134 1727096419.79932: 
variable 'interface' from source: set_fact 24134 1727096419.79935: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24134 1727096419.79937: when evaluation is False, skipping this task 24134 1727096419.79940: _execute() done 24134 1727096419.79942: dumping result to json 24134 1727096419.79945: done dumping result, returning 24134 1727096419.79947: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-00000000006e] 24134 1727096419.79949: sending task result for task 0afff68d-5257-1673-d3fc-00000000006e skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24134 1727096419.80446: no more pending results, returning what we have 24134 1727096419.80449: results queue empty 24134 1727096419.80451: checking for any_errors_fatal 24134 1727096419.80459: done checking for any_errors_fatal 24134 1727096419.80459: checking for max_fail_percentage 24134 1727096419.80461: done checking for max_fail_percentage 24134 1727096419.80463: checking to see if all hosts have failed and the running result is not ok 24134 1727096419.80463: done checking to see if all hosts have failed 24134 1727096419.80464: getting the remaining hosts for this loop 24134 1727096419.80466: done getting the remaining hosts for this loop 24134 1727096419.80472: getting the next task for host managed_node1 24134 1727096419.80478: done getting next task for host managed_node1 24134 1727096419.80482: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 24134 1727096419.80484: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096419.80497: getting variables 24134 1727096419.80503: in VariableManager get_vars() 24134 1727096419.80542: Calling all_inventory to load vars for managed_node1 24134 1727096419.80545: Calling groups_inventory to load vars for managed_node1 24134 1727096419.80547: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096419.80557: Calling all_plugins_play to load vars for managed_node1 24134 1727096419.80560: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096419.80563: Calling groups_plugins_play to load vars for managed_node1 24134 1727096419.80617: done sending task result for task 0afff68d-5257-1673-d3fc-00000000006e 24134 1727096419.80620: WORKER PROCESS EXITING 24134 1727096419.82614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096419.84435: done with get_vars() 24134 1727096419.84464: done getting variables 24134 1727096419.84527: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:00:19 -0400 (0:00:00.124) 0:00:24.059 ****** 24134 1727096419.84555: entering _queue_task() for managed_node1/package 24134 1727096419.85297: worker is 1 (out of 1 available) 24134 1727096419.85309: exiting _queue_task() for managed_node1/package 24134 1727096419.85321: done queuing things up, now waiting for results queue to drain 24134 1727096419.85322: waiting for pending 
results... 24134 1727096419.85539: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 24134 1727096419.85685: in run() - task 0afff68d-5257-1673-d3fc-00000000006f 24134 1727096419.85736: variable 'ansible_search_path' from source: unknown 24134 1727096419.85740: variable 'ansible_search_path' from source: unknown 24134 1727096419.85774: calling self._execute() 24134 1727096419.85883: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096419.85895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096419.85954: variable 'omit' from source: magic vars 24134 1727096419.86343: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.86361: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.86538: variable 'connection_failed' from source: set_fact 24134 1727096419.86634: Evaluated conditional (not connection_failed): True 24134 1727096419.86751: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.86762: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096419.86942: variable 'connection_failed' from source: set_fact 24134 1727096419.86945: Evaluated conditional (not connection_failed): True 24134 1727096419.87160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096419.87725: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096419.87836: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096419.87879: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096419.87928: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096419.88046: 
variable 'network_packages' from source: role '' defaults 24134 1727096419.88154: variable '__network_provider_setup' from source: role '' defaults 24134 1727096419.88180: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096419.88277: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096419.88280: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096419.88338: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096419.88538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096419.91179: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096419.91304: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096419.91433: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096419.91474: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096419.91544: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096419.91744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.91782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.91872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 24134 1727096419.91919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.91946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.92012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.92041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.92083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.92129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.92150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.92415: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24134 1727096419.92540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.92575: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.92612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.92654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.92705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.92790: variable 'ansible_python' from source: facts 24134 1727096419.92830: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24134 1727096419.92927: variable '__network_wpa_supplicant_required' from source: role '' defaults 24134 1727096419.93017: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24134 1727096419.93249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.93253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.93255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.93288: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.93307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.93398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096419.93449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096419.93516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.93797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096419.93800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096419.94036: variable 'network_connections' from source: play vars 24134 1727096419.94047: variable 'profile' from source: play vars 24134 1727096419.94281: variable 'profile' from source: play vars 24134 1727096419.94375: variable 'interface' from source: set_fact 24134 1727096419.94430: variable 'interface' from source: set_fact 24134 1727096419.94645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096419.94696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096419.94738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096419.94818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096419.94865: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096419.95978: variable 'network_connections' from source: play vars 24134 1727096419.95981: variable 'profile' from source: play vars 24134 1727096419.96247: variable 'profile' from source: play vars 24134 1727096419.96376: variable 'interface' from source: set_fact 24134 1727096419.96420: variable 'interface' from source: set_fact 24134 1727096419.96885: variable '__network_packages_default_wireless' from source: role '' defaults 24134 1727096419.97087: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096419.97641: variable 'network_connections' from source: play vars 24134 1727096419.97645: variable 'profile' from source: play vars 24134 1727096419.97714: variable 'profile' from source: play vars 24134 1727096419.97718: variable 'interface' from source: set_fact 24134 1727096419.97822: variable 'interface' from source: set_fact 24134 1727096419.97851: variable '__network_packages_default_team' from source: role '' defaults 24134 1727096419.97933: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096419.98247: variable 
'network_connections' from source: play vars 24134 1727096419.98250: variable 'profile' from source: play vars 24134 1727096419.98314: variable 'profile' from source: play vars 24134 1727096419.98317: variable 'interface' from source: set_fact 24134 1727096419.98415: variable 'interface' from source: set_fact 24134 1727096419.98466: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096419.98529: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096419.98535: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096419.98599: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096419.99033: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24134 1727096419.99692: variable 'network_connections' from source: play vars 24134 1727096419.99698: variable 'profile' from source: play vars 24134 1727096419.99754: variable 'profile' from source: play vars 24134 1727096419.99757: variable 'interface' from source: set_fact 24134 1727096419.99821: variable 'interface' from source: set_fact 24134 1727096419.99830: variable 'ansible_distribution' from source: facts 24134 1727096419.99833: variable '__network_rh_distros' from source: role '' defaults 24134 1727096419.99840: variable 'ansible_distribution_major_version' from source: facts 24134 1727096419.99855: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24134 1727096420.00074: variable 'ansible_distribution' from source: facts 24134 1727096420.00078: variable '__network_rh_distros' from source: role '' defaults 24134 1727096420.00080: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.00082: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24134 1727096420.00185: variable 'ansible_distribution' from source: 
facts 24134 1727096420.00189: variable '__network_rh_distros' from source: role '' defaults 24134 1727096420.00194: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.00228: variable 'network_provider' from source: set_fact 24134 1727096420.00243: variable 'ansible_facts' from source: unknown 24134 1727096420.00910: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 24134 1727096420.00917: when evaluation is False, skipping this task 24134 1727096420.00919: _execute() done 24134 1727096420.00924: dumping result to json 24134 1727096420.00927: done dumping result, returning 24134 1727096420.00930: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-1673-d3fc-00000000006f] 24134 1727096420.00938: sending task result for task 0afff68d-5257-1673-d3fc-00000000006f skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24134 1727096420.01135: no more pending results, returning what we have 24134 1727096420.01138: results queue empty 24134 1727096420.01139: checking for any_errors_fatal 24134 1727096420.01146: done checking for any_errors_fatal 24134 1727096420.01146: checking for max_fail_percentage 24134 1727096420.01148: done checking for max_fail_percentage 24134 1727096420.01149: checking to see if all hosts have failed and the running result is not ok 24134 1727096420.01149: done checking to see if all hosts have failed 24134 1727096420.01150: getting the remaining hosts for this loop 24134 1727096420.01151: done getting the remaining hosts for this loop 24134 1727096420.01155: getting the next task for host managed_node1 24134 1727096420.01160: done getting next task for host managed_node1 24134 1727096420.01164: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate 
when using network_state variable 24134 1727096420.01166: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096420.01187: getting variables 24134 1727096420.01188: in VariableManager get_vars() 24134 1727096420.01224: Calling all_inventory to load vars for managed_node1 24134 1727096420.01227: Calling groups_inventory to load vars for managed_node1 24134 1727096420.01229: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096420.01237: Calling all_plugins_play to load vars for managed_node1 24134 1727096420.01239: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096420.01242: Calling groups_plugins_play to load vars for managed_node1 24134 1727096420.01785: done sending task result for task 0afff68d-5257-1673-d3fc-00000000006f 24134 1727096420.01789: WORKER PROCESS EXITING 24134 1727096420.02735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096420.04577: done with get_vars() 24134 1727096420.04599: done getting variables 24134 1727096420.04666: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:00:20 -0400 (0:00:00.201) 0:00:24.260 ****** 24134 1727096420.04703: entering _queue_task() for 
managed_node1/package 24134 1727096420.05276: worker is 1 (out of 1 available) 24134 1727096420.05286: exiting _queue_task() for managed_node1/package 24134 1727096420.05296: done queuing things up, now waiting for results queue to drain 24134 1727096420.05297: waiting for pending results... 24134 1727096420.05384: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24134 1727096420.05498: in run() - task 0afff68d-5257-1673-d3fc-000000000070 24134 1727096420.05521: variable 'ansible_search_path' from source: unknown 24134 1727096420.05531: variable 'ansible_search_path' from source: unknown 24134 1727096420.05571: calling self._execute() 24134 1727096420.05679: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096420.05692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096420.05742: variable 'omit' from source: magic vars 24134 1727096420.06117: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.06135: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.06253: variable 'connection_failed' from source: set_fact 24134 1727096420.06263: Evaluated conditional (not connection_failed): True 24134 1727096420.06384: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.06476: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.06510: variable 'connection_failed' from source: set_fact 24134 1727096420.06520: Evaluated conditional (not connection_failed): True 24134 1727096420.06646: variable 'network_state' from source: role '' defaults 24134 1727096420.06661: Evaluated conditional (network_state != {}): False 24134 1727096420.06671: when evaluation is False, skipping this task 24134 1727096420.06680: _execute() done 24134 1727096420.06687: dumping result to json 24134 
1727096420.06694: done dumping result, returning 24134 1727096420.06705: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-1673-d3fc-000000000070] 24134 1727096420.06722: sending task result for task 0afff68d-5257-1673-d3fc-000000000070 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096420.06991: no more pending results, returning what we have 24134 1727096420.06996: results queue empty 24134 1727096420.06997: checking for any_errors_fatal 24134 1727096420.07006: done checking for any_errors_fatal 24134 1727096420.07007: checking for max_fail_percentage 24134 1727096420.07008: done checking for max_fail_percentage 24134 1727096420.07009: checking to see if all hosts have failed and the running result is not ok 24134 1727096420.07010: done checking to see if all hosts have failed 24134 1727096420.07011: getting the remaining hosts for this loop 24134 1727096420.07012: done getting the remaining hosts for this loop 24134 1727096420.07016: getting the next task for host managed_node1 24134 1727096420.07022: done getting next task for host managed_node1 24134 1727096420.07026: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24134 1727096420.07028: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096420.07045: getting variables 24134 1727096420.07046: in VariableManager get_vars() 24134 1727096420.07087: Calling all_inventory to load vars for managed_node1 24134 1727096420.07090: Calling groups_inventory to load vars for managed_node1 24134 1727096420.07093: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096420.07104: Calling all_plugins_play to load vars for managed_node1 24134 1727096420.07107: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096420.07110: Calling groups_plugins_play to load vars for managed_node1 24134 1727096420.07683: done sending task result for task 0afff68d-5257-1673-d3fc-000000000070 24134 1727096420.07686: WORKER PROCESS EXITING 24134 1727096420.08674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096420.10363: done with get_vars() 24134 1727096420.10391: done getting variables 24134 1727096420.10456: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:00:20 -0400 (0:00:00.057) 0:00:24.318 ****** 24134 1727096420.10492: entering _queue_task() for managed_node1/package 24134 1727096420.10834: worker is 1 (out of 1 available) 24134 1727096420.10847: exiting _queue_task() for managed_node1/package 24134 1727096420.10860: done queuing things up, now waiting for results queue to drain 24134 1727096420.10861: waiting for pending results... 
24134 1727096420.11130: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24134 1727096420.11243: in run() - task 0afff68d-5257-1673-d3fc-000000000071 24134 1727096420.11259: variable 'ansible_search_path' from source: unknown 24134 1727096420.11266: variable 'ansible_search_path' from source: unknown 24134 1727096420.11311: calling self._execute() 24134 1727096420.11415: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096420.11425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096420.11436: variable 'omit' from source: magic vars 24134 1727096420.11822: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.11847: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.11966: variable 'connection_failed' from source: set_fact 24134 1727096420.12055: Evaluated conditional (not connection_failed): True 24134 1727096420.12099: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.12110: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.12219: variable 'connection_failed' from source: set_fact 24134 1727096420.12229: Evaluated conditional (not connection_failed): True 24134 1727096420.12354: variable 'network_state' from source: role '' defaults 24134 1727096420.12375: Evaluated conditional (network_state != {}): False 24134 1727096420.12388: when evaluation is False, skipping this task 24134 1727096420.12396: _execute() done 24134 1727096420.12404: dumping result to json 24134 1727096420.12411: done dumping result, returning 24134 1727096420.12423: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-1673-d3fc-000000000071] 24134 1727096420.12434: sending task result for 
task 0afff68d-5257-1673-d3fc-000000000071 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096420.12706: no more pending results, returning what we have 24134 1727096420.12710: results queue empty 24134 1727096420.12711: checking for any_errors_fatal 24134 1727096420.12721: done checking for any_errors_fatal 24134 1727096420.12722: checking for max_fail_percentage 24134 1727096420.12723: done checking for max_fail_percentage 24134 1727096420.12724: checking to see if all hosts have failed and the running result is not ok 24134 1727096420.12725: done checking to see if all hosts have failed 24134 1727096420.12726: getting the remaining hosts for this loop 24134 1727096420.12727: done getting the remaining hosts for this loop 24134 1727096420.12731: getting the next task for host managed_node1 24134 1727096420.12737: done getting next task for host managed_node1 24134 1727096420.12740: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24134 1727096420.12742: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096420.12757: getting variables 24134 1727096420.12759: in VariableManager get_vars() 24134 1727096420.12800: Calling all_inventory to load vars for managed_node1 24134 1727096420.12803: Calling groups_inventory to load vars for managed_node1 24134 1727096420.12806: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096420.12817: Calling all_plugins_play to load vars for managed_node1 24134 1727096420.12820: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096420.12823: Calling groups_plugins_play to load vars for managed_node1 24134 1727096420.13383: done sending task result for task 0afff68d-5257-1673-d3fc-000000000071 24134 1727096420.13387: WORKER PROCESS EXITING 24134 1727096420.14552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096420.16208: done with get_vars() 24134 1727096420.16229: done getting variables 24134 1727096420.16331: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:00:20 -0400 (0:00:00.058) 0:00:24.377 ****** 24134 1727096420.16365: entering _queue_task() for managed_node1/service 24134 1727096420.16790: worker is 1 (out of 1 available) 24134 1727096420.16801: exiting _queue_task() for managed_node1/service 24134 1727096420.16811: done queuing things up, now waiting for results queue to drain 24134 1727096420.16812: waiting for pending results... 
24134 1727096420.16984: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24134 1727096420.17097: in run() - task 0afff68d-5257-1673-d3fc-000000000072 24134 1727096420.17121: variable 'ansible_search_path' from source: unknown 24134 1727096420.17130: variable 'ansible_search_path' from source: unknown 24134 1727096420.17181: calling self._execute() 24134 1727096420.17300: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096420.17311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096420.17329: variable 'omit' from source: magic vars 24134 1727096420.18143: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.18147: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.18150: variable 'connection_failed' from source: set_fact 24134 1727096420.18152: Evaluated conditional (not connection_failed): True 24134 1727096420.18274: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.18287: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.18399: variable 'connection_failed' from source: set_fact 24134 1727096420.18413: Evaluated conditional (not connection_failed): True 24134 1727096420.18575: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096420.18763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096420.21223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096420.21308: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096420.21355: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096420.21399: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096420.21426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096420.21575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096420.21588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096420.21618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.21666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096420.21759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096420.21762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096420.21765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096420.21802: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.21846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096420.21865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096420.21919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096420.21943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096420.21974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.22099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096420.22103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096420.22229: variable 'network_connections' from source: play vars 24134 1727096420.22246: variable 'profile' from source: play vars 24134 1727096420.22328: variable 'profile' from source: play vars 24134 1727096420.22337: variable 
'interface' from source: set_fact 24134 1727096420.22408: variable 'interface' from source: set_fact 24134 1727096420.22483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096420.22666: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096420.22716: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096420.22755: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096420.22795: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096420.22847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096420.22885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096420.22935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.22950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096420.23014: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096420.23372: variable 'network_connections' from source: play vars 24134 1727096420.23375: variable 'profile' from source: play vars 24134 1727096420.23378: variable 'profile' from source: play vars 24134 1727096420.23380: variable 'interface' from source: set_fact 24134 1727096420.23447: 
variable 'interface' from source: set_fact 24134 1727096420.23481: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24134 1727096420.23489: when evaluation is False, skipping this task 24134 1727096420.23500: _execute() done 24134 1727096420.23512: dumping result to json 24134 1727096420.23521: done dumping result, returning 24134 1727096420.23613: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-000000000072] 24134 1727096420.23617: sending task result for task 0afff68d-5257-1673-d3fc-000000000072 24134 1727096420.23688: done sending task result for task 0afff68d-5257-1673-d3fc-000000000072 24134 1727096420.23691: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24134 1727096420.23741: no more pending results, returning what we have 24134 1727096420.23745: results queue empty 24134 1727096420.23746: checking for any_errors_fatal 24134 1727096420.23753: done checking for any_errors_fatal 24134 1727096420.23754: checking for max_fail_percentage 24134 1727096420.23756: done checking for max_fail_percentage 24134 1727096420.23757: checking to see if all hosts have failed and the running result is not ok 24134 1727096420.23758: done checking to see if all hosts have failed 24134 1727096420.23758: getting the remaining hosts for this loop 24134 1727096420.23760: done getting the remaining hosts for this loop 24134 1727096420.23764: getting the next task for host managed_node1 24134 1727096420.23774: done getting next task for host managed_node1 24134 1727096420.23778: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24134 1727096420.23781: ^ state is: HOST STATE: block=2, task=16, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096420.23795: getting variables 24134 1727096420.23796: in VariableManager get_vars() 24134 1727096420.23835: Calling all_inventory to load vars for managed_node1 24134 1727096420.23838: Calling groups_inventory to load vars for managed_node1 24134 1727096420.23841: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096420.23851: Calling all_plugins_play to load vars for managed_node1 24134 1727096420.23855: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096420.23858: Calling groups_plugins_play to load vars for managed_node1 24134 1727096420.25596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096420.27271: done with get_vars() 24134 1727096420.27297: done getting variables 24134 1727096420.27363: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:00:20 -0400 (0:00:00.110) 0:00:24.487 ****** 24134 1727096420.27413: entering _queue_task() for managed_node1/service 24134 1727096420.28035: worker is 1 (out of 1 available) 24134 1727096420.28048: exiting _queue_task() for managed_node1/service 24134 1727096420.28060: done queuing things up, now waiting for results queue to drain 24134 1727096420.28062: waiting for pending results... 
24134 1727096420.28694: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24134 1727096420.28700: in run() - task 0afff68d-5257-1673-d3fc-000000000073 24134 1727096420.28704: variable 'ansible_search_path' from source: unknown 24134 1727096420.28706: variable 'ansible_search_path' from source: unknown 24134 1727096420.28718: calling self._execute() 24134 1727096420.28824: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096420.28836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096420.28972: variable 'omit' from source: magic vars 24134 1727096420.29209: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.29225: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.29334: variable 'connection_failed' from source: set_fact 24134 1727096420.29344: Evaluated conditional (not connection_failed): True 24134 1727096420.29454: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.29465: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.29562: variable 'connection_failed' from source: set_fact 24134 1727096420.29576: Evaluated conditional (not connection_failed): True 24134 1727096420.29725: variable 'network_provider' from source: set_fact 24134 1727096420.29734: variable 'network_state' from source: role '' defaults 24134 1727096420.29748: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24134 1727096420.29758: variable 'omit' from source: magic vars 24134 1727096420.29802: variable 'omit' from source: magic vars 24134 1727096420.29835: variable 'network_service_name' from source: role '' defaults 24134 1727096420.29911: variable 'network_service_name' from source: role '' defaults 24134 1727096420.30019: variable '__network_provider_setup' from source: role '' defaults 24134 
1727096420.30030: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096420.30096: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096420.30109: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096420.30172: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096420.30388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096420.33040: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096420.33109: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096420.33164: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096420.33208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096420.33240: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096420.33328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096420.33369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096420.33404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.33450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096420.33477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096420.33534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096420.33564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096420.33672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.33676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096420.33679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096420.33922: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24134 1727096420.34049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096420.34081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 24134 1727096420.34110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.34162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096420.34186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096420.34287: variable 'ansible_python' from source: facts 24134 1727096420.34312: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24134 1727096420.34404: variable '__network_wpa_supplicant_required' from source: role '' defaults 24134 1727096420.34494: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24134 1727096420.34785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096420.34789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096420.34791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.34821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 
1727096420.35075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096420.35078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096420.35093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096420.35219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.35254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096420.35311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096420.35596: variable 'network_connections' from source: play vars 24134 1727096420.35626: variable 'profile' from source: play vars 24134 1727096420.35797: variable 'profile' from source: play vars 24134 1727096420.35876: variable 'interface' from source: set_fact 24134 1727096420.35963: variable 'interface' from source: set_fact 24134 1727096420.36093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096420.36525: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096420.36607: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096420.36656: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096420.36718: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096420.36786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096420.36830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096420.36866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096420.36912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096420.36966: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096420.37287: variable 'network_connections' from source: play vars 24134 1727096420.37342: variable 'profile' from source: play vars 24134 1727096420.37387: variable 'profile' from source: play vars 24134 1727096420.37399: variable 'interface' from source: set_fact 24134 1727096420.37473: variable 'interface' from source: set_fact 24134 1727096420.37561: variable '__network_packages_default_wireless' from source: role '' defaults 24134 1727096420.37628: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096420.38080: variable 'network_connections' from source: play vars 24134 1727096420.38091: variable 'profile' from source: play vars 24134 
1727096420.38174: variable 'profile' from source: play vars 24134 1727096420.38186: variable 'interface' from source: set_fact 24134 1727096420.38266: variable 'interface' from source: set_fact 24134 1727096420.38302: variable '__network_packages_default_team' from source: role '' defaults 24134 1727096420.38461: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096420.38696: variable 'network_connections' from source: play vars 24134 1727096420.38705: variable 'profile' from source: play vars 24134 1727096420.38773: variable 'profile' from source: play vars 24134 1727096420.38792: variable 'interface' from source: set_fact 24134 1727096420.38864: variable 'interface' from source: set_fact 24134 1727096420.38933: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096420.39005: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096420.39075: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096420.39078: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096420.39310: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24134 1727096420.39997: variable 'network_connections' from source: play vars 24134 1727096420.40008: variable 'profile' from source: play vars 24134 1727096420.40079: variable 'profile' from source: play vars 24134 1727096420.40091: variable 'interface' from source: set_fact 24134 1727096420.40177: variable 'interface' from source: set_fact 24134 1727096420.40190: variable 'ansible_distribution' from source: facts 24134 1727096420.40198: variable '__network_rh_distros' from source: role '' defaults 24134 1727096420.40213: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.40231: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24134 
1727096420.40413: variable 'ansible_distribution' from source: facts 24134 1727096420.40476: variable '__network_rh_distros' from source: role '' defaults 24134 1727096420.40479: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.40481: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24134 1727096420.40661: variable 'ansible_distribution' from source: facts 24134 1727096420.40675: variable '__network_rh_distros' from source: role '' defaults 24134 1727096420.40685: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.40723: variable 'network_provider' from source: set_fact 24134 1727096420.40759: variable 'omit' from source: magic vars 24134 1727096420.40795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096420.40825: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096420.40866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096420.40879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096420.40893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096420.40979: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096420.40982: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096420.40984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096420.41046: Set connection var ansible_shell_executable to /bin/sh 24134 1727096420.41058: Set connection var ansible_pipelining to False 24134 1727096420.41074: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096420.41098: Set connection var 
ansible_timeout to 10 24134 1727096420.41104: Set connection var ansible_connection to ssh 24134 1727096420.41110: Set connection var ansible_shell_type to sh 24134 1727096420.41143: variable 'ansible_shell_executable' from source: unknown 24134 1727096420.41248: variable 'ansible_connection' from source: unknown 24134 1727096420.41251: variable 'ansible_module_compression' from source: unknown 24134 1727096420.41253: variable 'ansible_shell_type' from source: unknown 24134 1727096420.41255: variable 'ansible_shell_executable' from source: unknown 24134 1727096420.41257: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096420.41259: variable 'ansible_pipelining' from source: unknown 24134 1727096420.41260: variable 'ansible_timeout' from source: unknown 24134 1727096420.41263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096420.41351: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096420.41413: variable 'omit' from source: magic vars 24134 1727096420.41416: starting attempt loop 24134 1727096420.41418: running the handler 24134 1727096420.41477: variable 'ansible_facts' from source: unknown 24134 1727096420.42212: _low_level_execute_command(): starting 24134 1727096420.42224: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096420.42944: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096420.43036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096420.43061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096420.43177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096420.44904: stdout chunk (state=3): >>>/root <<< 24134 1727096420.45055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096420.45059: stdout chunk (state=3): >>><<< 24134 1727096420.45061: stderr chunk (state=3): >>><<< 24134 1727096420.45083: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096420.45098: _low_level_execute_command(): starting 24134 1727096420.45106: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137 `" && echo ansible-tmp-1727096420.450888-25338-109237827495137="` echo /root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137 `" ) && sleep 0' 24134 1727096420.45723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096420.45758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096420.45788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096420.45842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096420.45927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096420.45960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096420.45995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096420.46094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096420.48094: stdout chunk (state=3): >>>ansible-tmp-1727096420.450888-25338-109237827495137=/root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137 <<< 24134 1727096420.48255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096420.48258: stdout chunk (state=3): >>><<< 24134 1727096420.48260: stderr chunk (state=3): >>><<< 24134 1727096420.48278: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096420.450888-25338-109237827495137=/root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096420.48473: variable 'ansible_module_compression' from source: unknown 24134 1727096420.48476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 24134 1727096420.48478: variable 'ansible_facts' from source: unknown 24134 1727096420.48659: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/AnsiballZ_systemd.py 24134 1727096420.48827: Sending initial data 24134 1727096420.48836: Sent initial data (155 bytes) 24134 1727096420.49441: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096420.49457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096420.49582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096420.49597: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096420.49615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096420.49709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096420.51412: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096420.51503: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096420.51586: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpdqykivqv /root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/AnsiballZ_systemd.py <<< 24134 1727096420.51590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/AnsiballZ_systemd.py" <<< 24134 1727096420.51671: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpdqykivqv" to remote "/root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/AnsiballZ_systemd.py" <<< 24134 1727096420.53398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096420.53423: stderr chunk (state=3): >>><<< 24134 1727096420.53544: stdout chunk (state=3): >>><<< 24134 1727096420.53547: done transferring module to remote 24134 1727096420.53549: _low_level_execute_command(): starting 24134 1727096420.53552: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/ /root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/AnsiballZ_systemd.py && sleep 0' 24134 1727096420.54131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096420.54175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096420.54243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096420.56174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096420.56178: stdout chunk (state=3): >>><<< 24134 1727096420.56375: stderr chunk (state=3): >>><<< 24134 1727096420.56380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096420.56382: _low_level_execute_command(): starting 24134 1727096420.56385: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/AnsiballZ_systemd.py && sleep 0' 24134 1727096420.56983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096420.56996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096420.57017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096420.57033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096420.57142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096420.87092: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", 
"Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10678272", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3322433536", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "1113815000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": 
"200ms", "CoredumpR<<< 24134 1727096420.87118: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", 
"ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": 
"multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": 
"96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24134 1727096420.89159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096420.89187: stderr chunk (state=3): >>><<< 24134 1727096420.89190: stdout chunk (state=3): >>><<< 24134 1727096420.89203: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10678272", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3322433536", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "1113815000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", 
"StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096420.89322: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096420.89337: _low_level_execute_command(): starting 24134 1727096420.89341: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096420.450888-25338-109237827495137/ > /dev/null 2>&1 && sleep 0' 24134 1727096420.89748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096420.89755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096420.89773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096420.89787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096420.89790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096420.89848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096420.89851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096420.89913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096420.91973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096420.91977: stdout chunk (state=3): >>><<< 24134 1727096420.91979: stderr chunk (state=3): >>><<< 24134 1727096420.91982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096420.91985: handler run complete 24134 1727096420.91987: attempt loop complete, returning result 24134 1727096420.91989: _execute() done 24134 1727096420.91990: dumping result to json 24134 1727096420.91992: done dumping result, returning 24134 1727096420.91994: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-1673-d3fc-000000000073] 24134 1727096420.91996: sending task result for task 0afff68d-5257-1673-d3fc-000000000073 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096420.92445: no more pending results, returning what we have 24134 1727096420.92448: results queue empty 24134 1727096420.92449: checking for any_errors_fatal 24134 1727096420.92454: done checking for any_errors_fatal 24134 1727096420.92455: checking for max_fail_percentage 24134 1727096420.92456: done checking for max_fail_percentage 24134 1727096420.92457: checking to see if all hosts have failed and the running result is not ok 24134 1727096420.92458: done checking to see if all hosts have failed 24134 1727096420.92458: getting the remaining hosts for this loop 24134 1727096420.92460: done getting the remaining hosts for this loop 24134 1727096420.92463: getting the next task for host managed_node1 24134 1727096420.92470: done getting next task for host managed_node1 24134 
1727096420.92475: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24134 1727096420.92477: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096420.92486: getting variables 24134 1727096420.92487: in VariableManager get_vars() 24134 1727096420.92517: Calling all_inventory to load vars for managed_node1 24134 1727096420.92520: Calling groups_inventory to load vars for managed_node1 24134 1727096420.92522: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096420.92530: Calling all_plugins_play to load vars for managed_node1 24134 1727096420.92532: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096420.92534: Calling groups_plugins_play to load vars for managed_node1 24134 1727096420.93066: done sending task result for task 0afff68d-5257-1673-d3fc-000000000073 24134 1727096420.93072: WORKER PROCESS EXITING 24134 1727096420.93487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096420.94526: done with get_vars() 24134 1727096420.94551: done getting variables 24134 1727096420.94621: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:00:20 -0400 (0:00:00.672) 0:00:25.159 ****** 24134 1727096420.94653: 
entering _queue_task() for managed_node1/service 24134 1727096420.95019: worker is 1 (out of 1 available) 24134 1727096420.95031: exiting _queue_task() for managed_node1/service 24134 1727096420.95043: done queuing things up, now waiting for results queue to drain 24134 1727096420.95045: waiting for pending results... 24134 1727096420.95393: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24134 1727096420.95439: in run() - task 0afff68d-5257-1673-d3fc-000000000074 24134 1727096420.95456: variable 'ansible_search_path' from source: unknown 24134 1727096420.95459: variable 'ansible_search_path' from source: unknown 24134 1727096420.95535: calling self._execute() 24134 1727096420.95623: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096420.95627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096420.95636: variable 'omit' from source: magic vars 24134 1727096420.96038: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.96041: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.96156: variable 'connection_failed' from source: set_fact 24134 1727096420.96159: Evaluated conditional (not connection_failed): True 24134 1727096420.96275: variable 'ansible_distribution_major_version' from source: facts 24134 1727096420.96279: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096420.96347: variable 'connection_failed' from source: set_fact 24134 1727096420.96351: Evaluated conditional (not connection_failed): True 24134 1727096420.96476: variable 'network_provider' from source: set_fact 24134 1727096420.96479: Evaluated conditional (network_provider == "nm"): True 24134 1727096420.96577: variable '__network_wpa_supplicant_required' from source: role '' defaults 24134 1727096420.96673: variable '__network_ieee802_1x_connections_defined' from 
source: role '' defaults 24134 1727096420.96802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096420.99557: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096420.99855: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096420.99966: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096421.00072: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096421.00078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096421.00147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096421.00277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096421.00313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096421.00376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096421.00584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096421.00587: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096421.00590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096421.00592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096421.00594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096421.00597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096421.00629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096421.00661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096421.00694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096421.00744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 
1727096421.00780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096421.00971: variable 'network_connections' from source: play vars 24134 1727096421.00995: variable 'profile' from source: play vars 24134 1727096421.01086: variable 'profile' from source: play vars 24134 1727096421.01094: variable 'interface' from source: set_fact 24134 1727096421.01153: variable 'interface' from source: set_fact 24134 1727096421.01231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096421.01485: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096421.01489: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096421.01492: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096421.01517: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096421.01566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096421.01604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096421.01634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096421.01662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096421.01724: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096421.02175: variable 'network_connections' from source: play vars 24134 1727096421.02195: variable 'profile' from source: play vars 24134 1727096421.02269: variable 'profile' from source: play vars 24134 1727096421.02359: variable 'interface' from source: set_fact 24134 1727096421.02363: variable 'interface' from source: set_fact 24134 1727096421.02387: Evaluated conditional (__network_wpa_supplicant_required): False 24134 1727096421.02395: when evaluation is False, skipping this task 24134 1727096421.02402: _execute() done 24134 1727096421.02409: dumping result to json 24134 1727096421.02416: done dumping result, returning 24134 1727096421.02427: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-1673-d3fc-000000000074] 24134 1727096421.02437: sending task result for task 0afff68d-5257-1673-d3fc-000000000074 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24134 1727096421.02621: no more pending results, returning what we have 24134 1727096421.02625: results queue empty 24134 1727096421.02627: checking for any_errors_fatal 24134 1727096421.02651: done checking for any_errors_fatal 24134 1727096421.02652: checking for max_fail_percentage 24134 1727096421.02654: done checking for max_fail_percentage 24134 1727096421.02655: checking to see if all hosts have failed and the running result is not ok 24134 1727096421.02656: done checking to see if all hosts have failed 24134 1727096421.02656: getting the remaining hosts for this loop 24134 1727096421.02658: done getting the remaining hosts for this loop 24134 1727096421.02662: getting the next task for host 
managed_node1 24134 1727096421.02671: done getting next task for host managed_node1 24134 1727096421.02675: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24134 1727096421.02679: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096421.02693: getting variables 24134 1727096421.02695: in VariableManager get_vars() 24134 1727096421.02736: Calling all_inventory to load vars for managed_node1 24134 1727096421.02739: Calling groups_inventory to load vars for managed_node1 24134 1727096421.02742: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096421.02752: Calling all_plugins_play to load vars for managed_node1 24134 1727096421.02754: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096421.02757: Calling groups_plugins_play to load vars for managed_node1 24134 1727096421.03416: done sending task result for task 0afff68d-5257-1673-d3fc-000000000074 24134 1727096421.03420: WORKER PROCESS EXITING 24134 1727096421.04894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096421.06677: done with get_vars() 24134 1727096421.06705: done getting variables 24134 1727096421.06774: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 
2024 09:00:21 -0400 (0:00:00.121) 0:00:25.281 ****** 24134 1727096421.06804: entering _queue_task() for managed_node1/service 24134 1727096421.07160: worker is 1 (out of 1 available) 24134 1727096421.07177: exiting _queue_task() for managed_node1/service 24134 1727096421.07191: done queuing things up, now waiting for results queue to drain 24134 1727096421.07192: waiting for pending results... 24134 1727096421.07504: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 24134 1727096421.07625: in run() - task 0afff68d-5257-1673-d3fc-000000000075 24134 1727096421.07653: variable 'ansible_search_path' from source: unknown 24134 1727096421.07662: variable 'ansible_search_path' from source: unknown 24134 1727096421.07705: calling self._execute() 24134 1727096421.07815: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.07828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.07843: variable 'omit' from source: magic vars 24134 1727096421.08294: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.08298: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.08378: variable 'connection_failed' from source: set_fact 24134 1727096421.08389: Evaluated conditional (not connection_failed): True 24134 1727096421.08510: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.08525: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.08636: variable 'connection_failed' from source: set_fact 24134 1727096421.08648: Evaluated conditional (not connection_failed): True 24134 1727096421.08837: variable 'network_provider' from source: set_fact 24134 1727096421.08840: Evaluated conditional (network_provider == "initscripts"): False 24134 1727096421.08842: when evaluation is False, skipping this task 24134 1727096421.08843: _execute() 
done 24134 1727096421.08845: dumping result to json 24134 1727096421.08847: done dumping result, returning 24134 1727096421.08850: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-1673-d3fc-000000000075] 24134 1727096421.08851: sending task result for task 0afff68d-5257-1673-d3fc-000000000075 24134 1727096421.08919: done sending task result for task 0afff68d-5257-1673-d3fc-000000000075 24134 1727096421.08922: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096421.08992: no more pending results, returning what we have 24134 1727096421.08996: results queue empty 24134 1727096421.08997: checking for any_errors_fatal 24134 1727096421.09008: done checking for any_errors_fatal 24134 1727096421.09009: checking for max_fail_percentage 24134 1727096421.09011: done checking for max_fail_percentage 24134 1727096421.09012: checking to see if all hosts have failed and the running result is not ok 24134 1727096421.09013: done checking to see if all hosts have failed 24134 1727096421.09014: getting the remaining hosts for this loop 24134 1727096421.09016: done getting the remaining hosts for this loop 24134 1727096421.09020: getting the next task for host managed_node1 24134 1727096421.09026: done getting next task for host managed_node1 24134 1727096421.09029: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24134 1727096421.09034: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096421.09049: getting variables 24134 1727096421.09051: in VariableManager get_vars() 24134 1727096421.09091: Calling all_inventory to load vars for managed_node1 24134 1727096421.09094: Calling groups_inventory to load vars for managed_node1 24134 1727096421.09097: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096421.09108: Calling all_plugins_play to load vars for managed_node1 24134 1727096421.09110: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096421.09113: Calling groups_plugins_play to load vars for managed_node1 24134 1727096421.10774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096421.12407: done with get_vars() 24134 1727096421.12443: done getting variables 24134 1727096421.12510: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:00:21 -0400 (0:00:00.057) 0:00:25.338 ****** 24134 1727096421.12546: entering _queue_task() for managed_node1/copy 24134 1727096421.13099: worker is 1 (out of 1 available) 24134 1727096421.13110: exiting _queue_task() for managed_node1/copy 24134 1727096421.13120: done queuing things up, now waiting for results queue to drain 24134 1727096421.13122: waiting for pending results... 
24134 1727096421.13361: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24134 1727096421.13366: in run() - task 0afff68d-5257-1673-d3fc-000000000076 24134 1727096421.13372: variable 'ansible_search_path' from source: unknown 24134 1727096421.13375: variable 'ansible_search_path' from source: unknown 24134 1727096421.13401: calling self._execute() 24134 1727096421.13509: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.13522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.13536: variable 'omit' from source: magic vars 24134 1727096421.13915: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.13933: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.14050: variable 'connection_failed' from source: set_fact 24134 1727096421.14060: Evaluated conditional (not connection_failed): True 24134 1727096421.14180: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.14191: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.14297: variable 'connection_failed' from source: set_fact 24134 1727096421.14329: Evaluated conditional (not connection_failed): True 24134 1727096421.14427: variable 'network_provider' from source: set_fact 24134 1727096421.14446: Evaluated conditional (network_provider == "initscripts"): False 24134 1727096421.14548: when evaluation is False, skipping this task 24134 1727096421.14551: _execute() done 24134 1727096421.14554: dumping result to json 24134 1727096421.14557: done dumping result, returning 24134 1727096421.14560: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-1673-d3fc-000000000076] 24134 1727096421.14563: sending task result for task 
0afff68d-5257-1673-d3fc-000000000076 24134 1727096421.14631: done sending task result for task 0afff68d-5257-1673-d3fc-000000000076 24134 1727096421.14633: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24134 1727096421.14696: no more pending results, returning what we have 24134 1727096421.14701: results queue empty 24134 1727096421.14702: checking for any_errors_fatal 24134 1727096421.14709: done checking for any_errors_fatal 24134 1727096421.14710: checking for max_fail_percentage 24134 1727096421.14712: done checking for max_fail_percentage 24134 1727096421.14713: checking to see if all hosts have failed and the running result is not ok 24134 1727096421.14714: done checking to see if all hosts have failed 24134 1727096421.14715: getting the remaining hosts for this loop 24134 1727096421.14717: done getting the remaining hosts for this loop 24134 1727096421.14721: getting the next task for host managed_node1 24134 1727096421.14728: done getting next task for host managed_node1 24134 1727096421.14732: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24134 1727096421.14735: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096421.14751: getting variables 24134 1727096421.14753: in VariableManager get_vars() 24134 1727096421.14797: Calling all_inventory to load vars for managed_node1 24134 1727096421.14801: Calling groups_inventory to load vars for managed_node1 24134 1727096421.14803: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096421.14815: Calling all_plugins_play to load vars for managed_node1 24134 1727096421.14818: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096421.14821: Calling groups_plugins_play to load vars for managed_node1 24134 1727096421.16655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096421.18285: done with get_vars() 24134 1727096421.18313: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:00:21 -0400 (0:00:00.058) 0:00:25.397 ****** 24134 1727096421.18408: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24134 1727096421.18763: worker is 1 (out of 1 available) 24134 1727096421.18906: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24134 1727096421.18916: done queuing things up, now waiting for results queue to drain 24134 1727096421.18917: waiting for pending results... 
24134 1727096421.19130: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24134 1727096421.19214: in run() - task 0afff68d-5257-1673-d3fc-000000000077 24134 1727096421.19241: variable 'ansible_search_path' from source: unknown 24134 1727096421.19336: variable 'ansible_search_path' from source: unknown 24134 1727096421.19339: calling self._execute() 24134 1727096421.19397: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.19410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.19423: variable 'omit' from source: magic vars 24134 1727096421.19814: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.19832: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.19951: variable 'connection_failed' from source: set_fact 24134 1727096421.19955: Evaluated conditional (not connection_failed): True 24134 1727096421.20036: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.20040: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.20115: variable 'connection_failed' from source: set_fact 24134 1727096421.20119: Evaluated conditional (not connection_failed): True 24134 1727096421.20124: variable 'omit' from source: magic vars 24134 1727096421.20159: variable 'omit' from source: magic vars 24134 1727096421.20281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096421.21902: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096421.21907: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096421.21910: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096421.21912: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096421.21939: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096421.22173: variable 'network_provider' from source: set_fact 24134 1727096421.22177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096421.22199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096421.22227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096421.22274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096421.22294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096421.22371: variable 'omit' from source: magic vars 24134 1727096421.22486: variable 'omit' from source: magic vars 24134 1727096421.22591: variable 'network_connections' from source: play vars 24134 1727096421.22608: variable 'profile' from source: play vars 24134 1727096421.22676: variable 'profile' from source: play vars 24134 1727096421.22687: variable 'interface' from source: set_fact 24134 1727096421.22757: variable 'interface' from 
source: set_fact 24134 1727096421.22875: variable 'omit' from source: magic vars 24134 1727096421.22881: variable '__lsr_ansible_managed' from source: task vars 24134 1727096421.22926: variable '__lsr_ansible_managed' from source: task vars 24134 1727096421.23124: Loaded config def from plugin (lookup/template) 24134 1727096421.23128: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24134 1727096421.23149: File lookup term: get_ansible_managed.j2 24134 1727096421.23152: variable 'ansible_search_path' from source: unknown 24134 1727096421.23157: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24134 1727096421.23174: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24134 1727096421.23186: variable 'ansible_search_path' from source: unknown 24134 1727096421.27324: variable 'ansible_managed' from source: unknown 
24134 1727096421.27403: variable 'omit' from source: magic vars 24134 1727096421.27431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096421.27452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096421.27466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096421.27482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096421.27491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096421.27514: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096421.27518: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.27520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.27586: Set connection var ansible_shell_executable to /bin/sh 24134 1727096421.27590: Set connection var ansible_pipelining to False 24134 1727096421.27595: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096421.27604: Set connection var ansible_timeout to 10 24134 1727096421.27607: Set connection var ansible_connection to ssh 24134 1727096421.27609: Set connection var ansible_shell_type to sh 24134 1727096421.27627: variable 'ansible_shell_executable' from source: unknown 24134 1727096421.27629: variable 'ansible_connection' from source: unknown 24134 1727096421.27632: variable 'ansible_module_compression' from source: unknown 24134 1727096421.27634: variable 'ansible_shell_type' from source: unknown 24134 1727096421.27637: variable 'ansible_shell_executable' from source: unknown 24134 1727096421.27648: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.27651: variable 
'ansible_pipelining' from source: unknown 24134 1727096421.27653: variable 'ansible_timeout' from source: unknown 24134 1727096421.27655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.27742: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096421.27751: variable 'omit' from source: magic vars 24134 1727096421.27757: starting attempt loop 24134 1727096421.27759: running the handler 24134 1727096421.27773: _low_level_execute_command(): starting 24134 1727096421.27778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096421.28487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096421.28534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096421.28556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
24134 1727096421.28581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096421.28690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096421.30454: stdout chunk (state=3): >>>/root <<< 24134 1727096421.30552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096421.30586: stderr chunk (state=3): >>><<< 24134 1727096421.30589: stdout chunk (state=3): >>><<< 24134 1727096421.30609: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096421.30620: _low_level_execute_command(): starting 24134 1727096421.30626: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798 `" && echo ansible-tmp-1727096421.3061008-25388-188523186610798="` echo /root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798 `" ) && sleep 0' 24134 1727096421.31064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096421.31069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096421.31072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24134 1727096421.31074: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096421.31077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096421.31125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096421.31129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096421.31229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096421.33196: stdout chunk (state=3): >>>ansible-tmp-1727096421.3061008-25388-188523186610798=/root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798 <<< 24134 
1727096421.33303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096421.33335: stderr chunk (state=3): >>><<< 24134 1727096421.33337: stdout chunk (state=3): >>><<< 24134 1727096421.33349: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096421.3061008-25388-188523186610798=/root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096421.33477: variable 'ansible_module_compression' from source: unknown 24134 1727096421.33482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 24134 1727096421.33484: variable 'ansible_facts' from source: unknown 24134 1727096421.33527: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/AnsiballZ_network_connections.py 24134 1727096421.33629: Sending initial data 24134 1727096421.33633: Sent initial data (168 bytes) 24134 1727096421.34042: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096421.34049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096421.34075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096421.34078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096421.34080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096421.34135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096421.34138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096421.34212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096421.35876: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096421.35941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096421.36015: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpdvrvy4s7 /root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/AnsiballZ_network_connections.py <<< 24134 1727096421.36018: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/AnsiballZ_network_connections.py" <<< 24134 1727096421.36081: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpdvrvy4s7" to remote "/root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/AnsiballZ_network_connections.py" <<< 24134 1727096421.37156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096421.37196: stderr chunk (state=3): >>><<< 24134 1727096421.37206: stdout chunk (state=3): >>><<< 24134 1727096421.37274: done transferring module to remote 24134 1727096421.37347: _low_level_execute_command(): starting 24134 1727096421.37351: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/ /root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/AnsiballZ_network_connections.py && sleep 0' 24134 1727096421.37958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096421.37979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096421.38036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096421.38107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096421.38147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096421.38182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096421.38245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096421.40172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096421.40176: stdout chunk (state=3): >>><<< 24134 1727096421.40179: stderr chunk (state=3): >>><<< 24134 1727096421.40283: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096421.40286: _low_level_execute_command(): starting 24134 1727096421.40290: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/AnsiballZ_network_connections.py && sleep 0' 24134 1727096421.40938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096421.40952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096421.40979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096421.41009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096421.41062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 <<< 24134 1727096421.41079: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096421.41156: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096421.41200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096421.41221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096421.41239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096421.41374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096421.68109: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24134 1727096421.70346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096421.70350: stdout chunk (state=3): >>><<< 24134 1727096421.70352: stderr chunk (state=3): >>><<< 24134 1727096421.70355: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096421.70358: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096421.70360: _low_level_execute_command(): starting 24134 1727096421.70423: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096421.3061008-25388-188523186610798/ > /dev/null 2>&1 && sleep 0' 24134 1727096421.71381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096421.71385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096421.71388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096421.71390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096421.71392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096421.71394: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096421.71396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 
1727096421.71398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096421.71399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096421.71401: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24134 1727096421.71403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096421.71405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096421.71598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096421.71601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096421.71604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096421.71606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096421.73447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096421.73499: stderr chunk (state=3): >>><<< 24134 1727096421.73505: stdout chunk (state=3): >>><<< 24134 1727096421.73695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096421.73702: handler run complete 24134 1727096421.73704: attempt loop complete, returning result 24134 1727096421.73706: _execute() done 24134 1727096421.73707: dumping result to json 24134 1727096421.73709: done dumping result, returning 24134 1727096421.73710: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-1673-d3fc-000000000077] 24134 1727096421.73712: sending task result for task 0afff68d-5257-1673-d3fc-000000000077 24134 1727096421.73778: done sending task result for task 0afff68d-5257-1673-d3fc-000000000077 24134 1727096421.73781: WORKER PROCESS EXITING ok: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: 24134 1727096421.73872: no more pending results, returning what we have 24134 1727096421.73876: results queue empty 24134 1727096421.73877: checking for any_errors_fatal 24134 1727096421.73882: done checking for any_errors_fatal 24134 1727096421.73883: checking for max_fail_percentage 24134 1727096421.73884: done checking for 
max_fail_percentage 24134 1727096421.73885: checking to see if all hosts have failed and the running result is not ok 24134 1727096421.73886: done checking to see if all hosts have failed 24134 1727096421.73886: getting the remaining hosts for this loop 24134 1727096421.73888: done getting the remaining hosts for this loop 24134 1727096421.73892: getting the next task for host managed_node1 24134 1727096421.73897: done getting next task for host managed_node1 24134 1727096421.73900: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24134 1727096421.73902: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096421.73910: getting variables 24134 1727096421.73912: in VariableManager get_vars() 24134 1727096421.73944: Calling all_inventory to load vars for managed_node1 24134 1727096421.73946: Calling groups_inventory to load vars for managed_node1 24134 1727096421.73948: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096421.73956: Calling all_plugins_play to load vars for managed_node1 24134 1727096421.73959: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096421.73961: Calling groups_plugins_play to load vars for managed_node1 24134 1727096421.75720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096421.76600: done with get_vars() 24134 1727096421.76620: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:00:21 -0400 (0:00:00.582) 0:00:25.980 ****** 24134 1727096421.76683: entering 
_queue_task() for managed_node1/fedora.linux_system_roles.network_state 24134 1727096421.76945: worker is 1 (out of 1 available) 24134 1727096421.76958: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24134 1727096421.76972: done queuing things up, now waiting for results queue to drain 24134 1727096421.76974: waiting for pending results... 24134 1727096421.77149: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 24134 1727096421.77222: in run() - task 0afff68d-5257-1673-d3fc-000000000078 24134 1727096421.77244: variable 'ansible_search_path' from source: unknown 24134 1727096421.77272: variable 'ansible_search_path' from source: unknown 24134 1727096421.77291: calling self._execute() 24134 1727096421.77454: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.77458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.77461: variable 'omit' from source: magic vars 24134 1727096421.77976: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.77980: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.77983: variable 'connection_failed' from source: set_fact 24134 1727096421.77986: Evaluated conditional (not connection_failed): True 24134 1727096421.78059: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.78076: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.78195: variable 'connection_failed' from source: set_fact 24134 1727096421.78250: Evaluated conditional (not connection_failed): True 24134 1727096421.78437: variable 'network_state' from source: role '' defaults 24134 1727096421.78452: Evaluated conditional (network_state != {}): False 24134 1727096421.78455: when evaluation is False, skipping this task 24134 1727096421.78458: _execute() done 24134 
1727096421.78484: dumping result to json 24134 1727096421.78495: done dumping result, returning 24134 1727096421.78497: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-1673-d3fc-000000000078] 24134 1727096421.78500: sending task result for task 0afff68d-5257-1673-d3fc-000000000078 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096421.78655: no more pending results, returning what we have 24134 1727096421.78660: results queue empty 24134 1727096421.78661: checking for any_errors_fatal 24134 1727096421.78677: done checking for any_errors_fatal 24134 1727096421.78678: checking for max_fail_percentage 24134 1727096421.78680: done checking for max_fail_percentage 24134 1727096421.78681: checking to see if all hosts have failed and the running result is not ok 24134 1727096421.78682: done checking to see if all hosts have failed 24134 1727096421.78682: getting the remaining hosts for this loop 24134 1727096421.78684: done getting the remaining hosts for this loop 24134 1727096421.78688: getting the next task for host managed_node1 24134 1727096421.78693: done getting next task for host managed_node1 24134 1727096421.78697: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24134 1727096421.78700: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096421.78714: getting variables 24134 1727096421.78715: in VariableManager get_vars() 24134 1727096421.78764: Calling all_inventory to load vars for managed_node1 24134 1727096421.78770: Calling groups_inventory to load vars for managed_node1 24134 1727096421.78773: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096421.78779: done sending task result for task 0afff68d-5257-1673-d3fc-000000000078 24134 1727096421.78781: WORKER PROCESS EXITING 24134 1727096421.78789: Calling all_plugins_play to load vars for managed_node1 24134 1727096421.78792: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096421.78794: Calling groups_plugins_play to load vars for managed_node1 24134 1727096421.79678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096421.80628: done with get_vars() 24134 1727096421.80659: done getting variables 24134 1727096421.80723: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:00:21 -0400 (0:00:00.040) 0:00:26.020 ****** 24134 1727096421.80754: entering _queue_task() for managed_node1/debug 24134 1727096421.81124: worker is 1 (out of 1 available) 24134 1727096421.81137: exiting _queue_task() for managed_node1/debug 24134 1727096421.81150: done queuing things up, now waiting for results queue to drain 24134 1727096421.81151: waiting for pending results... 
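The skip recorded just above ("Evaluated conditional (network_state != {}): False ... skipping this task") follows from a `when:` guard on the role task. A hedged reconstruction, not the actual role source: the module name `fedora.linux_system_roles.network_state` comes from the log's `_queue_task()` entry, but the argument body shown here is an assumption.

```yaml
# Sketch only -- not the real task from roles/network/tasks/main.yml:171.
# network_state defaults to {}, so the conditional evaluates False and
# the task is skipped, exactly as the log records.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"  # hypothetical argument
  when: network_state != {}
```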
24134 1727096421.81450: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24134 1727096421.81657: in run() - task 0afff68d-5257-1673-d3fc-000000000079 24134 1727096421.81661: variable 'ansible_search_path' from source: unknown 24134 1727096421.81663: variable 'ansible_search_path' from source: unknown 24134 1727096421.81666: calling self._execute() 24134 1727096421.81691: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.81701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.81710: variable 'omit' from source: magic vars 24134 1727096421.82091: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.82111: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.82220: variable 'connection_failed' from source: set_fact 24134 1727096421.82223: Evaluated conditional (not connection_failed): True 24134 1727096421.82437: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.82441: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.82444: variable 'connection_failed' from source: set_fact 24134 1727096421.82446: Evaluated conditional (not connection_failed): True 24134 1727096421.82448: variable 'omit' from source: magic vars 24134 1727096421.82488: variable 'omit' from source: magic vars 24134 1727096421.82525: variable 'omit' from source: magic vars 24134 1727096421.82571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096421.82612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096421.82632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096421.82654: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096421.82663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096421.82764: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096421.82773: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.82777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.82814: Set connection var ansible_shell_executable to /bin/sh 24134 1727096421.82818: Set connection var ansible_pipelining to False 24134 1727096421.82825: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096421.82876: Set connection var ansible_timeout to 10 24134 1727096421.82880: Set connection var ansible_connection to ssh 24134 1727096421.82882: Set connection var ansible_shell_type to sh 24134 1727096421.82884: variable 'ansible_shell_executable' from source: unknown 24134 1727096421.82890: variable 'ansible_connection' from source: unknown 24134 1727096421.82894: variable 'ansible_module_compression' from source: unknown 24134 1727096421.82897: variable 'ansible_shell_type' from source: unknown 24134 1727096421.82899: variable 'ansible_shell_executable' from source: unknown 24134 1727096421.82901: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.82904: variable 'ansible_pipelining' from source: unknown 24134 1727096421.82907: variable 'ansible_timeout' from source: unknown 24134 1727096421.82910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.83094: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096421.83098: variable 'omit' from source: magic vars 24134 1727096421.83100: starting attempt loop 24134 1727096421.83103: running the handler 24134 1727096421.83178: variable '__network_connections_result' from source: set_fact 24134 1727096421.83228: handler run complete 24134 1727096421.83249: attempt loop complete, returning result 24134 1727096421.83253: _execute() done 24134 1727096421.83256: dumping result to json 24134 1727096421.83258: done dumping result, returning 24134 1727096421.83272: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-1673-d3fc-000000000079] 24134 1727096421.83275: sending task result for task 0afff68d-5257-1673-d3fc-000000000079 24134 1727096421.83445: done sending task result for task 0afff68d-5257-1673-d3fc-000000000079 24134 1727096421.83449: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 24134 1727096421.83515: no more pending results, returning what we have 24134 1727096421.83518: results queue empty 24134 1727096421.83520: checking for any_errors_fatal 24134 1727096421.83527: done checking for any_errors_fatal 24134 1727096421.83529: checking for max_fail_percentage 24134 1727096421.83530: done checking for max_fail_percentage 24134 1727096421.83531: checking to see if all hosts have failed and the running result is not ok 24134 1727096421.83532: done checking to see if all hosts have failed 24134 1727096421.83533: getting the remaining hosts for this loop 24134 1727096421.83534: done getting the remaining hosts for this loop 24134 1727096421.83538: getting the next task for host managed_node1 24134 1727096421.83543: done getting next task for host managed_node1 24134 1727096421.83546: ^ 
task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24134 1727096421.83549: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096421.83558: getting variables 24134 1727096421.83560: in VariableManager get_vars() 24134 1727096421.83599: Calling all_inventory to load vars for managed_node1 24134 1727096421.83602: Calling groups_inventory to load vars for managed_node1 24134 1727096421.83605: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096421.83615: Calling all_plugins_play to load vars for managed_node1 24134 1727096421.83617: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096421.83620: Calling groups_plugins_play to load vars for managed_node1 24134 1727096421.85129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096421.86810: done with get_vars() 24134 1727096421.86833: done getting variables 24134 1727096421.86897: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:00:21 -0400 (0:00:00.061) 0:00:26.082 ****** 24134 1727096421.86930: entering _queue_task() for managed_node1/debug 24134 1727096421.87278: worker is 1 (out of 1 available) 24134 1727096421.87292: exiting 
_queue_task() for managed_node1/debug 24134 1727096421.87304: done queuing things up, now waiting for results queue to drain 24134 1727096421.87306: waiting for pending results... 24134 1727096421.87601: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24134 1727096421.87807: in run() - task 0afff68d-5257-1673-d3fc-00000000007a 24134 1727096421.87811: variable 'ansible_search_path' from source: unknown 24134 1727096421.87814: variable 'ansible_search_path' from source: unknown 24134 1727096421.87816: calling self._execute() 24134 1727096421.87828: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.87835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.87844: variable 'omit' from source: magic vars 24134 1727096421.88211: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.88223: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.88335: variable 'connection_failed' from source: set_fact 24134 1727096421.88347: Evaluated conditional (not connection_failed): True 24134 1727096421.88463: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.88467: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.88571: variable 'connection_failed' from source: set_fact 24134 1727096421.88574: Evaluated conditional (not connection_failed): True 24134 1727096421.88580: variable 'omit' from source: magic vars 24134 1727096421.88616: variable 'omit' from source: magic vars 24134 1727096421.88651: variable 'omit' from source: magic vars 24134 1727096421.88697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096421.88731: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 
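The two debug tasks being executed here (task paths main.yml:177 and main.yml:181) print variables set earlier via `set_fact`. A hedged reconstruction of what they likely look like: the task names and the printed variable names are taken verbatim from the log's task headers and `ok:` output, but the exact YAML is an assumption.

```yaml
# Sketch inferred from the log output; the "ok:" result bodies below show
# exactly the keys a `debug: var:` task would print.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
```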
24134 1727096421.88750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096421.88785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096421.88788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096421.88875: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096421.88878: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.88881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.88919: Set connection var ansible_shell_executable to /bin/sh 24134 1727096421.88925: Set connection var ansible_pipelining to False 24134 1727096421.88930: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096421.88940: Set connection var ansible_timeout to 10 24134 1727096421.88943: Set connection var ansible_connection to ssh 24134 1727096421.88945: Set connection var ansible_shell_type to sh 24134 1727096421.88973: variable 'ansible_shell_executable' from source: unknown 24134 1727096421.88976: variable 'ansible_connection' from source: unknown 24134 1727096421.88978: variable 'ansible_module_compression' from source: unknown 24134 1727096421.88981: variable 'ansible_shell_type' from source: unknown 24134 1727096421.88983: variable 'ansible_shell_executable' from source: unknown 24134 1727096421.88985: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.88988: variable 'ansible_pipelining' from source: unknown 24134 1727096421.88990: variable 'ansible_timeout' from source: unknown 24134 1727096421.89104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.89119: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096421.89131: variable 'omit' from source: magic vars 24134 1727096421.89136: starting attempt loop 24134 1727096421.89139: running the handler 24134 1727096421.89188: variable '__network_connections_result' from source: set_fact 24134 1727096421.89258: variable '__network_connections_result' from source: set_fact 24134 1727096421.89361: handler run complete 24134 1727096421.89385: attempt loop complete, returning result 24134 1727096421.89388: _execute() done 24134 1727096421.89391: dumping result to json 24134 1727096421.89393: done dumping result, returning 24134 1727096421.89403: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-1673-d3fc-00000000007a] 24134 1727096421.89406: sending task result for task 0afff68d-5257-1673-d3fc-00000000007a 24134 1727096421.89611: done sending task result for task 0afff68d-5257-1673-d3fc-00000000007a 24134 1727096421.89614: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 24134 1727096421.89725: no more pending results, returning what we have 24134 1727096421.89728: results queue empty 24134 1727096421.89729: checking for any_errors_fatal 24134 1727096421.89735: done checking for any_errors_fatal 24134 1727096421.89736: checking for max_fail_percentage 24134 1727096421.89737: done checking for 
max_fail_percentage 24134 1727096421.89738: checking to see if all hosts have failed and the running result is not ok 24134 1727096421.89738: done checking to see if all hosts have failed 24134 1727096421.89739: getting the remaining hosts for this loop 24134 1727096421.89740: done getting the remaining hosts for this loop 24134 1727096421.89744: getting the next task for host managed_node1 24134 1727096421.89749: done getting next task for host managed_node1 24134 1727096421.89753: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24134 1727096421.89755: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096421.89764: getting variables 24134 1727096421.89766: in VariableManager get_vars() 24134 1727096421.89799: Calling all_inventory to load vars for managed_node1 24134 1727096421.89803: Calling groups_inventory to load vars for managed_node1 24134 1727096421.89805: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096421.89813: Calling all_plugins_play to load vars for managed_node1 24134 1727096421.89815: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096421.89817: Calling groups_plugins_play to load vars for managed_node1 24134 1727096421.91190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096421.92746: done with get_vars() 24134 1727096421.92770: done getting variables 24134 1727096421.92827: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:00:21 -0400 (0:00:00.059) 0:00:26.142 ****** 24134 1727096421.92860: entering _queue_task() for managed_node1/debug 24134 1727096421.93155: worker is 1 (out of 1 available) 24134 1727096421.93273: exiting _queue_task() for managed_node1/debug 24134 1727096421.93284: done queuing things up, now waiting for results queue to drain 24134 1727096421.93286: waiting for pending results... 24134 1727096421.93588: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24134 1727096421.93593: in run() - task 0afff68d-5257-1673-d3fc-00000000007b 24134 1727096421.93596: variable 'ansible_search_path' from source: unknown 24134 1727096421.93598: variable 'ansible_search_path' from source: unknown 24134 1727096421.93601: calling self._execute() 24134 1727096421.93691: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096421.93694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096421.93705: variable 'omit' from source: magic vars 24134 1727096421.94074: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.94084: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096421.94196: variable 'connection_failed' from source: set_fact 24134 1727096421.94200: Evaluated conditional (not connection_failed): True 24134 1727096421.94317: variable 'ansible_distribution_major_version' from source: facts 24134 1727096421.94322: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 
1727096421.94426: variable 'connection_failed' from source: set_fact 24134 1727096421.94429: Evaluated conditional (not connection_failed): True 24134 1727096421.94549: variable 'network_state' from source: role '' defaults 24134 1727096421.94556: Evaluated conditional (network_state != {}): False 24134 1727096421.94559: when evaluation is False, skipping this task 24134 1727096421.94562: _execute() done 24134 1727096421.94566: dumping result to json 24134 1727096421.94573: done dumping result, returning 24134 1727096421.94580: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-1673-d3fc-00000000007b] 24134 1727096421.94586: sending task result for task 0afff68d-5257-1673-d3fc-00000000007b 24134 1727096421.94684: done sending task result for task 0afff68d-5257-1673-d3fc-00000000007b 24134 1727096421.94687: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 24134 1727096421.94810: no more pending results, returning what we have 24134 1727096421.94813: results queue empty 24134 1727096421.94814: checking for any_errors_fatal 24134 1727096421.94821: done checking for any_errors_fatal 24134 1727096421.94821: checking for max_fail_percentage 24134 1727096421.94823: done checking for max_fail_percentage 24134 1727096421.94824: checking to see if all hosts have failed and the running result is not ok 24134 1727096421.94824: done checking to see if all hosts have failed 24134 1727096421.94825: getting the remaining hosts for this loop 24134 1727096421.94826: done getting the remaining hosts for this loop 24134 1727096421.94830: getting the next task for host managed_node1 24134 1727096421.94835: done getting next task for host managed_node1 24134 1727096421.94839: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24134 1727096421.94841: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096421.94853: getting variables 24134 1727096421.94855: in VariableManager get_vars() 24134 1727096421.94890: Calling all_inventory to load vars for managed_node1 24134 1727096421.94893: Calling groups_inventory to load vars for managed_node1 24134 1727096421.94896: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096421.94905: Calling all_plugins_play to load vars for managed_node1 24134 1727096421.94908: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096421.94910: Calling groups_plugins_play to load vars for managed_node1 24134 1727096422.00119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096422.01259: done with get_vars() 24134 1727096422.01282: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:00:22 -0400 (0:00:00.084) 0:00:26.226 ****** 24134 1727096422.01338: entering _queue_task() for managed_node1/ping 24134 1727096422.01598: worker is 1 (out of 1 available) 24134 1727096422.01613: exiting _queue_task() for managed_node1/ping 24134 1727096422.01625: done queuing things up, now waiting for results queue to drain 24134 1727096422.01627: waiting for pending results... 
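The "Re-test connectivity" task queued here (task path main.yml:192) maps to the `ping` module, per the `entering _queue_task() for managed_node1/ping` entry. A minimal sketch of an equivalent task; the real role task may carry additional arguments or conditionals not visible in this log.

```yaml
# Inferred from "_queue_task() for managed_node1/ping" -- ping performs a
# full module round-trip over the connection, which is why the log that
# follows shows _low_level_execute_command() probing the remote home dir
# and creating a temp directory under ~/.ansible/tmp.
- name: Re-test connectivity
  ansible.builtin.ping:
```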
24134 1727096422.01808: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 24134 1727096422.01887: in run() - task 0afff68d-5257-1673-d3fc-00000000007c 24134 1727096422.01898: variable 'ansible_search_path' from source: unknown 24134 1727096422.01901: variable 'ansible_search_path' from source: unknown 24134 1727096422.01931: calling self._execute() 24134 1727096422.02012: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096422.02016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096422.02025: variable 'omit' from source: magic vars 24134 1727096422.02307: variable 'ansible_distribution_major_version' from source: facts 24134 1727096422.02317: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096422.02419: variable 'connection_failed' from source: set_fact 24134 1727096422.02423: Evaluated conditional (not connection_failed): True 24134 1727096422.02580: variable 'ansible_distribution_major_version' from source: facts 24134 1727096422.02583: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096422.02641: variable 'connection_failed' from source: set_fact 24134 1727096422.02646: Evaluated conditional (not connection_failed): True 24134 1727096422.02663: variable 'omit' from source: magic vars 24134 1727096422.02694: variable 'omit' from source: magic vars 24134 1727096422.02732: variable 'omit' from source: magic vars 24134 1727096422.02797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096422.02821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096422.02840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096422.02857: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096422.02904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096422.02907: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096422.02910: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096422.02912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096422.03013: Set connection var ansible_shell_executable to /bin/sh 24134 1727096422.03016: Set connection var ansible_pipelining to False 24134 1727096422.03019: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096422.03074: Set connection var ansible_timeout to 10 24134 1727096422.03078: Set connection var ansible_connection to ssh 24134 1727096422.03080: Set connection var ansible_shell_type to sh 24134 1727096422.03082: variable 'ansible_shell_executable' from source: unknown 24134 1727096422.03084: variable 'ansible_connection' from source: unknown 24134 1727096422.03087: variable 'ansible_module_compression' from source: unknown 24134 1727096422.03089: variable 'ansible_shell_type' from source: unknown 24134 1727096422.03092: variable 'ansible_shell_executable' from source: unknown 24134 1727096422.03094: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096422.03096: variable 'ansible_pipelining' from source: unknown 24134 1727096422.03098: variable 'ansible_timeout' from source: unknown 24134 1727096422.03101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096422.03300: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096422.03305: variable 'omit' from source: magic vars 24134 1727096422.03307: starting attempt loop 24134 1727096422.03310: running the handler 24134 1727096422.03312: _low_level_execute_command(): starting 24134 1727096422.03314: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096422.03906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096422.03934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.03976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096422.03992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.04075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.05850: stdout chunk (state=3): >>>/root <<< 24134 1727096422.05943: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 24134 1727096422.05984: stderr chunk (state=3): >>><<< 24134 1727096422.05987: stdout chunk (state=3): >>><<< 24134 1727096422.06009: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096422.06020: _low_level_execute_command(): starting 24134 1727096422.06026: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204 `" && echo ansible-tmp-1727096422.0600865-25429-21683510510204="` echo /root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204 `" ) && sleep 0' 24134 1727096422.06452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096422.06464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.06472: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096422.06475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.06516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096422.06523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096422.06527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.06593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.08894: stdout chunk (state=3): >>>ansible-tmp-1727096422.0600865-25429-21683510510204=/root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204 <<< 24134 1727096422.08897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096422.08924: stderr chunk (state=3): >>><<< 24134 1727096422.08927: stdout chunk (state=3): >>><<< 24134 1727096422.08950: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096422.0600865-25429-21683510510204=/root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096422.09024: variable 'ansible_module_compression' from source: unknown 24134 1727096422.09275: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 24134 1727096422.09279: variable 'ansible_facts' from source: unknown 24134 1727096422.09429: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/AnsiballZ_ping.py 24134 1727096422.09694: Sending initial data 24134 1727096422.09703: Sent initial data (152 bytes) 24134 1727096422.10852: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096422.10884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.10910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096422.10963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.11000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096422.11024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096422.11066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.11124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.12796: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096422.12850: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096422.12937: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpp00j3zb_ /root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/AnsiballZ_ping.py <<< 24134 1727096422.12954: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/AnsiballZ_ping.py" <<< 24134 1727096422.13042: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 24134 1727096422.13071: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpp00j3zb_" to remote "/root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/AnsiballZ_ping.py" <<< 24134 1727096422.14142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096422.14146: stdout chunk (state=3): >>><<< 24134 1727096422.14149: stderr chunk (state=3): >>><<< 24134 1727096422.14152: done transferring module to remote 24134 1727096422.14155: _low_level_execute_command(): starting 24134 1727096422.14158: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/ /root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/AnsiballZ_ping.py && sleep 0' 24134 1727096422.14765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096422.14796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.14893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.16765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096422.16779: stdout chunk (state=3): >>><<< 24134 1727096422.16791: stderr chunk (state=3): >>><<< 24134 1727096422.16813: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096422.16822: _low_level_execute_command(): starting 24134 1727096422.16832: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/AnsiballZ_ping.py && sleep 0' 24134 1727096422.17410: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096422.17425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096422.17439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096422.17460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096422.17481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096422.17493: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096422.17507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.17526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096422.17615: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096422.17637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.17733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.32886: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24134 1727096422.34203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096422.34227: stderr chunk (state=3): >>><<< 24134 1727096422.34230: stdout chunk (state=3): >>><<< 24134 1727096422.34245: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096422.34263: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096422.34274: _low_level_execute_command(): starting 24134 1727096422.34278: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096422.0600865-25429-21683510510204/ > /dev/null 2>&1 && sleep 0' 24134 1727096422.34699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096422.34702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.34705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096422.34707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096422.34709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.34749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096422.34752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.34824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.36669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096422.36691: stderr chunk (state=3): >>><<< 24134 1727096422.36694: stdout chunk (state=3): >>><<< 24134 1727096422.36706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096422.36715: handler run complete 24134 1727096422.36725: attempt loop complete, returning result 24134 1727096422.36728: _execute() done 24134 1727096422.36731: dumping result to json 24134 1727096422.36733: done dumping result, returning 24134 1727096422.36742: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-1673-d3fc-00000000007c] 24134 1727096422.36746: sending task result for task 0afff68d-5257-1673-d3fc-00000000007c 24134 1727096422.36833: done sending task result for task 0afff68d-5257-1673-d3fc-00000000007c 24134 1727096422.36835: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 24134 1727096422.36895: no more pending results, returning what we have 24134 1727096422.36899: results queue empty 24134 1727096422.36899: checking for any_errors_fatal 24134 1727096422.36909: done checking for any_errors_fatal 24134 1727096422.36909: checking for max_fail_percentage 24134 1727096422.36911: done checking for max_fail_percentage 24134 1727096422.36912: checking to see if all hosts have failed and the running result is not ok 24134 1727096422.36912: done checking to see if all hosts have failed 24134 1727096422.36913: getting the remaining hosts for this loop 24134 1727096422.36915: done getting the remaining hosts for this loop 24134 1727096422.36918: getting the next task for host managed_node1 24134 1727096422.36925: done getting next task for host managed_node1 24134 1727096422.36927: ^ task is: TASK: meta (role_complete) 24134 1727096422.36929: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 24134 1727096422.36937: getting variables 24134 1727096422.36939: in VariableManager get_vars() 24134 1727096422.36979: Calling all_inventory to load vars for managed_node1 24134 1727096422.36982: Calling groups_inventory to load vars for managed_node1 24134 1727096422.36985: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096422.36994: Calling all_plugins_play to load vars for managed_node1 24134 1727096422.36996: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096422.36999: Calling groups_plugins_play to load vars for managed_node1 24134 1727096422.37812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096422.39146: done with get_vars() 24134 1727096422.39162: done getting variables 24134 1727096422.39219: done queuing things up, now waiting for results queue to drain 24134 1727096422.39221: results queue empty 24134 1727096422.39222: checking for any_errors_fatal 24134 1727096422.39223: done checking for any_errors_fatal 24134 1727096422.39224: checking for max_fail_percentage 24134 1727096422.39224: done checking for max_fail_percentage 24134 1727096422.39225: checking to see if all hosts have failed and the running result is not ok 24134 1727096422.39226: done checking to see if all hosts have failed 24134 1727096422.39226: getting the remaining hosts for this loop 24134 1727096422.39227: done getting the remaining hosts for this loop 24134 1727096422.39228: getting the next task for host managed_node1 24134 1727096422.39230: done getting next task for host managed_node1 24134 1727096422.39231: ^ task is: TASK: meta (flush_handlers) 24134 1727096422.39232: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 24134 1727096422.39235: getting variables 24134 1727096422.39235: in VariableManager get_vars() 24134 1727096422.39244: Calling all_inventory to load vars for managed_node1 24134 1727096422.39246: Calling groups_inventory to load vars for managed_node1 24134 1727096422.39248: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096422.39251: Calling all_plugins_play to load vars for managed_node1 24134 1727096422.39252: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096422.39254: Calling groups_plugins_play to load vars for managed_node1 24134 1727096422.39992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096422.40864: done with get_vars() 24134 1727096422.40883: done getting variables 24134 1727096422.40914: in VariableManager get_vars() 24134 1727096422.40923: Calling all_inventory to load vars for managed_node1 24134 1727096422.40925: Calling groups_inventory to load vars for managed_node1 24134 1727096422.40926: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096422.40929: Calling all_plugins_play to load vars for managed_node1 24134 1727096422.40930: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096422.40932: Calling groups_plugins_play to load vars for managed_node1 24134 1727096422.41574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096422.42442: done with get_vars() 24134 1727096422.42459: done queuing things up, now waiting for results queue to drain 24134 1727096422.42460: results queue empty 24134 1727096422.42461: checking for any_errors_fatal 24134 1727096422.42462: done checking for any_errors_fatal 24134 1727096422.42462: checking for max_fail_percentage 24134 1727096422.42463: done checking for max_fail_percentage 24134 
1727096422.42463: checking to see if all hosts have failed and the running result is not ok 24134 1727096422.42464: done checking to see if all hosts have failed 24134 1727096422.42464: getting the remaining hosts for this loop 24134 1727096422.42465: done getting the remaining hosts for this loop 24134 1727096422.42467: getting the next task for host managed_node1 24134 1727096422.42473: done getting next task for host managed_node1 24134 1727096422.42474: ^ task is: TASK: meta (flush_handlers) 24134 1727096422.42475: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096422.42477: getting variables 24134 1727096422.42477: in VariableManager get_vars() 24134 1727096422.42488: Calling all_inventory to load vars for managed_node1 24134 1727096422.42490: Calling groups_inventory to load vars for managed_node1 24134 1727096422.42491: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096422.42494: Calling all_plugins_play to load vars for managed_node1 24134 1727096422.42497: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096422.42499: Calling groups_plugins_play to load vars for managed_node1 24134 1727096422.43202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096422.44057: done with get_vars() 24134 1727096422.44074: done getting variables 24134 1727096422.44105: in VariableManager get_vars() 24134 1727096422.44112: Calling all_inventory to load vars for managed_node1 24134 1727096422.44114: Calling groups_inventory to load vars for managed_node1 24134 1727096422.44115: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096422.44119: Calling all_plugins_play to load vars for 
managed_node1 24134 1727096422.44121: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096422.44123: Calling groups_plugins_play to load vars for managed_node1 24134 1727096422.44757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096422.45702: done with get_vars() 24134 1727096422.45718: done queuing things up, now waiting for results queue to drain 24134 1727096422.45720: results queue empty 24134 1727096422.45720: checking for any_errors_fatal 24134 1727096422.45721: done checking for any_errors_fatal 24134 1727096422.45721: checking for max_fail_percentage 24134 1727096422.45722: done checking for max_fail_percentage 24134 1727096422.45722: checking to see if all hosts have failed and the running result is not ok 24134 1727096422.45723: done checking to see if all hosts have failed 24134 1727096422.45723: getting the remaining hosts for this loop 24134 1727096422.45724: done getting the remaining hosts for this loop 24134 1727096422.45726: getting the next task for host managed_node1 24134 1727096422.45728: done getting next task for host managed_node1 24134 1727096422.45728: ^ task is: None 24134 1727096422.45729: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096422.45730: done queuing things up, now waiting for results queue to drain 24134 1727096422.45730: results queue empty 24134 1727096422.45731: checking for any_errors_fatal 24134 1727096422.45731: done checking for any_errors_fatal 24134 1727096422.45731: checking for max_fail_percentage 24134 1727096422.45732: done checking for max_fail_percentage 24134 1727096422.45733: checking to see if all hosts have failed and the running result is not ok 24134 1727096422.45733: done checking to see if all hosts have failed 24134 1727096422.45734: getting the next task for host managed_node1 24134 1727096422.45735: done getting next task for host managed_node1 24134 1727096422.45736: ^ task is: None 24134 1727096422.45736: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096422.45776: in VariableManager get_vars() 24134 1727096422.45790: done with get_vars() 24134 1727096422.45794: in VariableManager get_vars() 24134 1727096422.45802: done with get_vars() 24134 1727096422.45805: variable 'omit' from source: magic vars 24134 1727096422.45891: variable 'profile' from source: play vars 24134 1727096422.45954: in VariableManager get_vars() 24134 1727096422.45965: done with get_vars() 24134 1727096422.45984: variable 'omit' from source: magic vars 24134 1727096422.46026: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 24134 1727096422.46459: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24134 1727096422.46482: getting the remaining hosts for this loop 24134 1727096422.46483: done getting the remaining hosts for this loop 24134 1727096422.46485: getting the next task for host managed_node1 24134 1727096422.46487: done getting next task for host managed_node1 24134 1727096422.46488: ^ task is: TASK: Gathering Facts 24134 1727096422.46489: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096422.46490: getting variables 24134 1727096422.46491: in VariableManager get_vars() 24134 1727096422.46498: Calling all_inventory to load vars for managed_node1 24134 1727096422.46500: Calling groups_inventory to load vars for managed_node1 24134 1727096422.46501: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096422.46506: Calling all_plugins_play to load vars for managed_node1 24134 1727096422.46508: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096422.46510: Calling groups_plugins_play to load vars for managed_node1 24134 1727096422.47184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096422.48037: done with get_vars() 24134 1727096422.48050: done getting variables 24134 1727096422.48083: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Monday 23 September 2024 09:00:22 -0400 (0:00:00.467) 0:00:26.694 ****** 24134 1727096422.48100: entering _queue_task() for managed_node1/gather_facts 24134 1727096422.48348: worker is 1 (out of 1 available) 24134 1727096422.48361: exiting _queue_task() for managed_node1/gather_facts 24134 1727096422.48376: done queuing things up, now waiting for results queue to drain 24134 1727096422.48378: waiting for pending results... 
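The `_low_level_execute_command()` sequence earlier in this trace follows a fixed four-phase pattern: create a private remote temp directory, transfer the AnsiballZ payload over SFTP, `chmod u+x` the directory and payload, then remove the directory after the module runs. A hedged sketch of those phases, run locally instead of over SSH (this is not Ansible's actual code; the base path and directory name below are illustrative stand-ins for the `ansible-tmp-…` paths in the log):

```shell
# Phase sketch of the _low_level_execute_command() steps seen above.
base="${TMPDIR:-/tmp}/ansible-sketch"
name="ansible-tmp-$(date +%s)-$$"
# 1. private temp dir: umask 77 yields mode 700, matching the log's
#    '( umask 77 && mkdir -p ... && mkdir ... )' command
/bin/sh -c '( umask 77 && mkdir -p "'"$base"'" && mkdir "'"$base"'/'"$name"'" ) && sleep 0'
# 2. "transfer" the module payload (a stand-in file; the real run uses SFTP)
printf '%s\n' 'print("pong")' > "$base/$name/AnsiballZ_ping.py"
# 3. make the directory and payload executable by the owner
/bin/sh -c 'chmod u+x "'"$base"'/'"$name"'" "'"$base"'/'"$name"'/AnsiballZ_ping.py" && sleep 0'
# 4. cleanup, mirroring the log's 'rm -f -r ... > /dev/null 2>&1' step
/bin/sh -c 'rm -f -r "'"$base"'/'"$name"'" > /dev/null 2>&1 && sleep 0'
```

The trailing `&& sleep 0` on each command matches the form in the trace; it forces the shell to report the compound command's exit status cleanly.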
24134 1727096422.48540: running TaskExecutor() for managed_node1/TASK: Gathering Facts 24134 1727096422.48611: in run() - task 0afff68d-5257-1673-d3fc-000000000521 24134 1727096422.48617: variable 'ansible_search_path' from source: unknown 24134 1727096422.48644: calling self._execute() 24134 1727096422.48723: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096422.48728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096422.48736: variable 'omit' from source: magic vars 24134 1727096422.49013: variable 'ansible_distribution_major_version' from source: facts 24134 1727096422.49023: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096422.49028: variable 'omit' from source: magic vars 24134 1727096422.49051: variable 'omit' from source: magic vars 24134 1727096422.49079: variable 'omit' from source: magic vars 24134 1727096422.49109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096422.49135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096422.49155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096422.49170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096422.49180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096422.49203: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096422.49207: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096422.49209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096422.49282: Set connection var ansible_shell_executable to /bin/sh 24134 1727096422.49286: Set 
connection var ansible_pipelining to False 24134 1727096422.49292: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096422.49299: Set connection var ansible_timeout to 10 24134 1727096422.49302: Set connection var ansible_connection to ssh 24134 1727096422.49304: Set connection var ansible_shell_type to sh 24134 1727096422.49320: variable 'ansible_shell_executable' from source: unknown 24134 1727096422.49323: variable 'ansible_connection' from source: unknown 24134 1727096422.49325: variable 'ansible_module_compression' from source: unknown 24134 1727096422.49328: variable 'ansible_shell_type' from source: unknown 24134 1727096422.49330: variable 'ansible_shell_executable' from source: unknown 24134 1727096422.49332: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096422.49334: variable 'ansible_pipelining' from source: unknown 24134 1727096422.49337: variable 'ansible_timeout' from source: unknown 24134 1727096422.49342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096422.49473: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096422.49486: variable 'omit' from source: magic vars 24134 1727096422.49490: starting attempt loop 24134 1727096422.49493: running the handler 24134 1727096422.49501: variable 'ansible_facts' from source: unknown 24134 1727096422.49518: _low_level_execute_command(): starting 24134 1727096422.49524: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096422.50032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 
1727096422.50036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.50039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.50094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096422.50098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.50196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.51900: stdout chunk (state=3): >>>/root <<< 24134 1727096422.51998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096422.52025: stderr chunk (state=3): >>><<< 24134 1727096422.52028: stdout chunk (state=3): >>><<< 24134 1727096422.52048: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096422.52059: _low_level_execute_command(): starting 24134 1727096422.52065: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986 `" && echo ansible-tmp-1727096422.5204754-25458-193747031660986="` echo /root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986 `" ) && sleep 0' 24134 1727096422.52493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096422.52496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096422.52499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.52502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096422.52513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.52553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096422.52557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.52631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.54556: stdout chunk (state=3): >>>ansible-tmp-1727096422.5204754-25458-193747031660986=/root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986 <<< 24134 1727096422.54666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096422.54699: stderr chunk (state=3): >>><<< 24134 1727096422.54702: stdout chunk (state=3): >>><<< 24134 1727096422.54717: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096422.5204754-25458-193747031660986=/root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096422.54742: variable 'ansible_module_compression' from source: unknown 24134 1727096422.54791: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24134 1727096422.54838: variable 'ansible_facts' from source: unknown 24134 1727096422.54972: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/AnsiballZ_setup.py 24134 1727096422.55079: Sending initial data 24134 1727096422.55082: Sent initial data (154 bytes) 24134 1727096422.55712: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096422.55745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096422.55761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096422.55792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.55893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.57537: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24134 1727096422.57547: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096422.57605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096422.57670: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmp1re8_6_b /root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/AnsiballZ_setup.py <<< 24134 1727096422.57676: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/AnsiballZ_setup.py" <<< 24134 1727096422.57733: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmp1re8_6_b" to remote "/root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/AnsiballZ_setup.py" <<< 24134 1727096422.59209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096422.59375: stderr chunk (state=3): >>><<< 24134 1727096422.59378: stdout chunk (state=3): >>><<< 24134 1727096422.59381: done transferring module to remote 24134 1727096422.59383: _low_level_execute_command(): starting 24134 1727096422.59386: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/ /root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/AnsiballZ_setup.py && sleep 0' 24134 1727096422.59971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096422.59978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096422.59995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096422.60007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.60099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096422.61990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096422.62013: stderr chunk (state=3): >>><<< 24134 1727096422.62016: stdout chunk (state=3): >>><<< 24134 1727096422.62108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096422.62111: _low_level_execute_command(): starting 24134 1727096422.62114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/AnsiballZ_setup.py && sleep 0' 24134 1727096422.62657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096422.62683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096422.62700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096422.62806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096422.62820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096422.62841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096422.62938: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096423.31303: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": 
"/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": 
"0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:19<<< 24134 1727096423.31331: stdout chunk (state=3): >>>2M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": 
[], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 576, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795151872, "block_size": 4096, "block_total": 65519099, "block_available": 63914832, "block_used": 1604267, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_interfaces": ["peerethtest0", "lo", "eth0", "ethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on 
[fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "of<<< 24134 1727096423.31387: stdout chunk (state=3): >>>f [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "42:d5:59:35:93:c8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40d5:59ff:fe35:93c8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off 
[fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "72:05:50:f0:ff:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7005:50ff:fef0:ffb8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": 
["fe80::40d5:59ff:fe35:93c8", "fe80::10ff:acff:fe3f:90f5", "fe80::7005:50ff:fef0:ffb8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::40d5:59ff:fe35:93c8", "fe80::7005:50ff:fef0:ffb8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "23", "epoch": "1727096423", "epoch_int": "1727096423", "date": "2024-09-23", "time": "09:00:23", "iso8601_micro": "2024-09-23T13:00:23.309066Z", "iso8601": "2024-09-23T13:00:23Z", "iso8601_basic": "20240923T090023309066", "iso8601_basic_short": "20240923T090023", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.736328125, "5m": 0.49169921875, "15m": 0.24853515625}, "ansible_lsb": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24134 1727096423.33362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096423.33390: stderr chunk (state=3): >>><<< 24134 1727096423.33393: stdout chunk (state=3): >>><<< 24134 1727096423.33434: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": 
{"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], 
"labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 576, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795151872, "block_size": 4096, "block_total": 65519099, "block_available": 63914832, "block_used": 1604267, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_interfaces": ["peerethtest0", "lo", "eth0", "ethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "42:d5:59:35:93:c8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40d5:59ff:fe35:93c8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off 
[fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": 
"off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "72:05:50:f0:ff:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7005:50ff:fef0:ffb8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": 
"on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::40d5:59ff:fe35:93c8", "fe80::10ff:acff:fe3f:90f5", 
"fe80::7005:50ff:fef0:ffb8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::40d5:59ff:fe35:93c8", "fe80::7005:50ff:fef0:ffb8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "23", "epoch": "1727096423", "epoch_int": "1727096423", "date": "2024-09-23", "time": "09:00:23", "iso8601_micro": "2024-09-23T13:00:23.309066Z", "iso8601": "2024-09-23T13:00:23Z", "iso8601_basic": "20240923T090023309066", "iso8601_basic_short": "20240923T090023", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.736328125, "5m": 0.49169921875, "15m": 0.24853515625}, "ansible_lsb": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096423.33841: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096423.33845: _low_level_execute_command(): starting 24134 1727096423.33848: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096422.5204754-25458-193747031660986/ > /dev/null 2>&1 && sleep 0' 24134 1727096423.34357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096423.34462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096423.34481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096423.34501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096423.34600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096423.36480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096423.36504: stderr chunk (state=3): >>><<< 24134 1727096423.36507: stdout chunk (state=3): >>><<< 24134 1727096423.36519: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096423.36526: handler run complete 24134 1727096423.36619: variable 'ansible_facts' from source: unknown 24134 1727096423.36692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.36898: variable 'ansible_facts' from source: unknown 24134 1727096423.36958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.37058: attempt loop complete, returning result 24134 1727096423.37062: _execute() done 24134 1727096423.37064: dumping result to json 24134 1727096423.37091: done dumping result, returning 24134 1727096423.37098: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-1673-d3fc-000000000521] 24134 1727096423.37104: sending task result for task 0afff68d-5257-1673-d3fc-000000000521 24134 1727096423.37547: done sending task result for task 0afff68d-5257-1673-d3fc-000000000521 24134 1727096423.37550: WORKER PROCESS EXITING ok: [managed_node1] 24134 1727096423.37800: no more pending results, returning what we have 24134 1727096423.37802: results queue empty 24134 1727096423.37803: checking for any_errors_fatal 24134 1727096423.37804: done checking for any_errors_fatal 24134 1727096423.37804: checking for max_fail_percentage 24134 1727096423.37805: done checking for max_fail_percentage 24134 1727096423.37806: checking to see if all hosts have failed and the running result is not ok 24134 1727096423.37806: done checking 
to see if all hosts have failed 24134 1727096423.37807: getting the remaining hosts for this loop 24134 1727096423.37808: done getting the remaining hosts for this loop 24134 1727096423.37810: getting the next task for host managed_node1 24134 1727096423.37814: done getting next task for host managed_node1 24134 1727096423.37815: ^ task is: TASK: meta (flush_handlers) 24134 1727096423.37816: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096423.37819: getting variables 24134 1727096423.37820: in VariableManager get_vars() 24134 1727096423.37845: Calling all_inventory to load vars for managed_node1 24134 1727096423.37847: Calling groups_inventory to load vars for managed_node1 24134 1727096423.37848: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096423.37856: Calling all_plugins_play to load vars for managed_node1 24134 1727096423.37857: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096423.37859: Calling groups_plugins_play to load vars for managed_node1 24134 1727096423.38653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.39542: done with get_vars() 24134 1727096423.39560: done getting variables 24134 1727096423.39616: in VariableManager get_vars() 24134 1727096423.39625: Calling all_inventory to load vars for managed_node1 24134 1727096423.39627: Calling groups_inventory to load vars for managed_node1 24134 1727096423.39628: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096423.39632: Calling all_plugins_play to load vars for managed_node1 24134 1727096423.39633: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096423.39635: 
Calling groups_plugins_play to load vars for managed_node1 24134 1727096423.40293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.41255: done with get_vars() 24134 1727096423.41278: done queuing things up, now waiting for results queue to drain 24134 1727096423.41280: results queue empty 24134 1727096423.41281: checking for any_errors_fatal 24134 1727096423.41283: done checking for any_errors_fatal 24134 1727096423.41284: checking for max_fail_percentage 24134 1727096423.41285: done checking for max_fail_percentage 24134 1727096423.41285: checking to see if all hosts have failed and the running result is not ok 24134 1727096423.41290: done checking to see if all hosts have failed 24134 1727096423.41291: getting the remaining hosts for this loop 24134 1727096423.41291: done getting the remaining hosts for this loop 24134 1727096423.41294: getting the next task for host managed_node1 24134 1727096423.41296: done getting next task for host managed_node1 24134 1727096423.41299: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24134 1727096423.41300: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096423.41309: getting variables 24134 1727096423.41310: in VariableManager get_vars() 24134 1727096423.41320: Calling all_inventory to load vars for managed_node1 24134 1727096423.41321: Calling groups_inventory to load vars for managed_node1 24134 1727096423.41322: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096423.41326: Calling all_plugins_play to load vars for managed_node1 24134 1727096423.41327: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096423.41329: Calling groups_plugins_play to load vars for managed_node1 24134 1727096423.41981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.42840: done with get_vars() 24134 1727096423.42854: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:00:23 -0400 (0:00:00.948) 0:00:27.642 ****** 24134 1727096423.42912: entering _queue_task() for managed_node1/include_tasks 24134 1727096423.43184: worker is 1 (out of 1 available) 24134 1727096423.43197: exiting _queue_task() for managed_node1/include_tasks 24134 1727096423.43209: done queuing things up, now waiting for results queue to drain 24134 1727096423.43210: waiting for pending results... 
24134 1727096423.43386: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24134 1727096423.43456: in run() - task 0afff68d-5257-1673-d3fc-000000000084 24134 1727096423.43473: variable 'ansible_search_path' from source: unknown 24134 1727096423.43476: variable 'ansible_search_path' from source: unknown 24134 1727096423.43500: calling self._execute() 24134 1727096423.43576: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096423.43582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096423.43591: variable 'omit' from source: magic vars 24134 1727096423.43865: variable 'ansible_distribution_major_version' from source: facts 24134 1727096423.43877: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096423.43888: _execute() done 24134 1727096423.43891: dumping result to json 24134 1727096423.43894: done dumping result, returning 24134 1727096423.43902: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-1673-d3fc-000000000084] 24134 1727096423.43907: sending task result for task 0afff68d-5257-1673-d3fc-000000000084 24134 1727096423.43994: done sending task result for task 0afff68d-5257-1673-d3fc-000000000084 24134 1727096423.43997: WORKER PROCESS EXITING 24134 1727096423.44034: no more pending results, returning what we have 24134 1727096423.44038: in VariableManager get_vars() 24134 1727096423.44079: Calling all_inventory to load vars for managed_node1 24134 1727096423.44082: Calling groups_inventory to load vars for managed_node1 24134 1727096423.44084: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096423.44096: Calling all_plugins_play to load vars for managed_node1 24134 1727096423.44098: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096423.44100: Calling 
groups_plugins_play to load vars for managed_node1 24134 1727096423.44987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.45859: done with get_vars() 24134 1727096423.45877: variable 'ansible_search_path' from source: unknown 24134 1727096423.45878: variable 'ansible_search_path' from source: unknown 24134 1727096423.45898: we have included files to process 24134 1727096423.45899: generating all_blocks data 24134 1727096423.45900: done generating all_blocks data 24134 1727096423.45900: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24134 1727096423.45901: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24134 1727096423.45902: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24134 1727096423.46281: done processing included file 24134 1727096423.46282: iterating over new_blocks loaded from include file 24134 1727096423.46283: in VariableManager get_vars() 24134 1727096423.46296: done with get_vars() 24134 1727096423.46297: filtering new block on tags 24134 1727096423.46307: done filtering new block on tags 24134 1727096423.46308: in VariableManager get_vars() 24134 1727096423.46318: done with get_vars() 24134 1727096423.46319: filtering new block on tags 24134 1727096423.46329: done filtering new block on tags 24134 1727096423.46331: in VariableManager get_vars() 24134 1727096423.46341: done with get_vars() 24134 1727096423.46342: filtering new block on tags 24134 1727096423.46351: done filtering new block on tags 24134 1727096423.46352: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 24134 1727096423.46356: extending task lists for 
all hosts with included blocks 24134 1727096423.46557: done extending task lists 24134 1727096423.46558: done processing included files 24134 1727096423.46558: results queue empty 24134 1727096423.46558: checking for any_errors_fatal 24134 1727096423.46559: done checking for any_errors_fatal 24134 1727096423.46560: checking for max_fail_percentage 24134 1727096423.46560: done checking for max_fail_percentage 24134 1727096423.46561: checking to see if all hosts have failed and the running result is not ok 24134 1727096423.46562: done checking to see if all hosts have failed 24134 1727096423.46562: getting the remaining hosts for this loop 24134 1727096423.46563: done getting the remaining hosts for this loop 24134 1727096423.46565: getting the next task for host managed_node1 24134 1727096423.46569: done getting next task for host managed_node1 24134 1727096423.46572: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24134 1727096423.46574: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096423.46581: getting variables 24134 1727096423.46582: in VariableManager get_vars() 24134 1727096423.46590: Calling all_inventory to load vars for managed_node1 24134 1727096423.46592: Calling groups_inventory to load vars for managed_node1 24134 1727096423.46593: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096423.46596: Calling all_plugins_play to load vars for managed_node1 24134 1727096423.46598: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096423.46600: Calling groups_plugins_play to load vars for managed_node1 24134 1727096423.47255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.48110: done with get_vars() 24134 1727096423.48124: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:00:23 -0400 (0:00:00.052) 0:00:27.695 ****** 24134 1727096423.48172: entering _queue_task() for managed_node1/setup 24134 1727096423.48423: worker is 1 (out of 1 available) 24134 1727096423.48436: exiting _queue_task() for managed_node1/setup 24134 1727096423.48447: done queuing things up, now waiting for results queue to drain 24134 1727096423.48448: waiting for pending results... 
24134 1727096423.48623: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24134 1727096423.48714: in run() - task 0afff68d-5257-1673-d3fc-000000000562 24134 1727096423.48725: variable 'ansible_search_path' from source: unknown 24134 1727096423.48729: variable 'ansible_search_path' from source: unknown 24134 1727096423.48754: calling self._execute() 24134 1727096423.48826: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096423.48832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096423.48840: variable 'omit' from source: magic vars 24134 1727096423.49118: variable 'ansible_distribution_major_version' from source: facts 24134 1727096423.49128: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096423.49273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096423.50800: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096423.50842: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096423.50873: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096423.50901: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096423.50921: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096423.50984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096423.51004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096423.51021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096423.51047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096423.51058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096423.51102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096423.51117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096423.51134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096423.51158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096423.51170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096423.51292: variable '__network_required_facts' from source: role 
'' defaults 24134 1727096423.51295: variable 'ansible_facts' from source: unknown 24134 1727096423.51726: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24134 1727096423.51733: when evaluation is False, skipping this task 24134 1727096423.51736: _execute() done 24134 1727096423.51738: dumping result to json 24134 1727096423.51741: done dumping result, returning 24134 1727096423.51749: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-1673-d3fc-000000000562] 24134 1727096423.51754: sending task result for task 0afff68d-5257-1673-d3fc-000000000562 24134 1727096423.51835: done sending task result for task 0afff68d-5257-1673-d3fc-000000000562 24134 1727096423.51837: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096423.51882: no more pending results, returning what we have 24134 1727096423.51886: results queue empty 24134 1727096423.51887: checking for any_errors_fatal 24134 1727096423.51889: done checking for any_errors_fatal 24134 1727096423.51889: checking for max_fail_percentage 24134 1727096423.51891: done checking for max_fail_percentage 24134 1727096423.51892: checking to see if all hosts have failed and the running result is not ok 24134 1727096423.51892: done checking to see if all hosts have failed 24134 1727096423.51893: getting the remaining hosts for this loop 24134 1727096423.51894: done getting the remaining hosts for this loop 24134 1727096423.51898: getting the next task for host managed_node1 24134 1727096423.51905: done getting next task for host managed_node1 24134 1727096423.51908: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24134 1727096423.51910: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096423.51925: getting variables 24134 1727096423.51926: in VariableManager get_vars() 24134 1727096423.51962: Calling all_inventory to load vars for managed_node1 24134 1727096423.51965: Calling groups_inventory to load vars for managed_node1 24134 1727096423.51971: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096423.51981: Calling all_plugins_play to load vars for managed_node1 24134 1727096423.51984: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096423.51986: Calling groups_plugins_play to load vars for managed_node1 24134 1727096423.52865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.53751: done with get_vars() 24134 1727096423.53771: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:00:23 -0400 (0:00:00.056) 0:00:27.751 ****** 24134 1727096423.53841: entering _queue_task() for managed_node1/stat 24134 1727096423.54101: worker is 1 (out of 1 available) 24134 1727096423.54117: exiting _queue_task() for managed_node1/stat 24134 1727096423.54129: done queuing things up, now waiting for results queue to drain 24134 1727096423.54130: waiting for pending results... 
24134 1727096423.54298: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 24134 1727096423.54379: in run() - task 0afff68d-5257-1673-d3fc-000000000564 24134 1727096423.54391: variable 'ansible_search_path' from source: unknown 24134 1727096423.54395: variable 'ansible_search_path' from source: unknown 24134 1727096423.54419: calling self._execute() 24134 1727096423.54493: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096423.54497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096423.54506: variable 'omit' from source: magic vars 24134 1727096423.54775: variable 'ansible_distribution_major_version' from source: facts 24134 1727096423.54783: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096423.54901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096423.55096: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096423.55132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096423.55157: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096423.55183: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096423.55247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096423.55265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096423.55286: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096423.55303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096423.55365: variable '__network_is_ostree' from source: set_fact 24134 1727096423.55374: Evaluated conditional (not __network_is_ostree is defined): False 24134 1727096423.55377: when evaluation is False, skipping this task 24134 1727096423.55380: _execute() done 24134 1727096423.55382: dumping result to json 24134 1727096423.55385: done dumping result, returning 24134 1727096423.55392: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-1673-d3fc-000000000564] 24134 1727096423.55397: sending task result for task 0afff68d-5257-1673-d3fc-000000000564 24134 1727096423.55479: done sending task result for task 0afff68d-5257-1673-d3fc-000000000564 24134 1727096423.55482: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24134 1727096423.55529: no more pending results, returning what we have 24134 1727096423.55532: results queue empty 24134 1727096423.55533: checking for any_errors_fatal 24134 1727096423.55539: done checking for any_errors_fatal 24134 1727096423.55540: checking for max_fail_percentage 24134 1727096423.55541: done checking for max_fail_percentage 24134 1727096423.55542: checking to see if all hosts have failed and the running result is not ok 24134 1727096423.55543: done checking to see if all hosts have failed 24134 1727096423.55543: getting the remaining hosts for this loop 24134 1727096423.55545: done getting the remaining hosts for this loop 24134 
1727096423.55548: getting the next task for host managed_node1 24134 1727096423.55555: done getting next task for host managed_node1 24134 1727096423.55558: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24134 1727096423.55561: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096423.55579: getting variables 24134 1727096423.55581: in VariableManager get_vars() 24134 1727096423.55616: Calling all_inventory to load vars for managed_node1 24134 1727096423.55619: Calling groups_inventory to load vars for managed_node1 24134 1727096423.55621: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096423.55629: Calling all_plugins_play to load vars for managed_node1 24134 1727096423.55632: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096423.55634: Calling groups_plugins_play to load vars for managed_node1 24134 1727096423.56425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.57488: done with get_vars() 24134 1727096423.57504: done getting variables 24134 1727096423.57547: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:00:23 -0400 (0:00:00.037) 0:00:27.789 ****** 24134 1727096423.57575: entering _queue_task() for managed_node1/set_fact 24134 1727096423.57819: worker is 1 (out of 1 available) 24134 1727096423.57834: exiting _queue_task() for managed_node1/set_fact 24134 1727096423.57848: done queuing things up, now waiting for results queue to drain 24134 1727096423.57849: waiting for pending results... 24134 1727096423.58019: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24134 1727096423.58104: in run() - task 0afff68d-5257-1673-d3fc-000000000565 24134 1727096423.58115: variable 'ansible_search_path' from source: unknown 24134 1727096423.58119: variable 'ansible_search_path' from source: unknown 24134 1727096423.58144: calling self._execute() 24134 1727096423.58214: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096423.58218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096423.58227: variable 'omit' from source: magic vars 24134 1727096423.58673: variable 'ansible_distribution_major_version' from source: facts 24134 1727096423.58677: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096423.58796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096423.59134: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096423.59138: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096423.59176: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 
1727096423.59214: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096423.59317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096423.59361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096423.59394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096423.59424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096423.59520: variable '__network_is_ostree' from source: set_fact 24134 1727096423.59573: Evaluated conditional (not __network_is_ostree is defined): False 24134 1727096423.59577: when evaluation is False, skipping this task 24134 1727096423.59579: _execute() done 24134 1727096423.59581: dumping result to json 24134 1727096423.59583: done dumping result, returning 24134 1727096423.59586: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-1673-d3fc-000000000565] 24134 1727096423.59588: sending task result for task 0afff68d-5257-1673-d3fc-000000000565 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24134 1727096423.59717: no more pending results, returning what we have 24134 1727096423.59721: results queue empty 24134 1727096423.59723: checking for any_errors_fatal 24134 1727096423.59730: done checking 
for any_errors_fatal 24134 1727096423.59731: checking for max_fail_percentage 24134 1727096423.59733: done checking for max_fail_percentage 24134 1727096423.59734: checking to see if all hosts have failed and the running result is not ok 24134 1727096423.59734: done checking to see if all hosts have failed 24134 1727096423.59735: getting the remaining hosts for this loop 24134 1727096423.59736: done getting the remaining hosts for this loop 24134 1727096423.59740: getting the next task for host managed_node1 24134 1727096423.59750: done getting next task for host managed_node1 24134 1727096423.59753: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24134 1727096423.59756: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096423.59775: getting variables 24134 1727096423.59777: in VariableManager get_vars() 24134 1727096423.59815: Calling all_inventory to load vars for managed_node1 24134 1727096423.59819: Calling groups_inventory to load vars for managed_node1 24134 1727096423.59821: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096423.59832: Calling all_plugins_play to load vars for managed_node1 24134 1727096423.59835: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096423.59838: Calling groups_plugins_play to load vars for managed_node1 24134 1727096423.60446: done sending task result for task 0afff68d-5257-1673-d3fc-000000000565 24134 1727096423.60450: WORKER PROCESS EXITING 24134 1727096423.60844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096423.61721: done with get_vars() 24134 1727096423.61737: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:00:23 -0400 (0:00:00.042) 0:00:27.831 ****** 24134 1727096423.61808: entering _queue_task() for managed_node1/service_facts 24134 1727096423.62043: worker is 1 (out of 1 available) 24134 1727096423.62057: exiting _queue_task() for managed_node1/service_facts 24134 1727096423.62096: done queuing things up, now waiting for results queue to drain 24134 1727096423.62098: waiting for pending results... 
24134 1727096423.62483: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 24134 1727096423.62489: in run() - task 0afff68d-5257-1673-d3fc-000000000567 24134 1727096423.62491: variable 'ansible_search_path' from source: unknown 24134 1727096423.62494: variable 'ansible_search_path' from source: unknown 24134 1727096423.62496: calling self._execute() 24134 1727096423.62572: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096423.62585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096423.62600: variable 'omit' from source: magic vars 24134 1727096423.62965: variable 'ansible_distribution_major_version' from source: facts 24134 1727096423.62986: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096423.62997: variable 'omit' from source: magic vars 24134 1727096423.63060: variable 'omit' from source: magic vars 24134 1727096423.63100: variable 'omit' from source: magic vars 24134 1727096423.63141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096423.63185: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096423.63209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096423.63232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096423.63247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096423.63286: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096423.63295: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096423.63302: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 24134 1727096423.63407: Set connection var ansible_shell_executable to /bin/sh 24134 1727096423.63418: Set connection var ansible_pipelining to False 24134 1727096423.63427: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096423.63473: Set connection var ansible_timeout to 10 24134 1727096423.63476: Set connection var ansible_connection to ssh 24134 1727096423.63478: Set connection var ansible_shell_type to sh 24134 1727096423.63480: variable 'ansible_shell_executable' from source: unknown 24134 1727096423.63483: variable 'ansible_connection' from source: unknown 24134 1727096423.63489: variable 'ansible_module_compression' from source: unknown 24134 1727096423.63496: variable 'ansible_shell_type' from source: unknown 24134 1727096423.63503: variable 'ansible_shell_executable' from source: unknown 24134 1727096423.63511: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096423.63572: variable 'ansible_pipelining' from source: unknown 24134 1727096423.63578: variable 'ansible_timeout' from source: unknown 24134 1727096423.63583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096423.63726: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096423.63741: variable 'omit' from source: magic vars 24134 1727096423.63750: starting attempt loop 24134 1727096423.63757: running the handler 24134 1727096423.63777: _low_level_execute_command(): starting 24134 1727096423.63788: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096423.64487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096423.64502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 24134 1727096423.64514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096423.64574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096423.64628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096423.64652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096423.64670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096423.64778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096423.66548: stdout chunk (state=3): >>>/root <<< 24134 1727096423.66653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096423.66693: stderr chunk (state=3): >>><<< 24134 1727096423.66697: stdout chunk (state=3): >>><<< 24134 1727096423.66714: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096423.66809: _low_level_execute_command(): starting 24134 1727096423.66813: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646 `" && echo ansible-tmp-1727096423.667222-25499-274392653583646="` echo /root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646 `" ) && sleep 0' 24134 1727096423.67331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096423.67347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096423.67362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096423.67478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096423.67507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096423.67601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096423.69577: stdout chunk (state=3): >>>ansible-tmp-1727096423.667222-25499-274392653583646=/root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646 <<< 24134 1727096423.69702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096423.69727: stderr chunk (state=3): >>><<< 24134 1727096423.69730: stdout chunk (state=3): >>><<< 24134 1727096423.69976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096423.667222-25499-274392653583646=/root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096423.69980: variable 'ansible_module_compression' from source: unknown 24134 1727096423.69983: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 24134 1727096423.69985: variable 'ansible_facts' from source: unknown 24134 1727096423.69992: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/AnsiballZ_service_facts.py 24134 1727096423.70115: Sending initial data 24134 1727096423.70182: Sent initial data (161 bytes) 24134 1727096423.70724: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096423.70746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096423.70763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096423.70804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096423.70817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096423.70886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096423.72515: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096423.72584: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096423.72673: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpjobcm6iz /root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/AnsiballZ_service_facts.py <<< 24134 1727096423.72676: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/AnsiballZ_service_facts.py" <<< 24134 1727096423.72750: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpjobcm6iz" to remote "/root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/AnsiballZ_service_facts.py" <<< 24134 1727096423.73822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096423.73825: stdout chunk (state=3): >>><<< 24134 1727096423.73828: stderr chunk (state=3): >>><<< 24134 1727096423.73830: done transferring module to remote 24134 1727096423.73832: _low_level_execute_command(): starting 24134 1727096423.73834: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/ /root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/AnsiballZ_service_facts.py && sleep 0' 24134 1727096423.74485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096423.74538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096423.74556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096423.74579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096423.74676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096423.76560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096423.76573: stdout chunk (state=3): >>><<< 24134 1727096423.76588: stderr chunk (state=3): >>><<< 24134 1727096423.76615: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096423.76704: _low_level_execute_command(): starting 24134 1727096423.76708: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/AnsiballZ_service_facts.py && sleep 0' 24134 1727096423.77258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096423.77275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096423.77291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096423.77327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096423.77346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096423.77439: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096423.77453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
24134 1727096423.77476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096423.77582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096425.38452: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 24134 1727096425.38500: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": 
"systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": 
{"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24134 1727096425.40094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096425.40106: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. <<< 24134 1727096425.40129: stdout chunk (state=3): >>><<< 24134 1727096425.40132: stderr chunk (state=3): >>><<< 24134 1727096425.40275: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": 
"gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": 
{"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.11.125 closed. 24134 1727096425.41240: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096425.41254: _low_level_execute_command(): starting 24134 1727096425.41263: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096423.667222-25499-274392653583646/ > /dev/null 2>&1 && sleep 0' 24134 1727096425.41908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096425.41984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096425.41987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096425.42057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096425.42076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096425.42107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096425.42201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096425.44154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096425.44182: stderr chunk (state=3): >>><<< 24134 1727096425.44185: stdout chunk (state=3): >>><<< 24134 1727096425.44200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 24134 1727096425.44373: handler run complete 24134 1727096425.44408: variable 'ansible_facts' from source: unknown 24134 1727096425.44562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096425.45088: variable 'ansible_facts' from source: unknown 24134 1727096425.45233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096425.45449: attempt loop complete, returning result 24134 1727096425.45459: _execute() done 24134 1727096425.45477: dumping result to json 24134 1727096425.45544: done dumping result, returning 24134 1727096425.45558: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-1673-d3fc-000000000567] 24134 1727096425.45573: sending task result for task 0afff68d-5257-1673-d3fc-000000000567 24134 1727096425.46999: done sending task result for task 0afff68d-5257-1673-d3fc-000000000567 24134 1727096425.47003: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096425.47115: no more pending results, returning what we have 24134 1727096425.47118: results queue empty 24134 1727096425.47119: checking for any_errors_fatal 24134 1727096425.47122: done checking for any_errors_fatal 24134 1727096425.47123: checking for max_fail_percentage 24134 1727096425.47125: done checking for max_fail_percentage 24134 1727096425.47126: checking to see if all hosts have failed and the running result is not ok 24134 1727096425.47126: done checking to see if all hosts have failed 24134 1727096425.47127: getting the remaining hosts for this loop 24134 1727096425.47128: done getting the remaining hosts for this loop 24134 1727096425.47132: getting the next task for host managed_node1 24134 1727096425.47143: done getting next 
task for host managed_node1 24134 1727096425.47147: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24134 1727096425.47150: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096425.47159: getting variables 24134 1727096425.47160: in VariableManager get_vars() 24134 1727096425.47189: Calling all_inventory to load vars for managed_node1 24134 1727096425.47191: Calling groups_inventory to load vars for managed_node1 24134 1727096425.47194: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096425.47202: Calling all_plugins_play to load vars for managed_node1 24134 1727096425.47205: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096425.47208: Calling groups_plugins_play to load vars for managed_node1 24134 1727096425.48474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096425.50150: done with get_vars() 24134 1727096425.50173: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 09:00:25 -0400 (0:00:01.884) 0:00:29.716 ****** 24134 1727096425.50275: entering _queue_task() for managed_node1/package_facts 24134 1727096425.50608: worker is 1 (out of 1 available) 24134 1727096425.50623: exiting 
_queue_task() for managed_node1/package_facts 24134 1727096425.50636: done queuing things up, now waiting for results queue to drain 24134 1727096425.50637: waiting for pending results... 24134 1727096425.51007: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 24134 1727096425.51059: in run() - task 0afff68d-5257-1673-d3fc-000000000568 24134 1727096425.51080: variable 'ansible_search_path' from source: unknown 24134 1727096425.51087: variable 'ansible_search_path' from source: unknown 24134 1727096425.51133: calling self._execute() 24134 1727096425.51235: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096425.51247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096425.51259: variable 'omit' from source: magic vars 24134 1727096425.51649: variable 'ansible_distribution_major_version' from source: facts 24134 1727096425.51673: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096425.51686: variable 'omit' from source: magic vars 24134 1727096425.51745: variable 'omit' from source: magic vars 24134 1727096425.51796: variable 'omit' from source: magic vars 24134 1727096425.51871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096425.51893: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096425.51919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096425.51943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096425.51977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096425.52005: variable 'inventory_hostname' from source: host vars 
for 'managed_node1' 24134 1727096425.52073: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096425.52076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096425.52139: Set connection var ansible_shell_executable to /bin/sh 24134 1727096425.52151: Set connection var ansible_pipelining to False 24134 1727096425.52161: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096425.52178: Set connection var ansible_timeout to 10 24134 1727096425.52185: Set connection var ansible_connection to ssh 24134 1727096425.52203: Set connection var ansible_shell_type to sh 24134 1727096425.52228: variable 'ansible_shell_executable' from source: unknown 24134 1727096425.52236: variable 'ansible_connection' from source: unknown 24134 1727096425.52243: variable 'ansible_module_compression' from source: unknown 24134 1727096425.52251: variable 'ansible_shell_type' from source: unknown 24134 1727096425.52307: variable 'ansible_shell_executable' from source: unknown 24134 1727096425.52313: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096425.52315: variable 'ansible_pipelining' from source: unknown 24134 1727096425.52317: variable 'ansible_timeout' from source: unknown 24134 1727096425.52319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096425.52501: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096425.52528: variable 'omit' from source: magic vars 24134 1727096425.52542: starting attempt loop 24134 1727096425.52550: running the handler 24134 1727096425.52632: _low_level_execute_command(): starting 24134 1727096425.52635: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 
1727096425.53334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096425.53382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096425.53405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096425.53509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096425.53513: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096425.53542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096425.53655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096425.55403: stdout chunk (state=3): >>>/root <<< 24134 1727096425.55550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096425.55577: stdout chunk (state=3): >>><<< 24134 1727096425.55580: stderr chunk (state=3): >>><<< 24134 1727096425.55597: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096425.55701: _low_level_execute_command(): starting 24134 1727096425.55705: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003 `" && echo ansible-tmp-1727096425.556068-25551-23068066631003="` echo /root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003 `" ) && sleep 0' 24134 1727096425.56360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096425.56394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096425.56501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096425.58510: stdout chunk (state=3): >>>ansible-tmp-1727096425.556068-25551-23068066631003=/root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003 <<< 24134 1727096425.58671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096425.58674: stdout chunk (state=3): >>><<< 24134 1727096425.58677: stderr chunk (state=3): >>><<< 24134 1727096425.58774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096425.556068-25551-23068066631003=/root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096425.58777: variable 'ansible_module_compression' from source: unknown 24134 1727096425.58807: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 24134 1727096425.58872: variable 'ansible_facts' from source: unknown 24134 1727096425.59085: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/AnsiballZ_package_facts.py 24134 1727096425.59293: Sending initial data 24134 1727096425.59296: Sent initial data (160 bytes) 24134 1727096425.59906: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096425.59979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096425.60041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096425.60096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096425.60161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096425.61804: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24134 1727096425.61813: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096425.61870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096425.61938: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpk4bgmmyc /root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/AnsiballZ_package_facts.py <<< 24134 1727096425.61941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/AnsiballZ_package_facts.py" <<< 24134 1727096425.61998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpk4bgmmyc" to remote "/root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/AnsiballZ_package_facts.py" <<< 24134 1727096425.63700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096425.63703: stderr chunk (state=3): >>><<< 24134 1727096425.63706: stdout chunk (state=3): >>><<< 24134 1727096425.63708: done transferring module to remote 24134 1727096425.63710: _low_level_execute_command(): starting 24134 1727096425.63712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/ /root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/AnsiballZ_package_facts.py && sleep 0' 24134 1727096425.64284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096425.64320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096425.64341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096425.64358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096425.64479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096425.66366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096425.66373: stdout chunk (state=3): >>><<< 24134 1727096425.66380: stderr chunk (state=3): >>><<< 24134 1727096425.66488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096425.66499: _low_level_execute_command(): starting 24134 1727096425.66503: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/AnsiballZ_package_facts.py && sleep 0' 24134 1727096425.67124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096425.67128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096425.67170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096425.67207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096425.67279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 
1727096426.12120: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": 
[{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", 
"release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", 
"version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": 
[{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": 
"iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 24134 1727096426.12142: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": 
"10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": 
"4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": 
"rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": 
"perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", 
"version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": 
"1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24134 1727096426.13984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096426.13988: stdout chunk (state=3): >>><<< 24134 1727096426.13994: stderr chunk (state=3): >>><<< 24134 1727096426.14023: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", 
"version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": 
"libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", 
"version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": 
"20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": 
"tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": 
[{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": 
"1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": 
"127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": 
"prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": 
"perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": 
"perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": 
[{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": 
"python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": 
"0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096426.18475: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096426.18479: _low_level_execute_command(): starting 24134 1727096426.18482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096425.556068-25551-23068066631003/ > /dev/null 2>&1 && sleep 0' 24134 1727096426.19773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096426.19777: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096426.19781: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096426.19784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096426.19787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096426.19789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096426.19826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096426.19829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096426.19844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096426.20041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096426.21988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096426.21991: stderr chunk (state=3): >>><<< 24134 1727096426.21996: stdout chunk (state=3): >>><<< 24134 1727096426.22013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096426.22019: handler run complete 24134 1727096426.24374: variable 'ansible_facts' from source: unknown 24134 1727096426.25032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096426.41672: variable 'ansible_facts' from source: unknown 24134 1727096426.42601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096426.43947: attempt loop complete, returning result 24134 1727096426.44080: _execute() done 24134 1727096426.44086: dumping result to json 24134 1727096426.44514: done dumping result, returning 24134 1727096426.44715: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-1673-d3fc-000000000568] 24134 1727096426.44718: sending task result for task 0afff68d-5257-1673-d3fc-000000000568 24134 1727096426.48876: done sending task result for task 0afff68d-5257-1673-d3fc-000000000568 24134 1727096426.48880: WORKER PROCESS EXITING ok: [managed_node1] 
=> { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096426.49048: no more pending results, returning what we have 24134 1727096426.49051: results queue empty 24134 1727096426.49052: checking for any_errors_fatal 24134 1727096426.49057: done checking for any_errors_fatal 24134 1727096426.49058: checking for max_fail_percentage 24134 1727096426.49060: done checking for max_fail_percentage 24134 1727096426.49060: checking to see if all hosts have failed and the running result is not ok 24134 1727096426.49061: done checking to see if all hosts have failed 24134 1727096426.49062: getting the remaining hosts for this loop 24134 1727096426.49063: done getting the remaining hosts for this loop 24134 1727096426.49072: getting the next task for host managed_node1 24134 1727096426.49079: done getting next task for host managed_node1 24134 1727096426.49083: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24134 1727096426.49085: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096426.49093: getting variables 24134 1727096426.49095: in VariableManager get_vars() 24134 1727096426.49121: Calling all_inventory to load vars for managed_node1 24134 1727096426.49124: Calling groups_inventory to load vars for managed_node1 24134 1727096426.49126: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096426.49133: Calling all_plugins_play to load vars for managed_node1 24134 1727096426.49136: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096426.49138: Calling groups_plugins_play to load vars for managed_node1 24134 1727096426.57651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096426.59588: done with get_vars() 24134 1727096426.59620: done getting variables 24134 1727096426.59677: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:00:26 -0400 (0:00:01.094) 0:00:30.810 ****** 24134 1727096426.59705: entering _queue_task() for managed_node1/debug 24134 1727096426.60076: worker is 1 (out of 1 available) 24134 1727096426.60091: exiting _queue_task() for managed_node1/debug 24134 1727096426.60108: done queuing things up, now waiting for results queue to drain 24134 1727096426.60110: waiting for pending results... 
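
The task just queued above is the role's "Print network provider" debug task (task path `roles/network/tasks/main.yml:7`). A minimal sketch of what such a task likely looks like, reconstructed from the log (the variable name `network_provider` and the "Using network provider: nm" message appear later in the log; the exact file contents are an assumption, not a copy of the role source):

```yaml
# Hypothetical reconstruction of the task at
# roles/network/tasks/main.yml:7 — inferred from the log output,
# not taken from the actual fedora.linux_system_roles source.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```
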
24134 1727096426.60357: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 24134 1727096426.60492: in run() - task 0afff68d-5257-1673-d3fc-000000000085 24134 1727096426.60516: variable 'ansible_search_path' from source: unknown 24134 1727096426.60524: variable 'ansible_search_path' from source: unknown 24134 1727096426.60565: calling self._execute() 24134 1727096426.60678: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096426.60712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096426.60716: variable 'omit' from source: magic vars 24134 1727096426.61119: variable 'ansible_distribution_major_version' from source: facts 24134 1727096426.61148: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096426.61154: variable 'omit' from source: magic vars 24134 1727096426.61263: variable 'omit' from source: magic vars 24134 1727096426.61477: variable 'network_provider' from source: set_fact 24134 1727096426.61487: variable 'omit' from source: magic vars 24134 1727096426.61675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096426.61713: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096426.61740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096426.61827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096426.61846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096426.61886: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096426.61922: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 
1727096426.61932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096426.62127: Set connection var ansible_shell_executable to /bin/sh 24134 1727096426.62194: Set connection var ansible_pipelining to False 24134 1727096426.62205: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096426.62218: Set connection var ansible_timeout to 10 24134 1727096426.62403: Set connection var ansible_connection to ssh 24134 1727096426.62406: Set connection var ansible_shell_type to sh 24134 1727096426.62408: variable 'ansible_shell_executable' from source: unknown 24134 1727096426.62410: variable 'ansible_connection' from source: unknown 24134 1727096426.62412: variable 'ansible_module_compression' from source: unknown 24134 1727096426.62414: variable 'ansible_shell_type' from source: unknown 24134 1727096426.62416: variable 'ansible_shell_executable' from source: unknown 24134 1727096426.62418: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096426.62420: variable 'ansible_pipelining' from source: unknown 24134 1727096426.62423: variable 'ansible_timeout' from source: unknown 24134 1727096426.62424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096426.62675: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096426.62679: variable 'omit' from source: magic vars 24134 1727096426.62681: starting attempt loop 24134 1727096426.62683: running the handler 24134 1727096426.62865: handler run complete 24134 1727096426.62873: attempt loop complete, returning result 24134 1727096426.62876: _execute() done 24134 1727096426.62884: dumping result to json 24134 1727096426.62893: done dumping result, returning 
24134 1727096426.62906: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-1673-d3fc-000000000085] 24134 1727096426.62915: sending task result for task 0afff68d-5257-1673-d3fc-000000000085 24134 1727096426.63175: done sending task result for task 0afff68d-5257-1673-d3fc-000000000085 24134 1727096426.63179: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 24134 1727096426.63240: no more pending results, returning what we have 24134 1727096426.63244: results queue empty 24134 1727096426.63245: checking for any_errors_fatal 24134 1727096426.63258: done checking for any_errors_fatal 24134 1727096426.63258: checking for max_fail_percentage 24134 1727096426.63261: done checking for max_fail_percentage 24134 1727096426.63262: checking to see if all hosts have failed and the running result is not ok 24134 1727096426.63263: done checking to see if all hosts have failed 24134 1727096426.63263: getting the remaining hosts for this loop 24134 1727096426.63265: done getting the remaining hosts for this loop 24134 1727096426.63273: getting the next task for host managed_node1 24134 1727096426.63279: done getting next task for host managed_node1 24134 1727096426.63283: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24134 1727096426.63285: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096426.63296: getting variables 24134 1727096426.63298: in VariableManager get_vars() 24134 1727096426.63337: Calling all_inventory to load vars for managed_node1 24134 1727096426.63340: Calling groups_inventory to load vars for managed_node1 24134 1727096426.63343: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096426.63353: Calling all_plugins_play to load vars for managed_node1 24134 1727096426.63356: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096426.63360: Calling groups_plugins_play to load vars for managed_node1 24134 1727096426.66524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096426.70031: done with get_vars() 24134 1727096426.70060: done getting variables 24134 1727096426.70280: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:00:26 -0400 (0:00:00.106) 0:00:30.916 ****** 24134 1727096426.70315: entering _queue_task() for managed_node1/fail 24134 1727096426.71051: worker is 1 (out of 1 available) 24134 1727096426.71065: exiting _queue_task() for managed_node1/fail 24134 1727096426.71180: done queuing things up, now waiting for results queue to drain 24134 1727096426.71182: waiting for pending results... 
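
The `fail` task queued above is a guard that is skipped in this run because its `when` condition evaluates false (the log records `"false_condition": "network_state != {}"`). A hedged sketch of the likely shape of that task, assuming the second condition from its name (only the `network_state != {}` condition is confirmed by the log):

```yaml
# Hypothetical sketch of the guard at roles/network/tasks/main.yml:11.
# The `network_state != {}` condition is taken from the log's
# false_condition field; the provider check is an assumption based
# on the task name.
- name: Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  fail:
    msg: Applying the network state configuration is not supported
      with the initscripts provider
  when:
    - network_state != {}
    - network_provider == "initscripts"
```

When every item in the `when` list is true, `fail` aborts the play for that host; here the empty `network_state` short-circuits the task to "skipping", exactly as the result below shows.
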
24134 1727096426.71554: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24134 1727096426.72031: in run() - task 0afff68d-5257-1673-d3fc-000000000086 24134 1727096426.72035: variable 'ansible_search_path' from source: unknown 24134 1727096426.72037: variable 'ansible_search_path' from source: unknown 24134 1727096426.72041: calling self._execute() 24134 1727096426.72083: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096426.72149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096426.72163: variable 'omit' from source: magic vars 24134 1727096426.72935: variable 'ansible_distribution_major_version' from source: facts 24134 1727096426.73022: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096426.73257: variable 'network_state' from source: role '' defaults 24134 1727096426.73276: Evaluated conditional (network_state != {}): False 24134 1727096426.73284: when evaluation is False, skipping this task 24134 1727096426.73293: _execute() done 24134 1727096426.73301: dumping result to json 24134 1727096426.73447: done dumping result, returning 24134 1727096426.73451: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-1673-d3fc-000000000086] 24134 1727096426.73453: sending task result for task 0afff68d-5257-1673-d3fc-000000000086 24134 1727096426.73521: done sending task result for task 0afff68d-5257-1673-d3fc-000000000086 24134 1727096426.73526: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096426.73574: no more pending results, 
returning what we have 24134 1727096426.73578: results queue empty 24134 1727096426.73579: checking for any_errors_fatal 24134 1727096426.73585: done checking for any_errors_fatal 24134 1727096426.73586: checking for max_fail_percentage 24134 1727096426.73589: done checking for max_fail_percentage 24134 1727096426.73590: checking to see if all hosts have failed and the running result is not ok 24134 1727096426.73591: done checking to see if all hosts have failed 24134 1727096426.73591: getting the remaining hosts for this loop 24134 1727096426.73593: done getting the remaining hosts for this loop 24134 1727096426.73596: getting the next task for host managed_node1 24134 1727096426.73601: done getting next task for host managed_node1 24134 1727096426.73604: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24134 1727096426.73607: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096426.73621: getting variables 24134 1727096426.73622: in VariableManager get_vars() 24134 1727096426.73656: Calling all_inventory to load vars for managed_node1 24134 1727096426.73658: Calling groups_inventory to load vars for managed_node1 24134 1727096426.73661: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096426.73673: Calling all_plugins_play to load vars for managed_node1 24134 1727096426.73676: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096426.73679: Calling groups_plugins_play to load vars for managed_node1 24134 1727096426.76649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096426.80089: done with get_vars() 24134 1727096426.80115: done getting variables 24134 1727096426.80294: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:00:26 -0400 (0:00:00.100) 0:00:31.016 ****** 24134 1727096426.80326: entering _queue_task() for managed_node1/fail 24134 1727096426.81012: worker is 1 (out of 1 available) 24134 1727096426.81024: exiting _queue_task() for managed_node1/fail 24134 1727096426.81035: done queuing things up, now waiting for results queue to drain 24134 1727096426.81036: waiting for pending results... 
24134 1727096426.81344: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24134 1727096426.81399: in run() - task 0afff68d-5257-1673-d3fc-000000000087 24134 1727096426.81417: variable 'ansible_search_path' from source: unknown 24134 1727096426.81424: variable 'ansible_search_path' from source: unknown 24134 1727096426.81474: calling self._execute() 24134 1727096426.81581: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096426.81594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096426.81608: variable 'omit' from source: magic vars 24134 1727096426.82279: variable 'ansible_distribution_major_version' from source: facts 24134 1727096426.82283: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096426.82533: variable 'network_state' from source: role '' defaults 24134 1727096426.82552: Evaluated conditional (network_state != {}): False 24134 1727096426.82560: when evaluation is False, skipping this task 24134 1727096426.82573: _execute() done 24134 1727096426.82584: dumping result to json 24134 1727096426.82593: done dumping result, returning 24134 1727096426.82605: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-1673-d3fc-000000000087] 24134 1727096426.82751: sending task result for task 0afff68d-5257-1673-d3fc-000000000087 24134 1727096426.82825: done sending task result for task 0afff68d-5257-1673-d3fc-000000000087 24134 1727096426.82828: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096426.82906: no more pending results, returning what we have 24134 
1727096426.82910: results queue empty 24134 1727096426.82911: checking for any_errors_fatal 24134 1727096426.82917: done checking for any_errors_fatal 24134 1727096426.82917: checking for max_fail_percentage 24134 1727096426.82919: done checking for max_fail_percentage 24134 1727096426.82920: checking to see if all hosts have failed and the running result is not ok 24134 1727096426.82921: done checking to see if all hosts have failed 24134 1727096426.82922: getting the remaining hosts for this loop 24134 1727096426.82923: done getting the remaining hosts for this loop 24134 1727096426.82927: getting the next task for host managed_node1 24134 1727096426.82934: done getting next task for host managed_node1 24134 1727096426.82938: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24134 1727096426.82940: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096426.82957: getting variables 24134 1727096426.82958: in VariableManager get_vars() 24134 1727096426.83204: Calling all_inventory to load vars for managed_node1 24134 1727096426.83208: Calling groups_inventory to load vars for managed_node1 24134 1727096426.83211: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096426.83223: Calling all_plugins_play to load vars for managed_node1 24134 1727096426.83226: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096426.83228: Calling groups_plugins_play to load vars for managed_node1 24134 1727096426.84974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096426.87495: done with get_vars() 24134 1727096426.87517: done getting variables 24134 1727096426.87580: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:00:26 -0400 (0:00:00.072) 0:00:31.089 ****** 24134 1727096426.87615: entering _queue_task() for managed_node1/fail 24134 1727096426.87965: worker is 1 (out of 1 available) 24134 1727096426.88144: exiting _queue_task() for managed_node1/fail 24134 1727096426.88155: done queuing things up, now waiting for results queue to drain 24134 1727096426.88156: waiting for pending results... 
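
The teaming guard queued above evaluates two conditionals that the log reports explicitly: `ansible_distribution_major_version | int > 9` (True) and `ansible_distribution in __network_rh_distros` (True). A hedged sketch of how such a task is plausibly structured (the two conditions are confirmed by the "Evaluated conditional" lines; any team-interface detection condition would be additional and is omitted here as unconfirmed):

```yaml
# Hypothetical sketch of the EL10 teaming guard at
# roles/network/tasks/main.yml:25. The two `when` items below are
# the conditionals the log shows being evaluated; the real task may
# also test whether any connection actually uses teaming.
- name: Abort applying teaming configuration if the system version of
    the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on EL10 or later
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
```
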
24134 1727096426.88374: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24134 1727096426.88436: in run() - task 0afff68d-5257-1673-d3fc-000000000088 24134 1727096426.88457: variable 'ansible_search_path' from source: unknown 24134 1727096426.88476: variable 'ansible_search_path' from source: unknown 24134 1727096426.88521: calling self._execute() 24134 1727096426.88686: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096426.88690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096426.88694: variable 'omit' from source: magic vars 24134 1727096426.89066: variable 'ansible_distribution_major_version' from source: facts 24134 1727096426.89086: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096426.89341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096426.92378: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096426.92512: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096426.92516: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096426.92542: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096426.92578: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096426.92662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096426.92701: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096426.92736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096426.92783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096426.92802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096426.92907: variable 'ansible_distribution_major_version' from source: facts 24134 1727096426.92947: Evaluated conditional (ansible_distribution_major_version | int > 9): True 24134 1727096426.93059: variable 'ansible_distribution' from source: facts 24134 1727096426.93072: variable '__network_rh_distros' from source: role '' defaults 24134 1727096426.93166: Evaluated conditional (ansible_distribution in __network_rh_distros): True 24134 1727096426.93363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096426.93403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096426.93432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 
1727096426.93479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096426.93506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096426.93570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096426.93610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096426.93640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096426.93690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096426.93717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096426.93761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096426.93821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 24134 1727096426.93830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096426.93874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096426.93935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096426.94275: variable 'network_connections' from source: play vars 24134 1727096426.94366: variable 'profile' from source: play vars 24134 1727096426.94375: variable 'profile' from source: play vars 24134 1727096426.94386: variable 'interface' from source: set_fact 24134 1727096426.94446: variable 'interface' from source: set_fact 24134 1727096426.94492: variable 'network_state' from source: role '' defaults 24134 1727096426.94703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096426.94911: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096426.94953: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096426.94992: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096426.95042: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096426.95110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096426.95177: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096426.95180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096426.95200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096426.95245: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 24134 1727096426.95253: when evaluation is False, skipping this task 24134 1727096426.95261: _execute() done 24134 1727096426.95271: dumping result to json 24134 1727096426.95280: done dumping result, returning 24134 1727096426.95291: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-1673-d3fc-000000000088] 24134 1727096426.95299: sending task result for task 0afff68d-5257-1673-d3fc-000000000088 24134 1727096426.95420: done sending task result for task 0afff68d-5257-1673-d3fc-000000000088 24134 1727096426.95423: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 24134 
1727096426.95597: no more pending results, returning what we have 24134 1727096426.95601: results queue empty 24134 1727096426.95602: checking for any_errors_fatal 24134 1727096426.95610: done checking for any_errors_fatal 24134 1727096426.95611: checking for max_fail_percentage 24134 1727096426.95612: done checking for max_fail_percentage 24134 1727096426.95613: checking to see if all hosts have failed and the running result is not ok 24134 1727096426.95614: done checking to see if all hosts have failed 24134 1727096426.95615: getting the remaining hosts for this loop 24134 1727096426.95616: done getting the remaining hosts for this loop 24134 1727096426.95620: getting the next task for host managed_node1 24134 1727096426.95626: done getting next task for host managed_node1 24134 1727096426.95631: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24134 1727096426.95633: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096426.95647: getting variables 24134 1727096426.95648: in VariableManager get_vars() 24134 1727096426.95694: Calling all_inventory to load vars for managed_node1 24134 1727096426.95699: Calling groups_inventory to load vars for managed_node1 24134 1727096426.95704: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096426.95715: Calling all_plugins_play to load vars for managed_node1 24134 1727096426.95719: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096426.95722: Calling groups_plugins_play to load vars for managed_node1 24134 1727096426.97293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096426.98660: done with get_vars() 24134 1727096426.98686: done getting variables 24134 1727096426.98746: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:00:26 -0400 (0:00:00.111) 0:00:31.201 ****** 24134 1727096426.98776: entering _queue_task() for managed_node1/dnf 24134 1727096426.99139: worker is 1 (out of 1 available) 24134 1727096426.99152: exiting _queue_task() for managed_node1/dnf 24134 1727096426.99164: done queuing things up, now waiting for results queue to drain 24134 1727096426.99165: waiting for pending results... 
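The teaming-abort task above was skipped because its conditional evaluated to False: it filters `network_connections` (and `network_state["interfaces"]`) for entries whose `type` matches `^team$`. The equivalent logic can be sketched in plain Python; the function and sample variable names here are illustrative, not part of the role itself:

```python
import re

def has_team_interfaces(network_connections, network_state):
    """Mimic the Jinja2 conditional from the log:
    network_connections | selectattr("type", "defined")
                        | selectattr("type", "match", "^team$") | list | length > 0
    or the same filter chain applied to network_state["interfaces"]."""
    def matches(items):
        return any(
            re.match(r"^team$", item["type"]) is not None
            for item in items
            if "type" in item  # selectattr("type", "defined")
        )
    return matches(network_connections) or matches(network_state.get("interfaces", []))

# Example mirroring this run: a plain ethernet profile yields False,
# so the abort task is skipped.
conns = [{"name": "profile0", "type": "ethernet"}]
print(has_team_interfaces(conns, {}))  # False
```

Note that `match` here is Ansible's regex test (anchored at the start of the string), which is why the role uses the explicit `^team$` pattern rather than a substring check.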
24134 1727096426.99601: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24134 1727096426.99821: in run() - task 0afff68d-5257-1673-d3fc-000000000089 24134 1727096426.99825: variable 'ansible_search_path' from source: unknown 24134 1727096426.99828: variable 'ansible_search_path' from source: unknown 24134 1727096426.99831: calling self._execute() 24134 1727096426.99874: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096426.99949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096426.99966: variable 'omit' from source: magic vars 24134 1727096427.00379: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.00398: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096427.00595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096427.02328: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096427.02756: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096427.02794: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096427.02820: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096427.02840: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096427.02905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.02927: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.02953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.02987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.02998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.03084: variable 'ansible_distribution' from source: facts 24134 1727096427.03088: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.03098: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 24134 1727096427.03178: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096427.03262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.03283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.03303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.03331: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.03341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.03371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.03390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.03408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.03434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.03445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.03475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.03492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 
1727096427.03509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.03537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.03548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.03648: variable 'network_connections' from source: play vars 24134 1727096427.03658: variable 'profile' from source: play vars 24134 1727096427.03709: variable 'profile' from source: play vars 24134 1727096427.03712: variable 'interface' from source: set_fact 24134 1727096427.03757: variable 'interface' from source: set_fact 24134 1727096427.03810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096427.03924: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096427.03953: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096427.03979: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096427.04012: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096427.04043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096427.04059: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096427.04086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.04103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096427.04139: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096427.04290: variable 'network_connections' from source: play vars 24134 1727096427.04294: variable 'profile' from source: play vars 24134 1727096427.04337: variable 'profile' from source: play vars 24134 1727096427.04342: variable 'interface' from source: set_fact 24134 1727096427.04386: variable 'interface' from source: set_fact 24134 1727096427.04405: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24134 1727096427.04409: when evaluation is False, skipping this task 24134 1727096427.04411: _execute() done 24134 1727096427.04413: dumping result to json 24134 1727096427.04415: done dumping result, returning 24134 1727096427.04425: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-000000000089] 24134 1727096427.04427: sending task result for task 0afff68d-5257-1673-d3fc-000000000089 24134 1727096427.04512: done sending task result for task 0afff68d-5257-1673-d3fc-000000000089 24134 1727096427.04514: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 24134 1727096427.04576: no more pending results, returning what we have 24134 1727096427.04579: results queue empty 24134 1727096427.04580: checking for any_errors_fatal 24134 1727096427.04584: done checking for any_errors_fatal 24134 1727096427.04585: checking for max_fail_percentage 24134 1727096427.04587: done checking for max_fail_percentage 24134 1727096427.04588: checking to see if all hosts have failed and the running result is not ok 24134 1727096427.04589: done checking to see if all hosts have failed 24134 1727096427.04589: getting the remaining hosts for this loop 24134 1727096427.04591: done getting the remaining hosts for this loop 24134 1727096427.04594: getting the next task for host managed_node1 24134 1727096427.04600: done getting next task for host managed_node1 24134 1727096427.04603: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24134 1727096427.04605: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096427.04618: getting variables 24134 1727096427.04619: in VariableManager get_vars() 24134 1727096427.04658: Calling all_inventory to load vars for managed_node1 24134 1727096427.04661: Calling groups_inventory to load vars for managed_node1 24134 1727096427.04663: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096427.04674: Calling all_plugins_play to load vars for managed_node1 24134 1727096427.04676: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096427.04679: Calling groups_plugins_play to load vars for managed_node1 24134 1727096427.05645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096427.06515: done with get_vars() 24134 1727096427.06532: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24134 1727096427.06588: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:00:27 -0400 (0:00:00.078) 0:00:31.279 ****** 24134 1727096427.06611: entering _queue_task() for managed_node1/yum 24134 1727096427.06865: worker is 1 (out of 1 available) 24134 1727096427.06879: exiting _queue_task() for managed_node1/yum 24134 1727096427.06891: done queuing things up, now waiting for results queue to drain 24134 1727096427.06893: waiting for pending results... 
24134 1727096427.07085: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24134 1727096427.07162: in run() - task 0afff68d-5257-1673-d3fc-00000000008a 24134 1727096427.07178: variable 'ansible_search_path' from source: unknown 24134 1727096427.07182: variable 'ansible_search_path' from source: unknown 24134 1727096427.07213: calling self._execute() 24134 1727096427.07287: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096427.07291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096427.07302: variable 'omit' from source: magic vars 24134 1727096427.07591: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.07600: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096427.07721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096427.09241: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096427.09296: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096427.09323: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096427.09348: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096427.09371: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096427.09432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.09453: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.09472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.09505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.09515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.09590: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.09603: Evaluated conditional (ansible_distribution_major_version | int < 8): False 24134 1727096427.09606: when evaluation is False, skipping this task 24134 1727096427.09609: _execute() done 24134 1727096427.09612: dumping result to json 24134 1727096427.09614: done dumping result, returning 24134 1727096427.09622: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-00000000008a] 24134 1727096427.09627: sending task result for task 0afff68d-5257-1673-d3fc-00000000008a 24134 1727096427.09713: done sending task result for task 0afff68d-5257-1673-d3fc-00000000008a 24134 1727096427.09716: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 24134 1727096427.09765: no more pending results, returning 
what we have 24134 1727096427.09770: results queue empty 24134 1727096427.09771: checking for any_errors_fatal 24134 1727096427.09778: done checking for any_errors_fatal 24134 1727096427.09779: checking for max_fail_percentage 24134 1727096427.09780: done checking for max_fail_percentage 24134 1727096427.09781: checking to see if all hosts have failed and the running result is not ok 24134 1727096427.09782: done checking to see if all hosts have failed 24134 1727096427.09782: getting the remaining hosts for this loop 24134 1727096427.09784: done getting the remaining hosts for this loop 24134 1727096427.09787: getting the next task for host managed_node1 24134 1727096427.09793: done getting next task for host managed_node1 24134 1727096427.09797: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24134 1727096427.09799: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096427.09812: getting variables 24134 1727096427.09813: in VariableManager get_vars() 24134 1727096427.09850: Calling all_inventory to load vars for managed_node1 24134 1727096427.09853: Calling groups_inventory to load vars for managed_node1 24134 1727096427.09855: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096427.09864: Calling all_plugins_play to load vars for managed_node1 24134 1727096427.09874: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096427.09878: Calling groups_plugins_play to load vars for managed_node1 24134 1727096427.10703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096427.11595: done with get_vars() 24134 1727096427.11615: done getting variables 24134 1727096427.11658: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:00:27 -0400 (0:00:00.050) 0:00:31.330 ****** 24134 1727096427.11683: entering _queue_task() for managed_node1/fail 24134 1727096427.11937: worker is 1 (out of 1 available) 24134 1727096427.11951: exiting _queue_task() for managed_node1/fail 24134 1727096427.11963: done queuing things up, now waiting for results queue to drain 24134 1727096427.11965: waiting for pending results... 
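The YUM variant of the package-update check was skipped because `ansible_distribution_major_version | int < 8` is False on this host (the earlier evaluation showed its major version is greater than 9). A detail worth noting: distribution facts are strings, so the `!= '6'` gate is a plain string comparison, while the numeric gates only work because of the `| int` cast. A minimal Python sketch of both gates (function names are illustrative):

```python
def gate_not_el6(major: str) -> bool:
    # Mirrors: ansible_distribution_major_version != '6'  (string comparison)
    return major != "6"

def gate_needs_yum(major: str) -> bool:
    # Mirrors: ansible_distribution_major_version | int < 8
    # Without the int cast, "10" < "8" would compare lexicographically
    # and wrongly evaluate True.
    return int(major) < 8

major = "10"
print(gate_not_el6(major), gate_needs_yum(major))  # True False
```

This also explains the `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line just above: on modern hosts the `yum` action is served by the `dnf` implementation, but the version gate still skips the task entirely on EL8 and later.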
24134 1727096427.12146: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24134 1727096427.12230: in run() - task 0afff68d-5257-1673-d3fc-00000000008b 24134 1727096427.12242: variable 'ansible_search_path' from source: unknown 24134 1727096427.12246: variable 'ansible_search_path' from source: unknown 24134 1727096427.12278: calling self._execute() 24134 1727096427.12353: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096427.12357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096427.12366: variable 'omit' from source: magic vars 24134 1727096427.12645: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.12656: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096427.12738: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096427.12882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096427.14589: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096427.14636: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096427.14674: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096427.14699: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096427.14721: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096427.14784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24134 1727096427.14804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.14824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.14850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.14861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.14896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.14912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.14935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.14959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.14973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.14999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.15015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.15039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.15061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.15075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.15187: variable 'network_connections' from source: play vars 24134 1727096427.15198: variable 'profile' from source: play vars 24134 1727096427.15250: variable 'profile' from source: play vars 24134 1727096427.15254: variable 'interface' from source: set_fact 24134 1727096427.15301: variable 'interface' from source: set_fact 24134 1727096427.15351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096427.15460: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096427.15491: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096427.15514: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096427.15543: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096427.15587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096427.15591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096427.15611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.15629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096427.15666: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096427.15816: variable 'network_connections' from source: play vars 24134 1727096427.15821: variable 'profile' from source: play vars 24134 1727096427.15865: variable 'profile' from source: play vars 24134 1727096427.15873: variable 'interface' from source: set_fact 24134 1727096427.15912: variable 'interface' from source: set_fact 24134 1727096427.15933: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24134 1727096427.15936: when evaluation is False, skipping this task 24134 1727096427.15939: _execute() done 24134 1727096427.15941: dumping result to json 24134 1727096427.15944: done dumping result, returning 24134 1727096427.15951: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-00000000008b] 24134 1727096427.15962: sending task result for task 0afff68d-5257-1673-d3fc-00000000008b 24134 1727096427.16042: done sending task result for task 0afff68d-5257-1673-d3fc-00000000008b 24134 1727096427.16045: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24134 1727096427.16100: no more pending results, returning what we have 24134 1727096427.16104: results queue empty 24134 1727096427.16104: checking for any_errors_fatal 24134 1727096427.16109: done checking for any_errors_fatal 24134 1727096427.16110: checking for max_fail_percentage 24134 1727096427.16111: done checking for max_fail_percentage 24134 1727096427.16112: checking to see if all hosts have failed and the running result is not ok 24134 1727096427.16113: done checking to see if all hosts have failed 24134 1727096427.16114: getting the remaining hosts for this loop 24134 1727096427.16115: done getting the remaining hosts for this loop 24134 1727096427.16119: getting the next task for host managed_node1 24134 1727096427.16124: done getting next task for host managed_node1 24134 1727096427.16128: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 24134 1727096427.16130: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096427.16143: getting variables 24134 1727096427.16144: in VariableManager get_vars() 24134 1727096427.16191: Calling all_inventory to load vars for managed_node1 24134 1727096427.16194: Calling groups_inventory to load vars for managed_node1 24134 1727096427.16196: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096427.16205: Calling all_plugins_play to load vars for managed_node1 24134 1727096427.16208: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096427.16211: Calling groups_plugins_play to load vars for managed_node1 24134 1727096427.17144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096427.18024: done with get_vars() 24134 1727096427.18040: done getting variables 24134 1727096427.18087: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:00:27 -0400 (0:00:00.064) 0:00:31.394 ****** 24134 1727096427.18112: entering _queue_task() for managed_node1/package 24134 1727096427.18358: worker is 1 (out of 1 available) 24134 1727096427.18375: exiting _queue_task() for managed_node1/package 24134 1727096427.18387: done queuing things up, now waiting for results queue to drain 24134 1727096427.18389: waiting for pending results... 
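The consent task above was skipped because the log shows the conditional `__network_wireless_connections_defined or __network_team_connections_defined` evaluating to False. The actual role file at `roles/network/tasks/main.yml:60` is not shown in this log, but a task producing exactly this skip pattern would look roughly like the following sketch (message text is illustrative, not taken from the role):

```yaml
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      Managing wireless or team connections requires restarting NetworkManager;
      set network_allow_restart to confirm.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

When the `when` expression is False, the action module (`fail`, loaded in the log above) never executes, and the controller emits the `skipping: [managed_node1]` result with `skip_reason: "Conditional result was False"`.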
24134 1727096427.18567: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 24134 1727096427.18650: in run() - task 0afff68d-5257-1673-d3fc-00000000008c 24134 1727096427.18664: variable 'ansible_search_path' from source: unknown 24134 1727096427.18669: variable 'ansible_search_path' from source: unknown 24134 1727096427.18704: calling self._execute() 24134 1727096427.18778: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096427.18782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096427.18792: variable 'omit' from source: magic vars 24134 1727096427.19073: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.19084: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096427.19217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096427.19411: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096427.19442: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096427.19471: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096427.19531: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096427.19613: variable 'network_packages' from source: role '' defaults 24134 1727096427.19688: variable '__network_provider_setup' from source: role '' defaults 24134 1727096427.19698: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096427.19745: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096427.19752: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096427.19802: variable 
'__network_packages_default_nm' from source: role '' defaults 24134 1727096427.19917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096427.21233: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096427.21276: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096427.21304: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096427.21329: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096427.21357: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096427.21419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.21441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.21461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.21492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.21503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 
1727096427.21540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.21553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.21571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.21598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.21609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.21756: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24134 1727096427.21831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.21848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.21874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.21897: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.21908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.21971: variable 'ansible_python' from source: facts 24134 1727096427.21990: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24134 1727096427.22045: variable '__network_wpa_supplicant_required' from source: role '' defaults 24134 1727096427.22103: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24134 1727096427.22183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.22205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.22220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.22245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.22256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.22290: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.22311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.22328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.22352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.22362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.22459: variable 'network_connections' from source: play vars 24134 1727096427.22462: variable 'profile' from source: play vars 24134 1727096427.22535: variable 'profile' from source: play vars 24134 1727096427.22544: variable 'interface' from source: set_fact 24134 1727096427.22597: variable 'interface' from source: set_fact 24134 1727096427.22651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096427.22671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096427.22697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.22717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096427.22754: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096427.22933: variable 'network_connections' from source: play vars 24134 1727096427.22936: variable 'profile' from source: play vars 24134 1727096427.23012: variable 'profile' from source: play vars 24134 1727096427.23018: variable 'interface' from source: set_fact 24134 1727096427.23069: variable 'interface' from source: set_fact 24134 1727096427.23096: variable '__network_packages_default_wireless' from source: role '' defaults 24134 1727096427.23149: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096427.23345: variable 'network_connections' from source: play vars 24134 1727096427.23349: variable 'profile' from source: play vars 24134 1727096427.23399: variable 'profile' from source: play vars 24134 1727096427.23403: variable 'interface' from source: set_fact 24134 1727096427.23471: variable 'interface' from source: set_fact 24134 1727096427.23492: variable '__network_packages_default_team' from source: role '' defaults 24134 1727096427.23548: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096427.23740: variable 'network_connections' from source: play vars 24134 1727096427.23748: variable 'profile' from source: play vars 24134 1727096427.23795: variable 'profile' from source: play vars 24134 1727096427.23799: variable 'interface' from source: set_fact 24134 1727096427.23874: variable 'interface' from source: set_fact 24134 1727096427.23912: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096427.23960: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096427.23963: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096427.24005: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096427.24138: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24134 1727096427.24443: variable 'network_connections' from source: play vars 24134 1727096427.24446: variable 'profile' from source: play vars 24134 1727096427.24491: variable 'profile' from source: play vars 24134 1727096427.24494: variable 'interface' from source: set_fact 24134 1727096427.24541: variable 'interface' from source: set_fact 24134 1727096427.24548: variable 'ansible_distribution' from source: facts 24134 1727096427.24551: variable '__network_rh_distros' from source: role '' defaults 24134 1727096427.24556: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.24571: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24134 1727096427.24677: variable 'ansible_distribution' from source: facts 24134 1727096427.24680: variable '__network_rh_distros' from source: role '' defaults 24134 1727096427.24685: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.24695: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24134 1727096427.24801: variable 'ansible_distribution' from source: facts 24134 1727096427.24805: variable '__network_rh_distros' from source: role '' defaults 24134 1727096427.24807: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.24835: variable 'network_provider' from source: set_fact 24134 1727096427.24847: variable 'ansible_facts' from source: unknown 24134 1727096427.25379: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 24134 
1727096427.25383: when evaluation is False, skipping this task 24134 1727096427.25385: _execute() done 24134 1727096427.25387: dumping result to json 24134 1727096427.25389: done dumping result, returning 24134 1727096427.25398: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-1673-d3fc-00000000008c] 24134 1727096427.25400: sending task result for task 0afff68d-5257-1673-d3fc-00000000008c 24134 1727096427.25487: done sending task result for task 0afff68d-5257-1673-d3fc-00000000008c 24134 1727096427.25490: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24134 1727096427.25544: no more pending results, returning what we have 24134 1727096427.25547: results queue empty 24134 1727096427.25548: checking for any_errors_fatal 24134 1727096427.25554: done checking for any_errors_fatal 24134 1727096427.25555: checking for max_fail_percentage 24134 1727096427.25556: done checking for max_fail_percentage 24134 1727096427.25557: checking to see if all hosts have failed and the running result is not ok 24134 1727096427.25558: done checking to see if all hosts have failed 24134 1727096427.25559: getting the remaining hosts for this loop 24134 1727096427.25560: done getting the remaining hosts for this loop 24134 1727096427.25564: getting the next task for host managed_node1 24134 1727096427.25574: done getting next task for host managed_node1 24134 1727096427.25578: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24134 1727096427.25580: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 24134 1727096427.25594: getting variables 24134 1727096427.25595: in VariableManager get_vars() 24134 1727096427.25632: Calling all_inventory to load vars for managed_node1 24134 1727096427.25635: Calling groups_inventory to load vars for managed_node1 24134 1727096427.25637: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096427.25651: Calling all_plugins_play to load vars for managed_node1 24134 1727096427.25654: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096427.25656: Calling groups_plugins_play to load vars for managed_node1 24134 1727096427.26503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096427.27508: done with get_vars() 24134 1727096427.27527: done getting variables 24134 1727096427.27575: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:00:27 -0400 (0:00:00.094) 0:00:31.489 ****** 24134 1727096427.27598: entering _queue_task() for managed_node1/package 24134 1727096427.27862: worker is 1 (out of 1 available) 24134 1727096427.27878: exiting _queue_task() for managed_node1/package 24134 1727096427.27892: done queuing things up, now waiting for results queue to drain 24134 1727096427.27893: waiting for pending results... 
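The Install packages task above was skipped because the conditional `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False, i.e. every required package was already present in the gathered package facts. The `subset` test lives in the `mathstuff` test plugin that the log shows being loaded. A hypothetical sketch of such a guarded install task (the real task at `roles/network/tasks/main.yml:73` may differ):

```yaml
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - not network_packages is subset(ansible_facts.packages.keys())
```

Guarding on the package facts this way avoids invoking the package manager at all when nothing needs installing, which is why the log records a skip rather than an `ok` from the `package` action.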
24134 1727096427.28077: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24134 1727096427.28156: in run() - task 0afff68d-5257-1673-d3fc-00000000008d 24134 1727096427.28172: variable 'ansible_search_path' from source: unknown 24134 1727096427.28176: variable 'ansible_search_path' from source: unknown 24134 1727096427.28205: calling self._execute() 24134 1727096427.28284: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096427.28289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096427.28297: variable 'omit' from source: magic vars 24134 1727096427.28641: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.28645: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096427.28671: variable 'network_state' from source: role '' defaults 24134 1727096427.28678: Evaluated conditional (network_state != {}): False 24134 1727096427.28681: when evaluation is False, skipping this task 24134 1727096427.28683: _execute() done 24134 1727096427.28686: dumping result to json 24134 1727096427.28690: done dumping result, returning 24134 1727096427.28699: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-1673-d3fc-00000000008d] 24134 1727096427.28703: sending task result for task 0afff68d-5257-1673-d3fc-00000000008d 24134 1727096427.28795: done sending task result for task 0afff68d-5257-1673-d3fc-00000000008d 24134 1727096427.28797: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096427.28847: no more pending results, returning what we have 24134 1727096427.28850: results queue empty 24134 1727096427.28850: checking 
for any_errors_fatal 24134 1727096427.28855: done checking for any_errors_fatal 24134 1727096427.28856: checking for max_fail_percentage 24134 1727096427.28858: done checking for max_fail_percentage 24134 1727096427.28859: checking to see if all hosts have failed and the running result is not ok 24134 1727096427.28859: done checking to see if all hosts have failed 24134 1727096427.28860: getting the remaining hosts for this loop 24134 1727096427.28861: done getting the remaining hosts for this loop 24134 1727096427.28865: getting the next task for host managed_node1 24134 1727096427.28875: done getting next task for host managed_node1 24134 1727096427.28878: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24134 1727096427.28881: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096427.28895: getting variables 24134 1727096427.28897: in VariableManager get_vars() 24134 1727096427.28938: Calling all_inventory to load vars for managed_node1 24134 1727096427.28941: Calling groups_inventory to load vars for managed_node1 24134 1727096427.28943: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096427.28952: Calling all_plugins_play to load vars for managed_node1 24134 1727096427.28954: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096427.28956: Calling groups_plugins_play to load vars for managed_node1 24134 1727096427.29755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096427.31059: done with get_vars() 24134 1727096427.31095: done getting variables 24134 1727096427.31160: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:00:27 -0400 (0:00:00.035) 0:00:31.525 ****** 24134 1727096427.31200: entering _queue_task() for managed_node1/package 24134 1727096427.31585: worker is 1 (out of 1 available) 24134 1727096427.31598: exiting _queue_task() for managed_node1/package 24134 1727096427.31612: done queuing things up, now waiting for results queue to drain 24134 1727096427.31613: waiting for pending results... 
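Both nmstate-related install tasks above were skipped on the same conditional, `network_state != {}`: with `network_state` left at its role default of an empty dict, the nmstate provider is not in use and its packages are not needed. A rough sketch of how such a gated task might be written (package list inferred from the task name, not from the role file):

```yaml
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```

The subsequent `Install python3-libnmstate` task queued in the log is gated on the same `network_state != {}` expression, so it skips for the same reason.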
24134 1727096427.32094: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24134 1727096427.32104: in run() - task 0afff68d-5257-1673-d3fc-00000000008e 24134 1727096427.32108: variable 'ansible_search_path' from source: unknown 24134 1727096427.32111: variable 'ansible_search_path' from source: unknown 24134 1727096427.32124: calling self._execute() 24134 1727096427.32240: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096427.32254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096427.32276: variable 'omit' from source: magic vars 24134 1727096427.32772: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.32776: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096427.32839: variable 'network_state' from source: role '' defaults 24134 1727096427.32857: Evaluated conditional (network_state != {}): False 24134 1727096427.32865: when evaluation is False, skipping this task 24134 1727096427.32881: _execute() done 24134 1727096427.32889: dumping result to json 24134 1727096427.32966: done dumping result, returning 24134 1727096427.32982: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-1673-d3fc-00000000008e] 24134 1727096427.32986: sending task result for task 0afff68d-5257-1673-d3fc-00000000008e 24134 1727096427.33065: done sending task result for task 0afff68d-5257-1673-d3fc-00000000008e 24134 1727096427.33194: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096427.33328: no more pending results, returning what we have 24134 1727096427.33333: results queue empty 24134 1727096427.33333: checking for 
any_errors_fatal 24134 1727096427.33341: done checking for any_errors_fatal 24134 1727096427.33342: checking for max_fail_percentage 24134 1727096427.33344: done checking for max_fail_percentage 24134 1727096427.33345: checking to see if all hosts have failed and the running result is not ok 24134 1727096427.33346: done checking to see if all hosts have failed 24134 1727096427.33347: getting the remaining hosts for this loop 24134 1727096427.33348: done getting the remaining hosts for this loop 24134 1727096427.33353: getting the next task for host managed_node1 24134 1727096427.33359: done getting next task for host managed_node1 24134 1727096427.33363: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24134 1727096427.33366: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096427.33388: getting variables 24134 1727096427.33390: in VariableManager get_vars() 24134 1727096427.33432: Calling all_inventory to load vars for managed_node1 24134 1727096427.33436: Calling groups_inventory to load vars for managed_node1 24134 1727096427.33439: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096427.33451: Calling all_plugins_play to load vars for managed_node1 24134 1727096427.33456: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096427.33459: Calling groups_plugins_play to load vars for managed_node1 24134 1727096427.35753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096427.36635: done with get_vars() 24134 1727096427.36654: done getting variables 24134 1727096427.36702: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:00:27 -0400 (0:00:00.055) 0:00:31.580 ****** 24134 1727096427.36730: entering _queue_task() for managed_node1/service 24134 1727096427.37234: worker is 1 (out of 1 available) 24134 1727096427.37245: exiting _queue_task() for managed_node1/service 24134 1727096427.37256: done queuing things up, now waiting for results queue to drain 24134 1727096427.37257: waiting for pending results... 
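The `service` task queued above (`tasks/main.yml:109`) restarts NetworkManager only when wireless or team profiles are defined. In outline, assuming the variable names shown in the trace, it resembles:

```yaml
# Sketch based on the task name and false_condition in the trace;
# the role's actual implementation may differ in detail.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

The long run of `Loading FilterModule` / `Loading TestModule` lines that follows is the Jinja2 plugin loader resolving filters and tests needed to evaluate these two conditionals against `network_connections`; both come back False for this profile, so the task is skipped.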
24134 1727096427.37888: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24134 1727096427.37893: in run() - task 0afff68d-5257-1673-d3fc-00000000008f 24134 1727096427.37897: variable 'ansible_search_path' from source: unknown 24134 1727096427.37899: variable 'ansible_search_path' from source: unknown 24134 1727096427.37982: calling self._execute() 24134 1727096427.38079: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096427.38091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096427.38104: variable 'omit' from source: magic vars 24134 1727096427.38470: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.38490: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096427.38612: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096427.38812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096427.40980: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096427.41063: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096427.41112: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096427.41154: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096427.41191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096427.41280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 24134 1727096427.41316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.41347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.41396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.41416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.41575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.41579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.41581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.41583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.41585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.41628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.41657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.41692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.41740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.41760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.41943: variable 'network_connections' from source: play vars 24134 1727096427.41962: variable 'profile' from source: play vars 24134 1727096427.42039: variable 'profile' from source: play vars 24134 1727096427.42048: variable 'interface' from source: set_fact 24134 1727096427.42116: variable 'interface' from source: set_fact 24134 1727096427.42199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096427.42384: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096427.42425: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096427.42460: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096427.42501: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096427.42548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096427.42775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096427.42778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.42781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096427.42784: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096427.42938: variable 'network_connections' from source: play vars 24134 1727096427.42948: variable 'profile' from source: play vars 24134 1727096427.43019: variable 'profile' from source: play vars 24134 1727096427.43029: variable 'interface' from source: set_fact 24134 1727096427.43097: variable 'interface' from source: set_fact 24134 1727096427.43133: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24134 1727096427.43141: when evaluation is False, skipping this task 24134 1727096427.43149: _execute() done 24134 1727096427.43157: dumping result to json 24134 1727096427.43164: done dumping result, returning 24134 1727096427.43183: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0afff68d-5257-1673-d3fc-00000000008f] 24134 1727096427.43203: sending task result for task 0afff68d-5257-1673-d3fc-00000000008f skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24134 1727096427.43384: no more pending results, returning what we have 24134 1727096427.43388: results queue empty 24134 1727096427.43389: checking for any_errors_fatal 24134 1727096427.43394: done checking for any_errors_fatal 24134 1727096427.43395: checking for max_fail_percentage 24134 1727096427.43397: done checking for max_fail_percentage 24134 1727096427.43398: checking to see if all hosts have failed and the running result is not ok 24134 1727096427.43399: done checking to see if all hosts have failed 24134 1727096427.43400: getting the remaining hosts for this loop 24134 1727096427.43401: done getting the remaining hosts for this loop 24134 1727096427.43406: getting the next task for host managed_node1 24134 1727096427.43412: done getting next task for host managed_node1 24134 1727096427.43416: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24134 1727096427.43418: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096427.43432: getting variables 24134 1727096427.43434: in VariableManager get_vars() 24134 1727096427.43479: Calling all_inventory to load vars for managed_node1 24134 1727096427.43483: Calling groups_inventory to load vars for managed_node1 24134 1727096427.43485: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096427.43496: Calling all_plugins_play to load vars for managed_node1 24134 1727096427.43499: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096427.43502: Calling groups_plugins_play to load vars for managed_node1 24134 1727096427.44182: done sending task result for task 0afff68d-5257-1673-d3fc-00000000008f 24134 1727096427.44185: WORKER PROCESS EXITING 24134 1727096427.45176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096427.46745: done with get_vars() 24134 1727096427.46771: done getting variables 24134 1727096427.46830: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:00:27 -0400 (0:00:00.101) 0:00:31.682 ****** 24134 1727096427.46861: entering _queue_task() for managed_node1/service 24134 1727096427.47180: worker is 1 (out of 1 available) 24134 1727096427.47191: exiting _queue_task() for managed_node1/service 24134 1727096427.47203: done queuing things up, now waiting for results queue to drain 24134 1727096427.47204: waiting for pending results... 
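Unlike the two preceding tasks, the "Enable and start NetworkManager" task (`tasks/main.yml:122`) has its conditional evaluate True (`network_provider == "nm"`), so the trace below proceeds all the way through connection setup and remote command execution. A hedged sketch of the task, using the variable names the trace resolves (`network_service_name` defaults to `NetworkManager` for the `nm` provider):

```yaml
# Outline only, inferred from the evaluated conditional and resolved
# variables in the trace; not the role's verbatim task.
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

After the handler starts, the `_low_level_execute_command()` lines show the standard remote bootstrap: an `echo ~` over a multiplexed SSH connection (note the `auto-mux: Trying existing master` ControlMaster reuse in the stderr chunks) to discover the remote home directory, followed by creation of a per-task `~/.ansible/tmp/ansible-tmp-*` directory for the module payload.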
24134 1727096427.47484: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24134 1727096427.47593: in run() - task 0afff68d-5257-1673-d3fc-000000000090 24134 1727096427.47616: variable 'ansible_search_path' from source: unknown 24134 1727096427.47624: variable 'ansible_search_path' from source: unknown 24134 1727096427.47659: calling self._execute() 24134 1727096427.47761: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096427.47778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096427.47791: variable 'omit' from source: magic vars 24134 1727096427.48172: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.48191: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096427.48360: variable 'network_provider' from source: set_fact 24134 1727096427.48471: variable 'network_state' from source: role '' defaults 24134 1727096427.48475: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24134 1727096427.48478: variable 'omit' from source: magic vars 24134 1727096427.48480: variable 'omit' from source: magic vars 24134 1727096427.48482: variable 'network_service_name' from source: role '' defaults 24134 1727096427.48544: variable 'network_service_name' from source: role '' defaults 24134 1727096427.48663: variable '__network_provider_setup' from source: role '' defaults 24134 1727096427.48684: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096427.48748: variable '__network_service_name_default_nm' from source: role '' defaults 24134 1727096427.48762: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096427.48833: variable '__network_packages_default_nm' from source: role '' defaults 24134 1727096427.49061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 24134 1727096427.51535: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096427.51606: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096427.51662: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096427.51706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096427.51746: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096427.51936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.51940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.51942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.51945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.51972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.52022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24134 1727096427.52050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.52088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.52131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.52153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.52398: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24134 1727096427.52516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.52539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.52562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.52607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.52624: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.52716: variable 'ansible_python' from source: facts 24134 1727096427.52740: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24134 1727096427.52976: variable '__network_wpa_supplicant_required' from source: role '' defaults 24134 1727096427.52979: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24134 1727096427.53038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.53075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.53108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.53149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.53166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.53225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096427.53261: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096427.53294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.53339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096427.53357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096427.53508: variable 'network_connections' from source: play vars 24134 1727096427.53520: variable 'profile' from source: play vars 24134 1727096427.53605: variable 'profile' from source: play vars 24134 1727096427.53616: variable 'interface' from source: set_fact 24134 1727096427.53685: variable 'interface' from source: set_fact 24134 1727096427.53803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096427.54011: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096427.54060: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096427.54175: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096427.54178: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096427.54221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096427.54294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096427.54297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096427.54331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096427.54383: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096427.54678: variable 'network_connections' from source: play vars 24134 1727096427.54689: variable 'profile' from source: play vars 24134 1727096427.54766: variable 'profile' from source: play vars 24134 1727096427.54837: variable 'interface' from source: set_fact 24134 1727096427.54847: variable 'interface' from source: set_fact 24134 1727096427.54953: variable '__network_packages_default_wireless' from source: role '' defaults 24134 1727096427.54976: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096427.55281: variable 'network_connections' from source: play vars 24134 1727096427.55291: variable 'profile' from source: play vars 24134 1727096427.55361: variable 'profile' from source: play vars 24134 1727096427.55378: variable 'interface' from source: set_fact 24134 1727096427.55456: variable 'interface' from source: set_fact 24134 1727096427.55493: variable '__network_packages_default_team' from source: role '' defaults 24134 1727096427.55582: variable '__network_team_connections_defined' from source: role '' defaults 24134 1727096427.55874: variable 
'network_connections' from source: play vars 24134 1727096427.55885: variable 'profile' from source: play vars 24134 1727096427.55960: variable 'profile' from source: play vars 24134 1727096427.56044: variable 'interface' from source: set_fact 24134 1727096427.56055: variable 'interface' from source: set_fact 24134 1727096427.56119: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096427.56194: variable '__network_service_name_default_initscripts' from source: role '' defaults 24134 1727096427.56207: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096427.56280: variable '__network_packages_default_initscripts' from source: role '' defaults 24134 1727096427.56508: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24134 1727096427.57032: variable 'network_connections' from source: play vars 24134 1727096427.57042: variable 'profile' from source: play vars 24134 1727096427.57108: variable 'profile' from source: play vars 24134 1727096427.57118: variable 'interface' from source: set_fact 24134 1727096427.57197: variable 'interface' from source: set_fact 24134 1727096427.57236: variable 'ansible_distribution' from source: facts 24134 1727096427.57239: variable '__network_rh_distros' from source: role '' defaults 24134 1727096427.57241: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.57244: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24134 1727096427.57419: variable 'ansible_distribution' from source: facts 24134 1727096427.57427: variable '__network_rh_distros' from source: role '' defaults 24134 1727096427.57453: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.57456: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24134 1727096427.57631: variable 'ansible_distribution' from source: 
facts 24134 1727096427.57674: variable '__network_rh_distros' from source: role '' defaults 24134 1727096427.57678: variable 'ansible_distribution_major_version' from source: facts 24134 1727096427.57693: variable 'network_provider' from source: set_fact 24134 1727096427.57721: variable 'omit' from source: magic vars 24134 1727096427.57751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096427.57791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096427.57975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096427.57978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096427.57981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096427.57983: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096427.57986: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096427.57987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096427.57989: Set connection var ansible_shell_executable to /bin/sh 24134 1727096427.58000: Set connection var ansible_pipelining to False 24134 1727096427.58011: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096427.58028: Set connection var ansible_timeout to 10 24134 1727096427.58036: Set connection var ansible_connection to ssh 24134 1727096427.58042: Set connection var ansible_shell_type to sh 24134 1727096427.58073: variable 'ansible_shell_executable' from source: unknown 24134 1727096427.58082: variable 'ansible_connection' from source: unknown 24134 1727096427.58089: variable 'ansible_module_compression' from source: unknown 24134 1727096427.58095: 
variable 'ansible_shell_type' from source: unknown 24134 1727096427.58105: variable 'ansible_shell_executable' from source: unknown 24134 1727096427.58112: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096427.58124: variable 'ansible_pipelining' from source: unknown 24134 1727096427.58130: variable 'ansible_timeout' from source: unknown 24134 1727096427.58137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096427.58243: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096427.58261: variable 'omit' from source: magic vars 24134 1727096427.58276: starting attempt loop 24134 1727096427.58284: running the handler 24134 1727096427.58373: variable 'ansible_facts' from source: unknown 24134 1727096427.59193: _low_level_execute_command(): starting 24134 1727096427.59207: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096427.59979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096427.60045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096427.61759: stdout chunk (state=3): >>>/root <<< 24134 1727096427.61888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096427.61932: stderr chunk (state=3): >>><<< 24134 1727096427.61955: stdout chunk (state=3): >>><<< 24134 1727096427.61978: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096427.62081: 
_low_level_execute_command(): starting 24134 1727096427.62085: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530 `" && echo ansible-tmp-1727096427.6198497-25620-98484944356530="` echo /root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530 `" ) && sleep 0' 24134 1727096427.62672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096427.62687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096427.62700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096427.62725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096427.62832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096427.62875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096427.62942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096427.64908: stdout chunk (state=3): 
>>>ansible-tmp-1727096427.6198497-25620-98484944356530=/root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530 <<< 24134 1727096427.65075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096427.65078: stdout chunk (state=3): >>><<< 24134 1727096427.65082: stderr chunk (state=3): >>><<< 24134 1727096427.65102: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096427.6198497-25620-98484944356530=/root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096427.65275: variable 'ansible_module_compression' from source: unknown 24134 1727096427.65278: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 24134 1727096427.65281: variable 
'ansible_facts' from source: unknown 24134 1727096427.65489: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/AnsiballZ_systemd.py 24134 1727096427.65635: Sending initial data 24134 1727096427.65737: Sent initial data (155 bytes) 24134 1727096427.66310: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096427.66324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096427.66336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096427.66351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096427.66390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096427.66403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096427.66491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096427.66511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096427.66618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096427.68264: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096427.68364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096427.68440: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmp6kgglqb8 /root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/AnsiballZ_systemd.py <<< 24134 1727096427.68443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/AnsiballZ_systemd.py" <<< 24134 1727096427.68503: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmp6kgglqb8" to remote "/root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/AnsiballZ_systemd.py" <<< 24134 1727096427.70275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096427.70278: stdout chunk (state=3): >>><<< 24134 1727096427.70281: stderr chunk (state=3): >>><<< 24134 1727096427.70283: done transferring module to remote 24134 1727096427.70285: _low_level_execute_command(): starting 24134 1727096427.70287: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/ /root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/AnsiballZ_systemd.py && sleep 0' 24134 1727096427.70884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096427.70896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096427.70962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096427.71024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096427.71039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096427.71073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096427.71182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096427.73099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096427.73133: stderr chunk (state=3): >>><<< 24134 1727096427.73154: stdout chunk (state=3): >>><<< 24134 1727096427.73235: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096427.73240: _low_level_execute_command(): starting 24134 1727096427.73243: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/AnsiballZ_systemd.py && sleep 0' 24134 1727096427.73632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096427.73646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096427.73658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096427.73708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096427.73720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096427.73803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096428.03991: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": 
"[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10678272", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306602496", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "1127950000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": 
"yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 24134 1727096428.03998: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": 
"0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system<<< 24134 1727096428.04016: stdout chunk (state=3): >>>.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", 
"InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24134 1727096428.06053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096428.06085: stderr chunk (state=3): >>><<< 24134 1727096428.06090: stdout chunk (state=3): >>><<< 24134 1727096428.06108: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10678272", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306602496", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "1127950000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096428.06225: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096428.06242: _low_level_execute_command(): starting 24134 1727096428.06247: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096427.6198497-25620-98484944356530/ > /dev/null 2>&1 && sleep 0' 24134 1727096428.06715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096428.06718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096428.06720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096428.06723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096428.06725: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096428.06774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096428.06777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096428.06791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096428.06857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096428.08753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096428.08781: stderr chunk (state=3): >>><<< 24134 1727096428.08784: stdout chunk (state=3): >>><<< 24134 1727096428.08798: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096428.08804: handler run complete 24134 1727096428.08840: attempt loop complete, returning result 24134 1727096428.08843: _execute() done 24134 1727096428.08846: dumping result to json 24134 1727096428.08861: done dumping result, returning 24134 1727096428.08871: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-1673-d3fc-000000000090] 24134 1727096428.08878: sending task result for task 0afff68d-5257-1673-d3fc-000000000090 24134 1727096428.09117: done sending task result for task 0afff68d-5257-1673-d3fc-000000000090 24134 1727096428.09120: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096428.09172: no more pending results, returning what we have 24134 1727096428.09176: results queue empty 24134 1727096428.09177: checking for any_errors_fatal 24134 1727096428.09183: done checking for any_errors_fatal 24134 1727096428.09183: checking for max_fail_percentage 24134 1727096428.09185: done checking for max_fail_percentage 24134 1727096428.09186: checking to see if all hosts have failed and the running result is not ok 24134 1727096428.09187: done checking to see if all hosts have failed 24134 1727096428.09187: getting the remaining hosts for this loop 24134 1727096428.09189: done getting the remaining hosts for this loop 24134 1727096428.09192: getting the next task for host managed_node1 24134 1727096428.09197: done getting next task for host managed_node1 24134 1727096428.09200: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24134 1727096428.09202: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096428.09212: getting variables 24134 1727096428.09213: in VariableManager get_vars() 24134 1727096428.09245: Calling all_inventory to load vars for managed_node1 24134 1727096428.09248: Calling groups_inventory to load vars for managed_node1 24134 1727096428.09250: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096428.09258: Calling all_plugins_play to load vars for managed_node1 24134 1727096428.09261: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096428.09263: Calling groups_plugins_play to load vars for managed_node1 24134 1727096428.10177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096428.11061: done with get_vars() 24134 1727096428.11082: done getting variables 24134 1727096428.11126: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:00:28 -0400 (0:00:00.642) 0:00:32.324 ****** 24134 1727096428.11150: entering _queue_task() for managed_node1/service 24134 1727096428.11415: worker is 1 (out of 1 available) 24134 1727096428.11429: exiting _queue_task() for managed_node1/service 24134 1727096428.11440: done queuing things up, now waiting for results queue to drain 24134 1727096428.11442: waiting for pending results... 
24134 1727096428.11616: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24134 1727096428.11692: in run() - task 0afff68d-5257-1673-d3fc-000000000091 24134 1727096428.11703: variable 'ansible_search_path' from source: unknown 24134 1727096428.11707: variable 'ansible_search_path' from source: unknown 24134 1727096428.11736: calling self._execute() 24134 1727096428.11811: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096428.11815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096428.11861: variable 'omit' from source: magic vars 24134 1727096428.12129: variable 'ansible_distribution_major_version' from source: facts 24134 1727096428.12138: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096428.12223: variable 'network_provider' from source: set_fact 24134 1727096428.12226: Evaluated conditional (network_provider == "nm"): True 24134 1727096428.12289: variable '__network_wpa_supplicant_required' from source: role '' defaults 24134 1727096428.12351: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24134 1727096428.12475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096428.14173: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096428.14177: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096428.14179: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096428.14181: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096428.14183: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096428.14256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096428.14293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096428.14322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096428.14369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096428.14389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096428.14439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096428.14471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096428.14500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096428.14542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096428.14564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096428.14613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096428.14641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096428.14673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096428.14723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096428.14745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096428.14851: variable 'network_connections' from source: play vars 24134 1727096428.14880: variable 'profile' from source: play vars 24134 1727096428.14924: variable 'profile' from source: play vars 24134 1727096428.14927: variable 'interface' from source: set_fact 24134 1727096428.14979: variable 'interface' from source: set_fact 24134 1727096428.15024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24134 1727096428.15136: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24134 1727096428.15162: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24134 1727096428.15186: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24134 1727096428.15211: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24134 1727096428.15243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24134 1727096428.15258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24134 1727096428.15278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096428.15295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24134 1727096428.15335: variable '__network_wireless_connections_defined' from source: role '' defaults 24134 1727096428.15495: variable 'network_connections' from source: play vars 24134 1727096428.15498: variable 'profile' from source: play vars 24134 1727096428.15543: variable 'profile' from source: play vars 24134 1727096428.15547: variable 'interface' from source: set_fact 24134 1727096428.15590: variable 'interface' from source: set_fact 24134 1727096428.15612: Evaluated conditional (__network_wpa_supplicant_required): False 24134 1727096428.15615: when evaluation is False, skipping this task 24134 1727096428.15617: _execute() done 24134 1727096428.15632: dumping result 
to json 24134 1727096428.15635: done dumping result, returning 24134 1727096428.15637: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-1673-d3fc-000000000091] 24134 1727096428.15640: sending task result for task 0afff68d-5257-1673-d3fc-000000000091 24134 1727096428.15722: done sending task result for task 0afff68d-5257-1673-d3fc-000000000091 24134 1727096428.15726: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24134 1727096428.15802: no more pending results, returning what we have 24134 1727096428.15806: results queue empty 24134 1727096428.15806: checking for any_errors_fatal 24134 1727096428.15831: done checking for any_errors_fatal 24134 1727096428.15832: checking for max_fail_percentage 24134 1727096428.15834: done checking for max_fail_percentage 24134 1727096428.15835: checking to see if all hosts have failed and the running result is not ok 24134 1727096428.15835: done checking to see if all hosts have failed 24134 1727096428.15836: getting the remaining hosts for this loop 24134 1727096428.15837: done getting the remaining hosts for this loop 24134 1727096428.15841: getting the next task for host managed_node1 24134 1727096428.15847: done getting next task for host managed_node1 24134 1727096428.15850: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24134 1727096428.15852: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096428.15866: getting variables 24134 1727096428.15871: in VariableManager get_vars() 24134 1727096428.15905: Calling all_inventory to load vars for managed_node1 24134 1727096428.15908: Calling groups_inventory to load vars for managed_node1 24134 1727096428.15910: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096428.15918: Calling all_plugins_play to load vars for managed_node1 24134 1727096428.15920: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096428.15923: Calling groups_plugins_play to load vars for managed_node1 24134 1727096428.17011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096428.18720: done with get_vars() 24134 1727096428.18743: done getting variables 24134 1727096428.18811: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 09:00:28 -0400 (0:00:00.076) 0:00:32.401 ****** 24134 1727096428.18841: entering _queue_task() for managed_node1/service 24134 1727096428.19230: worker is 1 (out of 1 available) 24134 1727096428.19242: exiting _queue_task() for managed_node1/service 24134 1727096428.19253: done queuing things up, now waiting for results queue to drain 24134 1727096428.19254: waiting for pending results... 
24134 1727096428.19565: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 24134 1727096428.19601: in run() - task 0afff68d-5257-1673-d3fc-000000000092 24134 1727096428.19620: variable 'ansible_search_path' from source: unknown 24134 1727096428.19627: variable 'ansible_search_path' from source: unknown 24134 1727096428.19673: calling self._execute() 24134 1727096428.19773: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096428.19787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096428.19800: variable 'omit' from source: magic vars 24134 1727096428.20190: variable 'ansible_distribution_major_version' from source: facts 24134 1727096428.20215: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096428.20343: variable 'network_provider' from source: set_fact 24134 1727096428.20356: Evaluated conditional (network_provider == "initscripts"): False 24134 1727096428.20373: when evaluation is False, skipping this task 24134 1727096428.20376: _execute() done 24134 1727096428.20419: dumping result to json 24134 1727096428.20422: done dumping result, returning 24134 1727096428.20425: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-1673-d3fc-000000000092] 24134 1727096428.20427: sending task result for task 0afff68d-5257-1673-d3fc-000000000092 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24134 1727096428.20574: no more pending results, returning what we have 24134 1727096428.20578: results queue empty 24134 1727096428.20579: checking for any_errors_fatal 24134 1727096428.20593: done checking for any_errors_fatal 24134 1727096428.20594: checking for max_fail_percentage 24134 1727096428.20595: done checking for max_fail_percentage 24134 
1727096428.20596: checking to see if all hosts have failed and the running result is not ok 24134 1727096428.20597: done checking to see if all hosts have failed 24134 1727096428.20598: getting the remaining hosts for this loop 24134 1727096428.20599: done getting the remaining hosts for this loop 24134 1727096428.20603: getting the next task for host managed_node1 24134 1727096428.20608: done getting next task for host managed_node1 24134 1727096428.20611: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24134 1727096428.20614: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096428.20630: getting variables 24134 1727096428.20632: in VariableManager get_vars() 24134 1727096428.20669: Calling all_inventory to load vars for managed_node1 24134 1727096428.20672: Calling groups_inventory to load vars for managed_node1 24134 1727096428.20674: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096428.20684: Calling all_plugins_play to load vars for managed_node1 24134 1727096428.20687: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096428.20689: Calling groups_plugins_play to load vars for managed_node1 24134 1727096428.21383: done sending task result for task 0afff68d-5257-1673-d3fc-000000000092 24134 1727096428.21387: WORKER PROCESS EXITING 24134 1727096428.22372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096428.23984: done with get_vars() 24134 1727096428.24008: done getting variables 24134 1727096428.24071: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:00:28 -0400 (0:00:00.052) 0:00:32.454 ****** 24134 1727096428.24100: entering _queue_task() for managed_node1/copy 24134 1727096428.24410: worker is 1 (out of 1 available) 24134 1727096428.24424: exiting _queue_task() for managed_node1/copy 24134 1727096428.24435: done queuing things up, now waiting for results queue to drain 24134 1727096428.24436: waiting for pending results... 24134 1727096428.24718: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24134 1727096428.24835: in run() - task 0afff68d-5257-1673-d3fc-000000000093 24134 1727096428.24854: variable 'ansible_search_path' from source: unknown 24134 1727096428.24862: variable 'ansible_search_path' from source: unknown 24134 1727096428.24907: calling self._execute() 24134 1727096428.25005: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096428.25017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096428.25029: variable 'omit' from source: magic vars 24134 1727096428.25407: variable 'ansible_distribution_major_version' from source: facts 24134 1727096428.25431: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096428.25557: variable 'network_provider' from source: set_fact 24134 1727096428.25570: Evaluated conditional (network_provider == "initscripts"): False 24134 1727096428.25579: when evaluation is False, skipping this task 24134 1727096428.25586: _execute() done 24134 1727096428.25592: dumping result to json 
24134 1727096428.25599: done dumping result, returning 24134 1727096428.25611: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-1673-d3fc-000000000093] 24134 1727096428.25623: sending task result for task 0afff68d-5257-1673-d3fc-000000000093 skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24134 1727096428.25802: no more pending results, returning what we have 24134 1727096428.25806: results queue empty 24134 1727096428.25807: checking for any_errors_fatal 24134 1727096428.25815: done checking for any_errors_fatal 24134 1727096428.25816: checking for max_fail_percentage 24134 1727096428.25818: done checking for max_fail_percentage 24134 1727096428.25819: checking to see if all hosts have failed and the running result is not ok 24134 1727096428.25819: done checking to see if all hosts have failed 24134 1727096428.25820: getting the remaining hosts for this loop 24134 1727096428.25822: done getting the remaining hosts for this loop 24134 1727096428.25826: getting the next task for host managed_node1 24134 1727096428.25833: done getting next task for host managed_node1 24134 1727096428.25837: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24134 1727096428.25839: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096428.26070: getting variables 24134 1727096428.26072: in VariableManager get_vars() 24134 1727096428.26110: Calling all_inventory to load vars for managed_node1 24134 1727096428.26114: Calling groups_inventory to load vars for managed_node1 24134 1727096428.26116: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096428.26125: Calling all_plugins_play to load vars for managed_node1 24134 1727096428.26129: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096428.26132: Calling groups_plugins_play to load vars for managed_node1 24134 1727096428.26686: done sending task result for task 0afff68d-5257-1673-d3fc-000000000093 24134 1727096428.26689: WORKER PROCESS EXITING 24134 1727096428.27719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096428.29281: done with get_vars() 24134 1727096428.29304: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:00:28 -0400 (0:00:00.052) 0:00:32.507 ****** 24134 1727096428.29389: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24134 1727096428.29714: worker is 1 (out of 1 available) 24134 1727096428.29727: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24134 1727096428.29740: done queuing things up, now waiting for results queue to drain 24134 1727096428.29741: waiting for pending results... 
24134 1727096428.30018: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24134 1727096428.30131: in run() - task 0afff68d-5257-1673-d3fc-000000000094 24134 1727096428.30151: variable 'ansible_search_path' from source: unknown 24134 1727096428.30158: variable 'ansible_search_path' from source: unknown 24134 1727096428.30204: calling self._execute() 24134 1727096428.30306: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096428.30318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096428.30331: variable 'omit' from source: magic vars 24134 1727096428.30707: variable 'ansible_distribution_major_version' from source: facts 24134 1727096428.30723: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096428.30741: variable 'omit' from source: magic vars 24134 1727096428.30781: variable 'omit' from source: magic vars 24134 1727096428.30941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24134 1727096428.33057: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24134 1727096428.33129: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24134 1727096428.33179: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24134 1727096428.33218: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24134 1727096428.33255: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24134 1727096428.33334: variable 'network_provider' from source: set_fact 24134 1727096428.33482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24134 1727096428.33529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24134 1727096428.33560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24134 1727096428.33614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24134 1727096428.33634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24134 1727096428.33716: variable 'omit' from source: magic vars 24134 1727096428.33900: variable 'omit' from source: magic vars 24134 1727096428.33951: variable 'network_connections' from source: play vars 24134 1727096428.33969: variable 'profile' from source: play vars 24134 1727096428.34044: variable 'profile' from source: play vars 24134 1727096428.34054: variable 'interface' from source: set_fact 24134 1727096428.34119: variable 'interface' from source: set_fact 24134 1727096428.34270: variable 'omit' from source: magic vars 24134 1727096428.34285: variable '__lsr_ansible_managed' from source: task vars 24134 1727096428.34352: variable '__lsr_ansible_managed' from source: task vars 24134 1727096428.34614: Loaded config def from plugin (lookup/template) 24134 1727096428.34660: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24134 1727096428.34663: File lookup term: get_ansible_managed.j2 24134 
1727096428.34665: variable 'ansible_search_path' from source: unknown 24134 1727096428.34672: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24134 1727096428.34690: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24134 1727096428.34711: variable 'ansible_search_path' from source: unknown 24134 1727096428.47835: variable 'ansible_managed' from source: unknown 24134 1727096428.48020: variable 'omit' from source: magic vars 24134 1727096428.48024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096428.48048: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096428.48069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096428.48092: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096428.48105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096428.48173: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096428.48177: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096428.48179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096428.48257: Set connection var ansible_shell_executable to /bin/sh 24134 1727096428.48270: Set connection var ansible_pipelining to False 24134 1727096428.48283: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096428.48298: Set connection var ansible_timeout to 10 24134 1727096428.48306: Set connection var ansible_connection to ssh 24134 1727096428.48347: Set connection var ansible_shell_type to sh 24134 1727096428.48350: variable 'ansible_shell_executable' from source: unknown 24134 1727096428.48353: variable 'ansible_connection' from source: unknown 24134 1727096428.48355: variable 'ansible_module_compression' from source: unknown 24134 1727096428.48363: variable 'ansible_shell_type' from source: unknown 24134 1727096428.48374: variable 'ansible_shell_executable' from source: unknown 24134 1727096428.48382: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096428.48390: variable 'ansible_pipelining' from source: unknown 24134 1727096428.48456: variable 'ansible_timeout' from source: unknown 24134 1727096428.48459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096428.48537: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096428.48569: variable 'omit' from source: magic vars 24134 1727096428.48581: starting attempt loop 24134 1727096428.48589: running the handler 24134 1727096428.48603: _low_level_execute_command(): starting 24134 1727096428.48613: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096428.49305: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096428.49326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096428.49447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096428.49534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096428.49604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096428.51326: stdout chunk (state=3): >>>/root <<< 24134 1727096428.51462: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 24134 1727096428.51465: stdout chunk (state=3): >>><<< 24134 1727096428.51466: stderr chunk (state=3): >>><<< 24134 1727096428.51490: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096428.51502: _low_level_execute_command(): starting 24134 1727096428.51507: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089 `" && echo ansible-tmp-1727096428.5149002-25643-230672418588089="` echo /root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089 `" ) && sleep 0' 24134 1727096428.52091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096428.52104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096428.52175: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096428.52189: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096428.52215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096428.52230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096428.52252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096428.52353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096428.54330: stdout chunk (state=3): >>>ansible-tmp-1727096428.5149002-25643-230672418588089=/root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089 <<< 24134 1727096428.54431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096428.54473: stderr chunk (state=3): >>><<< 24134 1727096428.54483: stdout chunk (state=3): >>><<< 24134 1727096428.54500: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096428.5149002-25643-230672418588089=/root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096428.54608: variable 'ansible_module_compression' from source: unknown 24134 1727096428.54612: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 24134 1727096428.54640: variable 'ansible_facts' from source: unknown 24134 1727096428.54929: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/AnsiballZ_network_connections.py 24134 1727096428.54953: Sending initial data 24134 1727096428.54962: Sent initial data (168 bytes) 24134 1727096428.55556: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096428.55606: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096428.55628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096428.55714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096428.55747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096428.55835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096428.57483: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096428.57557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096428.57646: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmp42zixs_c /root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/AnsiballZ_network_connections.py <<< 24134 1727096428.57674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/AnsiballZ_network_connections.py" <<< 24134 1727096428.57740: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmp42zixs_c" to remote "/root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/AnsiballZ_network_connections.py" <<< 24134 1727096428.58856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096428.58859: stderr chunk (state=3): >>><<< 24134 1727096428.58862: stdout chunk (state=3): >>><<< 24134 1727096428.58892: done transferring module to remote 24134 1727096428.58900: _low_level_execute_command(): starting 24134 1727096428.58909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/ /root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/AnsiballZ_network_connections.py && sleep 0' 24134 1727096428.59313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096428.59319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096428.59321: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096428.59323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096428.59325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096428.59376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096428.59382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096428.59450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096428.61327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096428.61346: stderr chunk (state=3): >>><<< 24134 1727096428.61354: stdout chunk (state=3): >>><<< 24134 1727096428.61366: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096428.61373: _low_level_execute_command(): starting 24134 1727096428.61376: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/AnsiballZ_network_connections.py && sleep 0' 24134 1727096428.61774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096428.61777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096428.61779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096428.61781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096428.61783: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096428.61831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096428.61834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096428.61912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096428.89774: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3nhu327r/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3nhu327r/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/7ebb5ff9-a77a-410c-88b2-c781d382a6fc: error=unknown <<< 24134 1727096428.89938: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": 
false, "__debug_flags": ""}}} <<< 24134 1727096428.91839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096428.91863: stderr chunk (state=3): >>><<< 24134 1727096428.91866: stdout chunk (state=3): >>><<< 24134 1727096428.91885: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3nhu327r/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3nhu327r/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/7ebb5ff9-a77a-410c-88b2-c781d382a6fc: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096428.91918: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096428.91926: _low_level_execute_command(): starting 24134 1727096428.91931: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096428.5149002-25643-230672418588089/ > /dev/null 2>&1 && sleep 0' 24134 1727096428.92495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096428.92539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096428.92596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096428.92600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096428.92666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096428.94570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096428.94590: stderr chunk (state=3): >>><<< 24134 1727096428.94593: stdout chunk (state=3): >>><<< 24134 1727096428.94605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096428.94610: handler run complete 24134 1727096428.94627: attempt loop complete, returning result 24134 1727096428.94630: _execute() done 24134 1727096428.94632: dumping result to json 24134 1727096428.94636: done dumping result, returning 24134 1727096428.94644: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-1673-d3fc-000000000094] 24134 1727096428.94647: sending task result for task 0afff68d-5257-1673-d3fc-000000000094 24134 1727096428.94737: done sending task result for task 0afff68d-5257-1673-d3fc-000000000094 24134 1727096428.94740: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, 
"provider": "nm" } }, "changed": true } STDERR: 24134 1727096428.94827: no more pending results, returning what we have 24134 1727096428.94830: results queue empty 24134 1727096428.94831: checking for any_errors_fatal 24134 1727096428.94836: done checking for any_errors_fatal 24134 1727096428.94837: checking for max_fail_percentage 24134 1727096428.94838: done checking for max_fail_percentage 24134 1727096428.94839: checking to see if all hosts have failed and the running result is not ok 24134 1727096428.94840: done checking to see if all hosts have failed 24134 1727096428.94841: getting the remaining hosts for this loop 24134 1727096428.94842: done getting the remaining hosts for this loop 24134 1727096428.94845: getting the next task for host managed_node1 24134 1727096428.94850: done getting next task for host managed_node1 24134 1727096428.94853: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24134 1727096428.94855: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096428.94864: getting variables 24134 1727096428.94866: in VariableManager get_vars() 24134 1727096428.94902: Calling all_inventory to load vars for managed_node1 24134 1727096428.94905: Calling groups_inventory to load vars for managed_node1 24134 1727096428.94907: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096428.94915: Calling all_plugins_play to load vars for managed_node1 24134 1727096428.94919: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096428.94922: Calling groups_plugins_play to load vars for managed_node1 24134 1727096428.96257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096428.97142: done with get_vars() 24134 1727096428.97161: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:00:28 -0400 (0:00:00.678) 0:00:33.185 ****** 24134 1727096428.97221: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24134 1727096428.97457: worker is 1 (out of 1 available) 24134 1727096428.97474: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24134 1727096428.97484: done queuing things up, now waiting for results queue to drain 24134 1727096428.97485: waiting for pending results... 
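The module_args in the "Configure networking connection profiles" result above map onto a role invocation roughly like the following. This is a minimal sketch: the play and host names are assumptions, but `network_connections` is the role variable that feeds the logged `connections` argument, and the entry matches the changed result exactly.

```yaml
# Sketch of the play that would produce the module_args logged above.
# Hypothetical play layout; the connection list matches the logged result.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: ethtest0
            persistent_state: absent   # remove the profile, as in the "changed": true result
```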
24134 1727096428.97655: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 24134 1727096428.97729: in run() - task 0afff68d-5257-1673-d3fc-000000000095 24134 1727096428.97742: variable 'ansible_search_path' from source: unknown 24134 1727096428.97746: variable 'ansible_search_path' from source: unknown 24134 1727096428.97777: calling self._execute() 24134 1727096428.97847: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096428.97851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096428.97860: variable 'omit' from source: magic vars 24134 1727096428.98128: variable 'ansible_distribution_major_version' from source: facts 24134 1727096428.98138: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096428.98225: variable 'network_state' from source: role '' defaults 24134 1727096428.98234: Evaluated conditional (network_state != {}): False 24134 1727096428.98238: when evaluation is False, skipping this task 24134 1727096428.98241: _execute() done 24134 1727096428.98243: dumping result to json 24134 1727096428.98246: done dumping result, returning 24134 1727096428.98260: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-1673-d3fc-000000000095] 24134 1727096428.98263: sending task result for task 0afff68d-5257-1673-d3fc-000000000095 24134 1727096428.98342: done sending task result for task 0afff68d-5257-1673-d3fc-000000000095 24134 1727096428.98345: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24134 1727096428.98415: no more pending results, returning what we have 24134 1727096428.98418: results queue empty 24134 1727096428.98419: checking for any_errors_fatal 24134 1727096428.98429: done checking for any_errors_fatal 
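The skip recorded above comes from a guarded task: `network_state` is taken from the role defaults as `{}`, so the `when` clause evaluates False and the executor short-circuits. A minimal sketch of such a guard follows; only the condition is taken from the log, and the module argument shape is an assumption.

```yaml
# Sketch of a conditionally skipped task, matching the logged
# "Evaluated conditional (network_state != {}): False".
- name: Configure networking state
  fedora.linux_system_roles.network_state:   # argument shape is an assumption
    state: "{{ network_state }}"
  when: network_state != {}                  # {} default -> task is skipped
```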
24134 1727096428.98429: checking for max_fail_percentage 24134 1727096428.98431: done checking for max_fail_percentage 24134 1727096428.98432: checking to see if all hosts have failed and the running result is not ok 24134 1727096428.98432: done checking to see if all hosts have failed 24134 1727096428.98433: getting the remaining hosts for this loop 24134 1727096428.98435: done getting the remaining hosts for this loop 24134 1727096428.98438: getting the next task for host managed_node1 24134 1727096428.98442: done getting next task for host managed_node1 24134 1727096428.98446: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24134 1727096428.98448: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096428.98460: getting variables 24134 1727096428.98461: in VariableManager get_vars() 24134 1727096428.98495: Calling all_inventory to load vars for managed_node1 24134 1727096428.98497: Calling groups_inventory to load vars for managed_node1 24134 1727096428.98500: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096428.98508: Calling all_plugins_play to load vars for managed_node1 24134 1727096428.98510: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096428.98513: Calling groups_plugins_play to load vars for managed_node1 24134 1727096428.99388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.00250: done with get_vars() 24134 1727096429.00264: done getting variables 24134 1727096429.00310: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:00:29 -0400 (0:00:00.031) 0:00:33.216 ****** 24134 1727096429.00331: entering _queue_task() for managed_node1/debug 24134 1727096429.00547: worker is 1 (out of 1 available) 24134 1727096429.00562: exiting _queue_task() for managed_node1/debug 24134 1727096429.00578: done queuing things up, now waiting for results queue to drain 24134 1727096429.00580: waiting for pending results... 
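The "Show stderr messages" task queued here, and the "Show debug messages" task that follows it in this log, are plain `debug` prints of the registered `__network_connections_result` fact. A sketch of the pattern at tasks/main.yml:177 and :181, inferred from the variables the log shows them printing (the exact task bodies are assumptions):

```yaml
# Sketch of the two debug tasks, inferred from their logged output.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
```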
24134 1727096429.00752: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24134 1727096429.00822: in run() - task 0afff68d-5257-1673-d3fc-000000000096 24134 1727096429.00834: variable 'ansible_search_path' from source: unknown 24134 1727096429.00837: variable 'ansible_search_path' from source: unknown 24134 1727096429.00864: calling self._execute() 24134 1727096429.00936: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.00943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.00951: variable 'omit' from source: magic vars 24134 1727096429.01213: variable 'ansible_distribution_major_version' from source: facts 24134 1727096429.01223: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096429.01228: variable 'omit' from source: magic vars 24134 1727096429.01258: variable 'omit' from source: magic vars 24134 1727096429.01285: variable 'omit' from source: magic vars 24134 1727096429.01317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096429.01345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096429.01361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096429.01376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096429.01385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096429.01407: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096429.01409: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.01412: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 24134 1727096429.01484: Set connection var ansible_shell_executable to /bin/sh 24134 1727096429.01488: Set connection var ansible_pipelining to False 24134 1727096429.01493: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096429.01501: Set connection var ansible_timeout to 10 24134 1727096429.01503: Set connection var ansible_connection to ssh 24134 1727096429.01506: Set connection var ansible_shell_type to sh 24134 1727096429.01522: variable 'ansible_shell_executable' from source: unknown 24134 1727096429.01524: variable 'ansible_connection' from source: unknown 24134 1727096429.01527: variable 'ansible_module_compression' from source: unknown 24134 1727096429.01529: variable 'ansible_shell_type' from source: unknown 24134 1727096429.01531: variable 'ansible_shell_executable' from source: unknown 24134 1727096429.01534: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.01536: variable 'ansible_pipelining' from source: unknown 24134 1727096429.01540: variable 'ansible_timeout' from source: unknown 24134 1727096429.01544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.01643: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096429.01652: variable 'omit' from source: magic vars 24134 1727096429.01657: starting attempt loop 24134 1727096429.01660: running the handler 24134 1727096429.01757: variable '__network_connections_result' from source: set_fact 24134 1727096429.01801: handler run complete 24134 1727096429.01813: attempt loop complete, returning result 24134 1727096429.01816: _execute() done 24134 1727096429.01819: dumping result to json 24134 1727096429.01822: 
done dumping result, returning 24134 1727096429.01830: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-1673-d3fc-000000000096] 24134 1727096429.01834: sending task result for task 0afff68d-5257-1673-d3fc-000000000096 24134 1727096429.01913: done sending task result for task 0afff68d-5257-1673-d3fc-000000000096 24134 1727096429.01916: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 24134 1727096429.01975: no more pending results, returning what we have 24134 1727096429.01978: results queue empty 24134 1727096429.01979: checking for any_errors_fatal 24134 1727096429.01986: done checking for any_errors_fatal 24134 1727096429.01987: checking for max_fail_percentage 24134 1727096429.01988: done checking for max_fail_percentage 24134 1727096429.01989: checking to see if all hosts have failed and the running result is not ok 24134 1727096429.01990: done checking to see if all hosts have failed 24134 1727096429.01991: getting the remaining hosts for this loop 24134 1727096429.01992: done getting the remaining hosts for this loop 24134 1727096429.01996: getting the next task for host managed_node1 24134 1727096429.02001: done getting next task for host managed_node1 24134 1727096429.02004: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24134 1727096429.02006: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096429.02015: getting variables 24134 1727096429.02016: in VariableManager get_vars() 24134 1727096429.02046: Calling all_inventory to load vars for managed_node1 24134 1727096429.02048: Calling groups_inventory to load vars for managed_node1 24134 1727096429.02050: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096429.02058: Calling all_plugins_play to load vars for managed_node1 24134 1727096429.02061: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096429.02063: Calling groups_plugins_play to load vars for managed_node1 24134 1727096429.02831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.03791: done with get_vars() 24134 1727096429.03806: done getting variables 24134 1727096429.03846: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:00:29 -0400 (0:00:00.035) 0:00:33.252 ****** 24134 1727096429.03867: entering _queue_task() for managed_node1/debug 24134 1727096429.04087: worker is 1 (out of 1 available) 24134 1727096429.04100: exiting _queue_task() for managed_node1/debug 24134 1727096429.04112: done queuing things up, now waiting for results queue to drain 24134 1727096429.04113: waiting for pending results... 
24134 1727096429.04277: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24134 1727096429.04336: in run() - task 0afff68d-5257-1673-d3fc-000000000097 24134 1727096429.04347: variable 'ansible_search_path' from source: unknown 24134 1727096429.04353: variable 'ansible_search_path' from source: unknown 24134 1727096429.04383: calling self._execute() 24134 1727096429.04448: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.04453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.04462: variable 'omit' from source: magic vars 24134 1727096429.04722: variable 'ansible_distribution_major_version' from source: facts 24134 1727096429.04731: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096429.04737: variable 'omit' from source: magic vars 24134 1727096429.04764: variable 'omit' from source: magic vars 24134 1727096429.04794: variable 'omit' from source: magic vars 24134 1727096429.04824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096429.04849: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096429.04864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096429.04880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096429.04892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096429.04915: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096429.04918: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.04921: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 24134 1727096429.04988: Set connection var ansible_shell_executable to /bin/sh 24134 1727096429.04992: Set connection var ansible_pipelining to False 24134 1727096429.05000: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096429.05010: Set connection var ansible_timeout to 10 24134 1727096429.05014: Set connection var ansible_connection to ssh 24134 1727096429.05016: Set connection var ansible_shell_type to sh 24134 1727096429.05031: variable 'ansible_shell_executable' from source: unknown 24134 1727096429.05034: variable 'ansible_connection' from source: unknown 24134 1727096429.05037: variable 'ansible_module_compression' from source: unknown 24134 1727096429.05039: variable 'ansible_shell_type' from source: unknown 24134 1727096429.05041: variable 'ansible_shell_executable' from source: unknown 24134 1727096429.05043: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.05046: variable 'ansible_pipelining' from source: unknown 24134 1727096429.05048: variable 'ansible_timeout' from source: unknown 24134 1727096429.05053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.05154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096429.05163: variable 'omit' from source: magic vars 24134 1727096429.05172: starting attempt loop 24134 1727096429.05176: running the handler 24134 1727096429.05210: variable '__network_connections_result' from source: set_fact 24134 1727096429.05265: variable '__network_connections_result' from source: set_fact 24134 1727096429.05340: handler run complete 24134 1727096429.05358: attempt loop complete, returning result 24134 1727096429.05361: 
_execute() done 24134 1727096429.05364: dumping result to json 24134 1727096429.05366: done dumping result, returning 24134 1727096429.05377: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-1673-d3fc-000000000097] 24134 1727096429.05381: sending task result for task 0afff68d-5257-1673-d3fc-000000000097 24134 1727096429.05461: done sending task result for task 0afff68d-5257-1673-d3fc-000000000097 24134 1727096429.05464: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 24134 1727096429.05540: no more pending results, returning what we have 24134 1727096429.05542: results queue empty 24134 1727096429.05543: checking for any_errors_fatal 24134 1727096429.05548: done checking for any_errors_fatal 24134 1727096429.05548: checking for max_fail_percentage 24134 1727096429.05550: done checking for max_fail_percentage 24134 1727096429.05550: checking to see if all hosts have failed and the running result is not ok 24134 1727096429.05551: done checking to see if all hosts have failed 24134 1727096429.05552: getting the remaining hosts for this loop 24134 1727096429.05553: done getting the remaining hosts for this loop 24134 1727096429.05556: getting the next task for host managed_node1 24134 1727096429.05561: done getting next task for host managed_node1 24134 1727096429.05563: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24134 1727096429.05565: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096429.05583: getting variables 24134 1727096429.05585: in VariableManager get_vars() 24134 1727096429.05613: Calling all_inventory to load vars for managed_node1 24134 1727096429.05615: Calling groups_inventory to load vars for managed_node1 24134 1727096429.05617: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096429.05625: Calling all_plugins_play to load vars for managed_node1 24134 1727096429.05627: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096429.05630: Calling groups_plugins_play to load vars for managed_node1 24134 1727096429.10008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.11209: done with get_vars() 24134 1727096429.11225: done getting variables 24134 1727096429.11261: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:00:29 -0400 (0:00:00.074) 0:00:33.326 ****** 24134 1727096429.11290: entering _queue_task() for managed_node1/debug 24134 1727096429.11557: worker is 1 (out of 1 available) 24134 1727096429.11575: exiting _queue_task() for managed_node1/debug 24134 1727096429.11590: done queuing things up, now waiting for results queue to drain 24134 1727096429.11591: waiting for pending results... 
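After the debug tasks (and one more `network_state` guard, skipped below for the same reason), the role closes its run with a connectivity re-test dispatched through the `ping` action at tasks/main.yml:192. A sketch of that closing task, assuming the stock module:

```yaml
# Sketch of the final "Re-test connectivity" task (main.yml:192).
# The log shows it dispatched via the 'ping' action plugin;
# ansible.builtin.ping here is an assumption.
- name: Re-test connectivity
  ansible.builtin.ping:
```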
24134 1727096429.11776: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24134 1727096429.11853: in run() - task 0afff68d-5257-1673-d3fc-000000000098 24134 1727096429.11864: variable 'ansible_search_path' from source: unknown 24134 1727096429.11871: variable 'ansible_search_path' from source: unknown 24134 1727096429.11905: calling self._execute() 24134 1727096429.11992: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.11999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.12007: variable 'omit' from source: magic vars 24134 1727096429.12299: variable 'ansible_distribution_major_version' from source: facts 24134 1727096429.12309: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096429.12396: variable 'network_state' from source: role '' defaults 24134 1727096429.12405: Evaluated conditional (network_state != {}): False 24134 1727096429.12409: when evaluation is False, skipping this task 24134 1727096429.12413: _execute() done 24134 1727096429.12416: dumping result to json 24134 1727096429.12418: done dumping result, returning 24134 1727096429.12425: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-1673-d3fc-000000000098] 24134 1727096429.12431: sending task result for task 0afff68d-5257-1673-d3fc-000000000098 24134 1727096429.12517: done sending task result for task 0afff68d-5257-1673-d3fc-000000000098 24134 1727096429.12519: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 24134 1727096429.12566: no more pending results, returning what we have 24134 1727096429.12571: results queue empty 24134 1727096429.12573: checking for any_errors_fatal 24134 1727096429.12585: done checking for any_errors_fatal 24134 1727096429.12586: checking for 
max_fail_percentage 24134 1727096429.12587: done checking for max_fail_percentage 24134 1727096429.12588: checking to see if all hosts have failed and the running result is not ok 24134 1727096429.12589: done checking to see if all hosts have failed 24134 1727096429.12590: getting the remaining hosts for this loop 24134 1727096429.12591: done getting the remaining hosts for this loop 24134 1727096429.12594: getting the next task for host managed_node1 24134 1727096429.12600: done getting next task for host managed_node1 24134 1727096429.12603: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24134 1727096429.12606: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096429.12619: getting variables 24134 1727096429.12620: in VariableManager get_vars() 24134 1727096429.12656: Calling all_inventory to load vars for managed_node1 24134 1727096429.12659: Calling groups_inventory to load vars for managed_node1 24134 1727096429.12661: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096429.12673: Calling all_plugins_play to load vars for managed_node1 24134 1727096429.12676: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096429.12678: Calling groups_plugins_play to load vars for managed_node1 24134 1727096429.14076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.16033: done with get_vars() 24134 1727096429.16055: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:00:29 -0400 
(0:00:00.048) 0:00:33.375 ****** 24134 1727096429.16162: entering _queue_task() for managed_node1/ping 24134 1727096429.16489: worker is 1 (out of 1 available) 24134 1727096429.16500: exiting _queue_task() for managed_node1/ping 24134 1727096429.16512: done queuing things up, now waiting for results queue to drain 24134 1727096429.16513: waiting for pending results... 24134 1727096429.16897: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 24134 1727096429.16975: in run() - task 0afff68d-5257-1673-d3fc-000000000099 24134 1727096429.16979: variable 'ansible_search_path' from source: unknown 24134 1727096429.16982: variable 'ansible_search_path' from source: unknown 24134 1727096429.16985: calling self._execute() 24134 1727096429.17080: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.17092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.17109: variable 'omit' from source: magic vars 24134 1727096429.17495: variable 'ansible_distribution_major_version' from source: facts 24134 1727096429.17511: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096429.17522: variable 'omit' from source: magic vars 24134 1727096429.17567: variable 'omit' from source: magic vars 24134 1727096429.17611: variable 'omit' from source: magic vars 24134 1727096429.17750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096429.17754: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096429.17756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096429.17759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096429.17761: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096429.17830: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096429.17842: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.17852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.17957: Set connection var ansible_shell_executable to /bin/sh 24134 1727096429.17966: Set connection var ansible_pipelining to False 24134 1727096429.17985: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096429.18000: Set connection var ansible_timeout to 10 24134 1727096429.18007: Set connection var ansible_connection to ssh 24134 1727096429.18092: Set connection var ansible_shell_type to sh 24134 1727096429.18096: variable 'ansible_shell_executable' from source: unknown 24134 1727096429.18097: variable 'ansible_connection' from source: unknown 24134 1727096429.18100: variable 'ansible_module_compression' from source: unknown 24134 1727096429.18102: variable 'ansible_shell_type' from source: unknown 24134 1727096429.18103: variable 'ansible_shell_executable' from source: unknown 24134 1727096429.18105: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.18106: variable 'ansible_pipelining' from source: unknown 24134 1727096429.18108: variable 'ansible_timeout' from source: unknown 24134 1727096429.18110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.18276: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096429.18294: variable 'omit' from source: magic vars 24134 1727096429.18305: starting attempt loop 24134 1727096429.18375: running 
the handler 24134 1727096429.18378: _low_level_execute_command(): starting 24134 1727096429.18380: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096429.19047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096429.19089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096429.19103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.19195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096429.19239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096429.19309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.21053: stdout chunk (state=3): >>>/root <<< 24134 1727096429.21470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096429.21473: stdout chunk (state=3): >>><<< 24134 1727096429.21476: stderr chunk (state=3): >>><<< 24134 1727096429.21479: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096429.21482: _low_level_execute_command(): starting 24134 1727096429.21485: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873 `" && echo ansible-tmp-1727096429.2137988-25681-151230025466873="` echo /root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873 `" ) && sleep 0' 24134 1727096429.22391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096429.22454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.22476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096429.22684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096429.22777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.24832: stdout chunk (state=3): >>>ansible-tmp-1727096429.2137988-25681-151230025466873=/root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873 <<< 24134 1727096429.24931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096429.25056: stderr chunk (state=3): >>><<< 24134 1727096429.25060: stdout chunk (state=3): >>><<< 24134 1727096429.25063: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096429.2137988-25681-151230025466873=/root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096429.25275: variable 'ansible_module_compression' from source: unknown 24134 1727096429.25278: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 24134 1727096429.25281: variable 'ansible_facts' from source: unknown 24134 1727096429.25466: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/AnsiballZ_ping.py 24134 1727096429.25949: Sending initial data 24134 1727096429.25952: Sent initial data (153 bytes) 24134 1727096429.27095: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096429.27233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.27294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096429.27355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096429.27497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.29207: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096429.29265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096429.29353: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpzmdl6z57 /root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/AnsiballZ_ping.py <<< 24134 1727096429.29357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/AnsiballZ_ping.py" <<< 24134 1727096429.29436: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpzmdl6z57" to remote "/root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/AnsiballZ_ping.py" <<< 24134 1727096429.30773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096429.30954: stderr chunk (state=3): >>><<< 24134 1727096429.30964: stdout chunk (state=3): >>><<< 24134 1727096429.30993: done transferring module to remote 24134 1727096429.31039: _low_level_execute_command(): starting 24134 1727096429.31050: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/ /root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/AnsiballZ_ping.py && sleep 0' 24134 1727096429.32189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096429.32205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096429.32242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096429.32350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096429.32502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096429.32565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.34508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096429.34535: stdout chunk (state=3): >>><<< 24134 1727096429.34729: stderr chunk (state=3): >>><<< 24134 1727096429.34736: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096429.34745: _low_level_execute_command(): starting 24134 1727096429.34748: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/AnsiballZ_ping.py && sleep 0' 24134 1727096429.35900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096429.35904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096429.35956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.36003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096429.36072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096429.36135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 24134 1727096429.36592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.51888: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24134 1727096429.53365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096429.53378: stdout chunk (state=3): >>><<< 24134 1727096429.53393: stderr chunk (state=3): >>><<< 24134 1727096429.53419: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
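The `{"ping": "pong"}` payload above is essentially the entire contract of the `ping` module: it echoes its `data` argument back (defaulting to `"pong"`) and deliberately raises when `data` is `"crash"`. A minimal sketch of that logic — not the actual module source, which also handles AnsiballZ argument parsing and result serialization:

```python
# Minimal sketch of the ping module's contract (illustrative, not the
# real ansible.modules.ping source): echo the `data` argument back,
# defaulting to "pong", and fail on the sentinel value "crash".
def ping(data="pong"):
    if data == "crash":
        raise Exception("boom")  # the real module fails the task here
    return {"ping": data, "changed": False}

print(ping())  # {'ping': 'pong', 'changed': False}
```

This is why the task result below reports `"changed": false` — the module never mutates remote state; it only proves the control node can push and execute Python on the target.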
24134 1727096429.53456: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096429.53475: _low_level_execute_command(): starting 24134 1727096429.53486: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096429.2137988-25681-151230025466873/ > /dev/null 2>&1 && sleep 0' 24134 1727096429.54073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096429.54116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096429.54133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096429.54160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096429.54235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.54314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096429.54343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096429.54366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096429.54474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.56545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096429.56598: stdout chunk (state=3): >>><<< 24134 1727096429.56601: stderr chunk (state=3): >>><<< 24134 1727096429.56796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096429.56804: handler run complete 24134 1727096429.56806: attempt loop complete, returning result 24134 1727096429.56808: _execute() done 24134 1727096429.56811: dumping result to json 24134 1727096429.56813: done dumping result, returning 24134 1727096429.56815: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-1673-d3fc-000000000099] 24134 1727096429.56817: sending task result for task 0afff68d-5257-1673-d3fc-000000000099 24134 1727096429.56890: done sending task result for task 0afff68d-5257-1673-d3fc-000000000099 24134 1727096429.57073: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 24134 1727096429.57139: no more pending results, returning what we have 24134 1727096429.57143: results queue empty 24134 1727096429.57144: checking for any_errors_fatal 24134 1727096429.57150: done checking for any_errors_fatal 24134 1727096429.57151: checking for max_fail_percentage 24134 1727096429.57152: done checking for max_fail_percentage 24134 1727096429.57153: checking to see if all hosts have failed and the running result is not ok 24134 1727096429.57154: done checking to see if all hosts have failed 24134 1727096429.57155: getting the remaining hosts for this loop 24134 1727096429.57157: done getting the remaining hosts for this loop 24134 1727096429.57161: getting the next task for host managed_node1 24134 1727096429.57173: done getting next task for host managed_node1 24134 1727096429.57176: ^ task is: TASK: meta (role_complete) 24134 1727096429.57178: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096429.57189: getting variables 24134 1727096429.57191: in VariableManager get_vars() 24134 1727096429.57230: Calling all_inventory to load vars for managed_node1 24134 1727096429.57233: Calling groups_inventory to load vars for managed_node1 24134 1727096429.57235: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096429.57245: Calling all_plugins_play to load vars for managed_node1 24134 1727096429.57249: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096429.57252: Calling groups_plugins_play to load vars for managed_node1 24134 1727096429.59334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.60930: done with get_vars() 24134 1727096429.60954: done getting variables 24134 1727096429.61034: done queuing things up, now waiting for results queue to drain 24134 1727096429.61036: results queue empty 24134 1727096429.61037: checking for any_errors_fatal 24134 1727096429.61039: done checking for any_errors_fatal 24134 1727096429.61040: checking for max_fail_percentage 24134 1727096429.61041: done checking for max_fail_percentage 24134 1727096429.61042: checking to see if all hosts have failed and the running result is not ok 24134 1727096429.61042: done checking to see if all hosts have failed 24134 1727096429.61043: getting the remaining hosts for this loop 24134 1727096429.61045: done getting the remaining hosts for this loop 24134 1727096429.61047: getting the next task for host managed_node1 24134 1727096429.61051: done getting next task for host managed_node1 24134 1727096429.61052: ^ task is: TASK: meta (flush_handlers) 24134 1727096429.61054: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 24134 1727096429.61057: getting variables 24134 1727096429.61058: in VariableManager get_vars() 24134 1727096429.61071: Calling all_inventory to load vars for managed_node1 24134 1727096429.61073: Calling groups_inventory to load vars for managed_node1 24134 1727096429.61075: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096429.61080: Calling all_plugins_play to load vars for managed_node1 24134 1727096429.61082: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096429.61085: Calling groups_plugins_play to load vars for managed_node1 24134 1727096429.62339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.63924: done with get_vars() 24134 1727096429.63945: done getting variables 24134 1727096429.63999: in VariableManager get_vars() 24134 1727096429.64011: Calling all_inventory to load vars for managed_node1 24134 1727096429.64013: Calling groups_inventory to load vars for managed_node1 24134 1727096429.64015: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096429.64020: Calling all_plugins_play to load vars for managed_node1 24134 1727096429.64022: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096429.64025: Calling groups_plugins_play to load vars for managed_node1 24134 1727096429.65158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.67506: done with get_vars() 24134 1727096429.67536: done queuing things up, now waiting for results queue to drain 24134 1727096429.67538: results queue empty 24134 1727096429.67538: checking for any_errors_fatal 24134 1727096429.67540: done checking for any_errors_fatal 24134 1727096429.67540: checking for max_fail_percentage 24134 1727096429.67541: done checking for max_fail_percentage 24134 1727096429.67542: checking to see if all hosts have failed and 
the running result is not ok 24134 1727096429.67543: done checking to see if all hosts have failed 24134 1727096429.67543: getting the remaining hosts for this loop 24134 1727096429.67545: done getting the remaining hosts for this loop 24134 1727096429.67547: getting the next task for host managed_node1 24134 1727096429.67551: done getting next task for host managed_node1 24134 1727096429.67553: ^ task is: TASK: meta (flush_handlers) 24134 1727096429.67554: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096429.67557: getting variables 24134 1727096429.67558: in VariableManager get_vars() 24134 1727096429.67774: Calling all_inventory to load vars for managed_node1 24134 1727096429.67777: Calling groups_inventory to load vars for managed_node1 24134 1727096429.67779: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096429.67785: Calling all_plugins_play to load vars for managed_node1 24134 1727096429.67787: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096429.67789: Calling groups_plugins_play to load vars for managed_node1 24134 1727096429.70255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.73427: done with get_vars() 24134 1727096429.73454: done getting variables 24134 1727096429.73509: in VariableManager get_vars() 24134 1727096429.73522: Calling all_inventory to load vars for managed_node1 24134 1727096429.73525: Calling groups_inventory to load vars for managed_node1 24134 1727096429.73527: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096429.73532: Calling all_plugins_play to load vars for managed_node1 24134 1727096429.73535: Calling 
groups_plugins_inventory to load vars for managed_node1 24134 1727096429.73537: Calling groups_plugins_play to load vars for managed_node1 24134 1727096429.75272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.76972: done with get_vars() 24134 1727096429.77000: done queuing things up, now waiting for results queue to drain 24134 1727096429.77002: results queue empty 24134 1727096429.77003: checking for any_errors_fatal 24134 1727096429.77004: done checking for any_errors_fatal 24134 1727096429.77005: checking for max_fail_percentage 24134 1727096429.77006: done checking for max_fail_percentage 24134 1727096429.77007: checking to see if all hosts have failed and the running result is not ok 24134 1727096429.77008: done checking to see if all hosts have failed 24134 1727096429.77008: getting the remaining hosts for this loop 24134 1727096429.77009: done getting the remaining hosts for this loop 24134 1727096429.77012: getting the next task for host managed_node1 24134 1727096429.77015: done getting next task for host managed_node1 24134 1727096429.77016: ^ task is: None 24134 1727096429.77018: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096429.77019: done queuing things up, now waiting for results queue to drain 24134 1727096429.77020: results queue empty 24134 1727096429.77020: checking for any_errors_fatal 24134 1727096429.77021: done checking for any_errors_fatal 24134 1727096429.77022: checking for max_fail_percentage 24134 1727096429.77022: done checking for max_fail_percentage 24134 1727096429.77023: checking to see if all hosts have failed and the running result is not ok 24134 1727096429.77024: done checking to see if all hosts have failed 24134 1727096429.77025: getting the next task for host managed_node1 24134 1727096429.77027: done getting next task for host managed_node1 24134 1727096429.77028: ^ task is: None 24134 1727096429.77029: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096429.77078: in VariableManager get_vars() 24134 1727096429.77094: done with get_vars() 24134 1727096429.77099: in VariableManager get_vars() 24134 1727096429.77108: done with get_vars() 24134 1727096429.77112: variable 'omit' from source: magic vars 24134 1727096429.77148: in VariableManager get_vars() 24134 1727096429.77159: done with get_vars() 24134 1727096429.77181: variable 'omit' from source: magic vars PLAY [Delete the interface, then assert that device and profile are absent] **** 24134 1727096429.77412: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24134 1727096429.77433: getting the remaining hosts for this loop 24134 1727096429.77435: done getting the remaining hosts for this loop 24134 1727096429.77437: getting the next task for host managed_node1 24134 1727096429.77440: done getting next task for host managed_node1 24134 1727096429.77442: ^ task is: TASK: Gathering Facts 24134 1727096429.77443: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096429.77445: getting variables 24134 1727096429.77446: in VariableManager get_vars() 24134 1727096429.77461: Calling all_inventory to load vars for managed_node1 24134 1727096429.77464: Calling groups_inventory to load vars for managed_node1 24134 1727096429.77466: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096429.77474: Calling all_plugins_play to load vars for managed_node1 24134 1727096429.77477: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096429.77481: Calling groups_plugins_play to load vars for managed_node1 24134 1727096429.78670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096429.80235: done with get_vars() 24134 1727096429.80258: done getting variables 24134 1727096429.80305: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80 Monday 23 September 2024 09:00:29 -0400 (0:00:00.641) 0:00:34.016 ****** 24134 1727096429.80331: entering _queue_task() for managed_node1/gather_facts 24134 1727096429.80665: worker is 1 (out of 1 available) 24134 1727096429.80680: exiting _queue_task() for managed_node1/gather_facts 24134 1727096429.80694: done queuing things up, now waiting for results queue to drain 24134 1727096429.80695: waiting for pending results... 
24134 1727096429.81082: running TaskExecutor() for managed_node1/TASK: Gathering Facts 24134 1727096429.81103: in run() - task 0afff68d-5257-1673-d3fc-0000000005ee 24134 1727096429.81192: variable 'ansible_search_path' from source: unknown 24134 1727096429.81205: calling self._execute() 24134 1727096429.81614: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.81742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.81747: variable 'omit' from source: magic vars 24134 1727096429.82110: variable 'ansible_distribution_major_version' from source: facts 24134 1727096429.82142: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096429.82155: variable 'omit' from source: magic vars 24134 1727096429.82196: variable 'omit' from source: magic vars 24134 1727096429.82274: variable 'omit' from source: magic vars 24134 1727096429.82292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096429.82334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096429.82361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096429.82397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096429.82475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096429.82479: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096429.82482: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.82486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.82625: Set connection var ansible_shell_executable to /bin/sh 24134 1727096429.82638: Set 
connection var ansible_pipelining to False 24134 1727096429.82650: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096429.82663: Set connection var ansible_timeout to 10 24134 1727096429.82709: Set connection var ansible_connection to ssh 24134 1727096429.82717: Set connection var ansible_shell_type to sh 24134 1727096429.82719: variable 'ansible_shell_executable' from source: unknown 24134 1727096429.82721: variable 'ansible_connection' from source: unknown 24134 1727096429.82723: variable 'ansible_module_compression' from source: unknown 24134 1727096429.82731: variable 'ansible_shell_type' from source: unknown 24134 1727096429.82738: variable 'ansible_shell_executable' from source: unknown 24134 1727096429.82744: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096429.82752: variable 'ansible_pipelining' from source: unknown 24134 1727096429.82758: variable 'ansible_timeout' from source: unknown 24134 1727096429.82818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096429.83025: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096429.83151: variable 'omit' from source: magic vars 24134 1727096429.83155: starting attempt loop 24134 1727096429.83157: running the handler 24134 1727096429.83176: variable 'ansible_facts' from source: unknown 24134 1727096429.83205: _low_level_execute_command(): starting 24134 1727096429.83218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096429.83980: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096429.83995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 
1727096429.84062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.84080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.84131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096429.84158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096429.84203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096429.84272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.86080: stdout chunk (state=3): >>>/root <<< 24134 1727096429.86170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096429.86190: stderr chunk (state=3): >>><<< 24134 1727096429.86194: stdout chunk (state=3): >>><<< 24134 1727096429.86217: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096429.86227: _low_level_execute_command(): starting 24134 1727096429.86234: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396 `" && echo ansible-tmp-1727096429.862147-25712-55043775469396="` echo /root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396 `" ) && sleep 0' 24134 1727096429.86681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096429.86685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.86694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is 
address debug1: re-parsing configuration <<< 24134 1727096429.86697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096429.86699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.86745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096429.86752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096429.86754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096429.86824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.88857: stdout chunk (state=3): >>>ansible-tmp-1727096429.862147-25712-55043775469396=/root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396 <<< 24134 1727096429.88961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096429.88991: stderr chunk (state=3): >>><<< 24134 1727096429.88994: stdout chunk (state=3): >>><<< 24134 1727096429.89012: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096429.862147-25712-55043775469396=/root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096429.89041: variable 'ansible_module_compression' from source: unknown 24134 1727096429.89085: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24134 1727096429.89137: variable 'ansible_facts' from source: unknown 24134 1727096429.89266: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/AnsiballZ_setup.py 24134 1727096429.89378: Sending initial data 24134 1727096429.89382: Sent initial data (152 bytes) 24134 1727096429.89819: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.89865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096429.89878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096429.89947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.91620: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096429.91685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096429.91747: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpiqp03zc6 /root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/AnsiballZ_setup.py <<< 24134 1727096429.91756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/AnsiballZ_setup.py" <<< 24134 1727096429.91810: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpiqp03zc6" to remote "/root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/AnsiballZ_setup.py" <<< 24134 1727096429.92953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096429.93009: stderr chunk (state=3): >>><<< 24134 1727096429.93012: stdout chunk (state=3): >>><<< 24134 1727096429.93043: done transferring module to remote 24134 1727096429.93047: _low_level_execute_command(): starting 24134 1727096429.93050: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/ /root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/AnsiballZ_setup.py && sleep 0' 24134 1727096429.93737: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.93808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096429.93814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096429.93891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096429.95761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096429.95785: stderr chunk (state=3): >>><<< 24134 1727096429.95789: stdout chunk (state=3): >>><<< 24134 1727096429.95801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096429.95804: _low_level_execute_command(): starting 24134 1727096429.95809: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/AnsiballZ_setup.py && sleep 0' 24134 1727096429.96222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096429.96225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.96228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096429.96230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096429.96232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096429.96233: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096429.96314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096429.96391: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096430.63059: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "r<<< 24134 1727096430.63072: stdout chunk (state=3): >>>oot", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.67724609375, "5m": 0.4833984375, "15m": 0.2470703125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "30", "epoch": "1727096430", "epoch_int": "1727096430", "date": "2024-09-23", "time": "09:00:30", "iso8601_micro": "2024-09-23T13:00:30.247757Z", "iso8601": "2024-09-23T13:00:30Z", "iso8601_basic": "20240923T090030247757", "iso8601_basic_short": "20240923T090030", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2948, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 583, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795180544, "block_size": 4096, "block_total": 65519099, "block_available": 63914839, "block_used": 1604260, "inode_total": 131070960, 
"inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "ethtest0", "eth0", "peerethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "pr<<< 24134 1727096430.63110: stdout chunk (state=3): >>>omisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off 
[fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "42:d5:59:35:93:c8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40d5:59ff:fe35:93c8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": 
"off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "72:05:50:f0:ff:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7005:50ff:fef0:ffb8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::40d5:59ff:fe35:93c8", "fe80::10ff:acff:fe3f:90f5", "fe80::7005:50ff:fef0:ffb8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::40d5:59ff:fe35:93c8", "fe80::7005:50ff:fef0:ffb8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24134 1727096430.65181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096430.65237: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096430.65253: stderr chunk (state=3): >>><<< 24134 1727096430.65262: stdout chunk (state=3): >>><<< 24134 1727096430.65321: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", 
"ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", 
"XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.67724609375, "5m": 0.4833984375, "15m": 0.2470703125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "30", "epoch": "1727096430", "epoch_int": "1727096430", "date": "2024-09-23", "time": "09:00:30", "iso8601_micro": "2024-09-23T13:00:30.247757Z", "iso8601": "2024-09-23T13:00:30Z", "iso8601_basic": "20240923T090030247757", "iso8601_basic_short": "20240923T090030", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2948, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", 
"ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 583, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795180544, "block_size": 4096, "block_total": 65519099, "block_available": 63914839, "block_used": 1604260, 
"inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "ethtest0", "eth0", "peerethtest0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "42:d5:59:35:93:c8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40d5:59ff:fe35:93c8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off 
[fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": 
"on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "72:05:50:f0:ff:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7005:50ff:fef0:ffb8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::40d5:59ff:fe35:93c8", "fe80::10ff:acff:fe3f:90f5", "fe80::7005:50ff:fef0:ffb8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::40d5:59ff:fe35:93c8", "fe80::7005:50ff:fef0:ffb8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096430.65875: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096430.65879: _low_level_execute_command(): starting 24134 1727096430.65881: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096429.862147-25712-55043775469396/ > /dev/null 2>&1 && sleep 0' 24134 1727096430.66496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096430.66508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096430.66556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096430.66629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096430.66651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096430.66680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096430.66788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096430.68776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096430.68779: stdout chunk (state=3): >>><<< 24134 1727096430.68781: stderr chunk (state=3): >>><<< 24134 1727096430.68796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096430.68809: handler run complete 24134 1727096430.68984: variable 'ansible_facts' from source: unknown 24134 1727096430.69176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096430.69505: variable 'ansible_facts' from source: unknown 24134 1727096430.69614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096430.69802: attempt loop complete, returning result 24134 1727096430.69814: _execute() done 24134 1727096430.69827: dumping result to json 24134 1727096430.69876: done dumping result, returning 24134 1727096430.69889: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-1673-d3fc-0000000005ee] 24134 1727096430.69897: sending task result for task 0afff68d-5257-1673-d3fc-0000000005ee 24134 1727096430.70535: done sending task result for task 0afff68d-5257-1673-d3fc-0000000005ee 24134 1727096430.70538: WORKER PROCESS EXITING ok: [managed_node1] 24134 1727096430.71076: no more pending results, returning what we have 24134 1727096430.71084: results queue empty 24134 1727096430.71085: checking for any_errors_fatal 24134 1727096430.71086: done checking for any_errors_fatal 24134 1727096430.71087: checking for max_fail_percentage 24134 1727096430.71088: done checking for max_fail_percentage 24134 1727096430.71089: 
checking to see if all hosts have failed and the running result is not ok 24134 1727096430.71090: done checking to see if all hosts have failed 24134 1727096430.71091: getting the remaining hosts for this loop 24134 1727096430.71092: done getting the remaining hosts for this loop 24134 1727096430.71096: getting the next task for host managed_node1 24134 1727096430.71101: done getting next task for host managed_node1 24134 1727096430.71103: ^ task is: TASK: meta (flush_handlers) 24134 1727096430.71105: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096430.71109: getting variables 24134 1727096430.71111: in VariableManager get_vars() 24134 1727096430.71133: Calling all_inventory to load vars for managed_node1 24134 1727096430.71136: Calling groups_inventory to load vars for managed_node1 24134 1727096430.71139: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096430.71149: Calling all_plugins_play to load vars for managed_node1 24134 1727096430.71152: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096430.71155: Calling groups_plugins_play to load vars for managed_node1 24134 1727096430.72621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096430.74383: done with get_vars() 24134 1727096430.74407: done getting variables 24134 1727096430.74483: in VariableManager get_vars() 24134 1727096430.74492: Calling all_inventory to load vars for managed_node1 24134 1727096430.74495: Calling groups_inventory to load vars for managed_node1 24134 1727096430.74497: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096430.74502: Calling all_plugins_play to load vars for managed_node1 24134 
1727096430.74504: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096430.74506: Calling groups_plugins_play to load vars for managed_node1 24134 1727096430.75733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096430.77493: done with get_vars() 24134 1727096430.77519: done queuing things up, now waiting for results queue to drain 24134 1727096430.77521: results queue empty 24134 1727096430.77522: checking for any_errors_fatal 24134 1727096430.77526: done checking for any_errors_fatal 24134 1727096430.77527: checking for max_fail_percentage 24134 1727096430.77528: done checking for max_fail_percentage 24134 1727096430.77533: checking to see if all hosts have failed and the running result is not ok 24134 1727096430.77534: done checking to see if all hosts have failed 24134 1727096430.77534: getting the remaining hosts for this loop 24134 1727096430.77535: done getting the remaining hosts for this loop 24134 1727096430.77538: getting the next task for host managed_node1 24134 1727096430.77542: done getting next task for host managed_node1 24134 1727096430.77545: ^ task is: TASK: Include the task 'delete_interface.yml' 24134 1727096430.77551: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096430.77554: getting variables 24134 1727096430.77555: in VariableManager get_vars() 24134 1727096430.77565: Calling all_inventory to load vars for managed_node1 24134 1727096430.77569: Calling groups_inventory to load vars for managed_node1 24134 1727096430.77572: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096430.77577: Calling all_plugins_play to load vars for managed_node1 24134 1727096430.77580: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096430.77583: Calling groups_plugins_play to load vars for managed_node1 24134 1727096430.78737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096430.79883: done with get_vars() 24134 1727096430.79899: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:83 Monday 23 September 2024 09:00:30 -0400 (0:00:00.996) 0:00:35.012 ****** 24134 1727096430.79954: entering _queue_task() for managed_node1/include_tasks 24134 1727096430.80211: worker is 1 (out of 1 available) 24134 1727096430.80225: exiting _queue_task() for managed_node1/include_tasks 24134 1727096430.80237: done queuing things up, now waiting for results queue to drain 24134 1727096430.80238: waiting for pending results... 
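For orientation, the include task being queued here (action `include_tasks`, task path `tests_ipv6_disabled.yml:83`, included file `tasks/delete_interface.yml` per the records below) would correspond to a playbook entry roughly like the following. This is a hypothetical reconstruction from the logged task name and paths, not the actual test source:

```yaml
# Sketch of the task at tests_ipv6_disabled.yml:83, inferred from the log.
# The real file may use different module FQCNs or extra keys (tags, when, ...).
- name: Include the task 'delete_interface.yml'
  ansible.builtin.include_tasks: tasks/delete_interface.yml
```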
24134 1727096430.80420: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 24134 1727096430.80494: in run() - task 0afff68d-5257-1673-d3fc-00000000009c 24134 1727096430.80505: variable 'ansible_search_path' from source: unknown 24134 1727096430.80533: calling self._execute() 24134 1727096430.80604: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096430.80608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096430.80618: variable 'omit' from source: magic vars 24134 1727096430.80904: variable 'ansible_distribution_major_version' from source: facts 24134 1727096430.80914: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096430.80922: _execute() done 24134 1727096430.80926: dumping result to json 24134 1727096430.80929: done dumping result, returning 24134 1727096430.80935: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [0afff68d-5257-1673-d3fc-00000000009c] 24134 1727096430.80940: sending task result for task 0afff68d-5257-1673-d3fc-00000000009c 24134 1727096430.81029: done sending task result for task 0afff68d-5257-1673-d3fc-00000000009c 24134 1727096430.81032: WORKER PROCESS EXITING 24134 1727096430.81075: no more pending results, returning what we have 24134 1727096430.81079: in VariableManager get_vars() 24134 1727096430.81113: Calling all_inventory to load vars for managed_node1 24134 1727096430.81115: Calling groups_inventory to load vars for managed_node1 24134 1727096430.81118: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096430.81131: Calling all_plugins_play to load vars for managed_node1 24134 1727096430.81134: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096430.81137: Calling groups_plugins_play to load vars for managed_node1 24134 1727096430.82545: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096430.83415: done with get_vars() 24134 1727096430.83430: variable 'ansible_search_path' from source: unknown 24134 1727096430.83441: we have included files to process 24134 1727096430.83442: generating all_blocks data 24134 1727096430.83443: done generating all_blocks data 24134 1727096430.83445: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 24134 1727096430.83445: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 24134 1727096430.83447: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 24134 1727096430.83611: done processing included file 24134 1727096430.83613: iterating over new_blocks loaded from include file 24134 1727096430.83614: in VariableManager get_vars() 24134 1727096430.83622: done with get_vars() 24134 1727096430.83623: filtering new block on tags 24134 1727096430.83632: done filtering new block on tags 24134 1727096430.83634: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 24134 1727096430.83637: extending task lists for all hosts with included blocks 24134 1727096430.83689: done extending task lists 24134 1727096430.83690: done processing included files 24134 1727096430.83690: results queue empty 24134 1727096430.83691: checking for any_errors_fatal 24134 1727096430.83692: done checking for any_errors_fatal 24134 1727096430.83692: checking for max_fail_percentage 24134 1727096430.83693: done checking for max_fail_percentage 24134 1727096430.83694: checking to see if all hosts have failed and the running result 
is not ok 24134 1727096430.83694: done checking to see if all hosts have failed 24134 1727096430.83695: getting the remaining hosts for this loop 24134 1727096430.83696: done getting the remaining hosts for this loop 24134 1727096430.83697: getting the next task for host managed_node1 24134 1727096430.83699: done getting next task for host managed_node1 24134 1727096430.83701: ^ task is: TASK: Remove test interface if necessary 24134 1727096430.83702: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096430.83704: getting variables 24134 1727096430.83704: in VariableManager get_vars() 24134 1727096430.83710: Calling all_inventory to load vars for managed_node1 24134 1727096430.83711: Calling groups_inventory to load vars for managed_node1 24134 1727096430.83713: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096430.83717: Calling all_plugins_play to load vars for managed_node1 24134 1727096430.83718: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096430.83720: Calling groups_plugins_play to load vars for managed_node1 24134 1727096430.84390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096430.85870: done with get_vars() 24134 1727096430.85888: done getting variables 24134 1727096430.85921: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Monday 23 September 2024 09:00:30 -0400 (0:00:00.059) 0:00:35.072 ****** 24134 1727096430.85943: entering _queue_task() for managed_node1/command 24134 1727096430.86212: worker is 1 (out of 1 available) 24134 1727096430.86226: exiting _queue_task() for managed_node1/command 24134 1727096430.86240: done queuing things up, now waiting for results queue to drain 24134 1727096430.86242: waiting for pending results... 
24134 1727096430.86413: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 24134 1727096430.86481: in run() - task 0afff68d-5257-1673-d3fc-0000000005ff 24134 1727096430.86492: variable 'ansible_search_path' from source: unknown 24134 1727096430.86496: variable 'ansible_search_path' from source: unknown 24134 1727096430.86525: calling self._execute() 24134 1727096430.86603: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096430.86606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096430.86617: variable 'omit' from source: magic vars 24134 1727096430.86893: variable 'ansible_distribution_major_version' from source: facts 24134 1727096430.86908: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096430.86912: variable 'omit' from source: magic vars 24134 1727096430.86938: variable 'omit' from source: magic vars 24134 1727096430.87007: variable 'interface' from source: set_fact 24134 1727096430.87024: variable 'omit' from source: magic vars 24134 1727096430.87055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096430.87084: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096430.87100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096430.87113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096430.87124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096430.87153: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096430.87156: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096430.87159: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096430.87229: Set connection var ansible_shell_executable to /bin/sh 24134 1727096430.87236: Set connection var ansible_pipelining to False 24134 1727096430.87242: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096430.87250: Set connection var ansible_timeout to 10 24134 1727096430.87253: Set connection var ansible_connection to ssh 24134 1727096430.87256: Set connection var ansible_shell_type to sh 24134 1727096430.87282: variable 'ansible_shell_executable' from source: unknown 24134 1727096430.87286: variable 'ansible_connection' from source: unknown 24134 1727096430.87288: variable 'ansible_module_compression' from source: unknown 24134 1727096430.87290: variable 'ansible_shell_type' from source: unknown 24134 1727096430.87293: variable 'ansible_shell_executable' from source: unknown 24134 1727096430.87295: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096430.87297: variable 'ansible_pipelining' from source: unknown 24134 1727096430.87299: variable 'ansible_timeout' from source: unknown 24134 1727096430.87301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096430.87412: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096430.87452: variable 'omit' from source: magic vars 24134 1727096430.87456: starting attempt loop 24134 1727096430.87458: running the handler 24134 1727096430.87460: _low_level_execute_command(): starting 24134 1727096430.87480: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096430.88193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 
1727096430.88229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096430.88233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096430.88262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096430.88279: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096430.88315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096430.88325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096430.88365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096430.88475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096430.90234: stdout chunk (state=3): >>>/root <<< 24134 1727096430.90374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096430.90379: stdout chunk (state=3): >>><<< 24134 1727096430.90382: stderr chunk (state=3): >>><<< 24134 1727096430.90395: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096430.90405: _low_level_execute_command(): starting 24134 1727096430.90412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393 `" && echo ansible-tmp-1727096430.903931-25756-91397388193393="` echo /root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393 `" ) && sleep 0' 24134 1727096430.90881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096430.90905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096430.90987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096430.91035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096430.91107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096430.93085: stdout chunk (state=3): >>>ansible-tmp-1727096430.903931-25756-91397388193393=/root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393 <<< 24134 1727096430.93192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096430.93234: stderr chunk (state=3): >>><<< 24134 1727096430.93236: stdout chunk (state=3): >>><<< 24134 1727096430.93275: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096430.903931-25756-91397388193393=/root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096430.93278: variable 'ansible_module_compression' from source: unknown 24134 1727096430.93321: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096430.93349: variable 'ansible_facts' from source: unknown 24134 1727096430.93406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/AnsiballZ_command.py 24134 1727096430.93505: Sending initial data 24134 1727096430.93508: Sent initial data (154 bytes) 24134 1727096430.93942: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096430.93947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096430.93950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24134 1727096430.93952: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096430.93954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096430.94005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096430.94008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096430.94085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096430.95731: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096430.95798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096430.95864: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmp5224c93m /root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/AnsiballZ_command.py <<< 24134 1727096430.95870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/AnsiballZ_command.py" <<< 24134 1727096430.95932: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmp5224c93m" to remote "/root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/AnsiballZ_command.py" <<< 24134 1727096430.95935: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/AnsiballZ_command.py" <<< 24134 1727096430.96563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096430.96607: stderr chunk (state=3): >>><<< 24134 1727096430.96611: stdout chunk (state=3): >>><<< 24134 1727096430.96634: done transferring module to remote 24134 1727096430.96648: _low_level_execute_command(): starting 24134 1727096430.96651: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/ /root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/AnsiballZ_command.py && sleep 0' 24134 1727096430.97074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096430.97113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096430.97116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096430.97118: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096430.97121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096430.97123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096430.97125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096430.97172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096430.97176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096430.97180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096430.97243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096430.99099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096430.99123: stderr chunk (state=3): >>><<< 24134 1727096430.99126: stdout chunk (state=3): >>><<< 24134 1727096430.99148: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096430.99152: _low_level_execute_command(): starting 24134 1727096430.99154: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/AnsiballZ_command.py && sleep 0' 24134 1727096430.99566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096430.99575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096430.99603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096430.99606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24134 1727096430.99608: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096430.99610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096430.99665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096430.99677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096430.99679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096430.99746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096431.16953: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-23 09:00:31.155123", "end": "2024-09-23 09:00:31.166944", "delta": "0:00:00.011821", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096431.19239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096431.19263: stderr chunk (state=3): >>><<< 24134 1727096431.19266: stdout chunk (state=3): >>><<< 24134 1727096431.19291: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-23 09:00:31.155123", "end": "2024-09-23 09:00:31.166944", "delta": "0:00:00.011821", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
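The module invocation recorded above (`"_raw_params": "ip link del ethtest0"`, task path `tasks/delete_interface.yml:3`, with the `interface` variable resolved from `set_fact` earlier in the trace) would correspond to a task along these lines. This is a sketch reconstructed from the logged `module_args`; the actual `delete_interface.yml` may differ:

```yaml
# Hypothetical reconstruction of delete_interface.yml:3. The interface name
# ("ethtest0") is templated from the 'interface' fact seen in the log.
- name: Remove test interface if necessary
  ansible.builtin.command: ip link del {{ interface }}
```

Note that `rc=0` with `"changed": true` in the result JSON indicates the link existed and was deleted successfully; a missing link would have produced a non-zero `rc` from `ip link del`.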
24134 1727096431.19332: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096431.19339: _low_level_execute_command(): starting 24134 1727096431.19344: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096430.903931-25756-91397388193393/ > /dev/null 2>&1 && sleep 0' 24134 1727096431.19975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096431.19979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096431.19981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096431.19984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096431.19986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096431.19989: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096431.19998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.20004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096431.20007: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096431.20014: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24134 1727096431.20023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096431.20033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096431.20112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096431.20118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096431.20121: stderr chunk (state=3): >>>debug2: match found <<< 24134 1727096431.20123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.20129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096431.20141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096431.20158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096431.20309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096431.22233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096431.22272: stderr chunk (state=3): >>><<< 24134 1727096431.22288: stdout chunk (state=3): >>><<< 24134 1727096431.22302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096431.22305: handler run complete 24134 1727096431.22320: Evaluated conditional (False): False 24134 1727096431.22328: attempt loop complete, returning result 24134 1727096431.22331: _execute() done 24134 1727096431.22333: dumping result to json 24134 1727096431.22343: done dumping result, returning 24134 1727096431.22376: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [0afff68d-5257-1673-d3fc-0000000005ff] 24134 1727096431.22379: sending task result for task 0afff68d-5257-1673-d3fc-0000000005ff 24134 1727096431.22510: done sending task result for task 0afff68d-5257-1673-d3fc-0000000005ff 24134 1727096431.22513: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.011821", "end": "2024-09-23 09:00:31.166944", "rc": 0, "start": "2024-09-23 09:00:31.155123" } 24134 1727096431.22608: no more pending results, returning what we have 24134 1727096431.22611: results queue empty 24134 1727096431.22612: checking for any_errors_fatal 24134 1727096431.22613: done checking for any_errors_fatal 24134 1727096431.22614: checking for max_fail_percentage 24134 1727096431.22615: done checking for max_fail_percentage 
24134 1727096431.22616: checking to see if all hosts have failed and the running result is not ok 24134 1727096431.22617: done checking to see if all hosts have failed 24134 1727096431.22617: getting the remaining hosts for this loop 24134 1727096431.22619: done getting the remaining hosts for this loop 24134 1727096431.22622: getting the next task for host managed_node1 24134 1727096431.22628: done getting next task for host managed_node1 24134 1727096431.22631: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 24134 1727096431.22632: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096431.22636: getting variables 24134 1727096431.22637: in VariableManager get_vars() 24134 1727096431.22777: Calling all_inventory to load vars for managed_node1 24134 1727096431.22780: Calling groups_inventory to load vars for managed_node1 24134 1727096431.22784: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096431.22793: Calling all_plugins_play to load vars for managed_node1 24134 1727096431.22796: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096431.22804: Calling groups_plugins_play to load vars for managed_node1 24134 1727096431.24107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096431.25028: done with get_vars() 24134 1727096431.25046: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:85 Monday 23 September 2024 09:00:31 -0400 (0:00:00.391) 0:00:35.464 ****** 24134 1727096431.25119: entering _queue_task() for 
managed_node1/include_tasks 24134 1727096431.25366: worker is 1 (out of 1 available) 24134 1727096431.25383: exiting _queue_task() for managed_node1/include_tasks 24134 1727096431.25395: done queuing things up, now waiting for results queue to drain 24134 1727096431.25397: waiting for pending results... 24134 1727096431.25627: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' 24134 1727096431.25773: in run() - task 0afff68d-5257-1673-d3fc-00000000009d 24134 1727096431.25777: variable 'ansible_search_path' from source: unknown 24134 1727096431.25780: calling self._execute() 24134 1727096431.25843: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.25866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.25884: variable 'omit' from source: magic vars 24134 1727096431.26264: variable 'ansible_distribution_major_version' from source: facts 24134 1727096431.26290: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096431.26294: _execute() done 24134 1727096431.26297: dumping result to json 24134 1727096431.26303: done dumping result, returning 24134 1727096431.26334: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [0afff68d-5257-1673-d3fc-00000000009d] 24134 1727096431.26337: sending task result for task 0afff68d-5257-1673-d3fc-00000000009d 24134 1727096431.26404: done sending task result for task 0afff68d-5257-1673-d3fc-00000000009d 24134 1727096431.26407: WORKER PROCESS EXITING 24134 1727096431.26461: no more pending results, returning what we have 24134 1727096431.26465: in VariableManager get_vars() 24134 1727096431.26503: Calling all_inventory to load vars for managed_node1 24134 1727096431.26506: Calling groups_inventory to load vars for managed_node1 24134 1727096431.26509: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096431.26520: Calling 
all_plugins_play to load vars for managed_node1 24134 1727096431.26635: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096431.26639: Calling groups_plugins_play to load vars for managed_node1 24134 1727096431.27784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096431.28772: done with get_vars() 24134 1727096431.28786: variable 'ansible_search_path' from source: unknown 24134 1727096431.28797: we have included files to process 24134 1727096431.28798: generating all_blocks data 24134 1727096431.28799: done generating all_blocks data 24134 1727096431.28802: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 24134 1727096431.28803: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 24134 1727096431.28804: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 24134 1727096431.28917: in VariableManager get_vars() 24134 1727096431.28930: done with get_vars() 24134 1727096431.29010: done processing included file 24134 1727096431.29011: iterating over new_blocks loaded from include file 24134 1727096431.29012: in VariableManager get_vars() 24134 1727096431.29020: done with get_vars() 24134 1727096431.29021: filtering new block on tags 24134 1727096431.29031: done filtering new block on tags 24134 1727096431.29033: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 24134 1727096431.29037: extending task lists for all hosts with included blocks 24134 1727096431.29122: done extending task lists 24134 1727096431.29123: done processing included files 
24134 1727096431.29124: results queue empty 24134 1727096431.29124: checking for any_errors_fatal 24134 1727096431.29127: done checking for any_errors_fatal 24134 1727096431.29128: checking for max_fail_percentage 24134 1727096431.29129: done checking for max_fail_percentage 24134 1727096431.29129: checking to see if all hosts have failed and the running result is not ok 24134 1727096431.29130: done checking to see if all hosts have failed 24134 1727096431.29130: getting the remaining hosts for this loop 24134 1727096431.29131: done getting the remaining hosts for this loop 24134 1727096431.29133: getting the next task for host managed_node1 24134 1727096431.29135: done getting next task for host managed_node1 24134 1727096431.29137: ^ task is: TASK: Include the task 'get_profile_stat.yml' 24134 1727096431.29138: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096431.29139: getting variables 24134 1727096431.29140: in VariableManager get_vars() 24134 1727096431.29146: Calling all_inventory to load vars for managed_node1 24134 1727096431.29148: Calling groups_inventory to load vars for managed_node1 24134 1727096431.29149: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096431.29153: Calling all_plugins_play to load vars for managed_node1 24134 1727096431.29154: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096431.29156: Calling groups_plugins_play to load vars for managed_node1 24134 1727096431.30199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096431.31136: done with get_vars() 24134 1727096431.31153: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Monday 23 September 2024 09:00:31 -0400 (0:00:00.060) 0:00:35.525 ****** 24134 1727096431.31214: entering _queue_task() for managed_node1/include_tasks 24134 1727096431.31480: worker is 1 (out of 1 available) 24134 1727096431.31493: exiting _queue_task() for managed_node1/include_tasks 24134 1727096431.31506: done queuing things up, now waiting for results queue to drain 24134 1727096431.31508: waiting for pending results... 
24134 1727096431.31682: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 24134 1727096431.31756: in run() - task 0afff68d-5257-1673-d3fc-000000000612 24134 1727096431.31766: variable 'ansible_search_path' from source: unknown 24134 1727096431.31776: variable 'ansible_search_path' from source: unknown 24134 1727096431.31801: calling self._execute() 24134 1727096431.31876: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.31882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.31890: variable 'omit' from source: magic vars 24134 1727096431.32161: variable 'ansible_distribution_major_version' from source: facts 24134 1727096431.32179: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096431.32183: _execute() done 24134 1727096431.32186: dumping result to json 24134 1727096431.32188: done dumping result, returning 24134 1727096431.32194: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-1673-d3fc-000000000612] 24134 1727096431.32199: sending task result for task 0afff68d-5257-1673-d3fc-000000000612 24134 1727096431.32281: done sending task result for task 0afff68d-5257-1673-d3fc-000000000612 24134 1727096431.32285: WORKER PROCESS EXITING 24134 1727096431.32312: no more pending results, returning what we have 24134 1727096431.32317: in VariableManager get_vars() 24134 1727096431.32351: Calling all_inventory to load vars for managed_node1 24134 1727096431.32354: Calling groups_inventory to load vars for managed_node1 24134 1727096431.32357: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096431.32374: Calling all_plugins_play to load vars for managed_node1 24134 1727096431.32377: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096431.32380: Calling groups_plugins_play to load vars for managed_node1 24134 
1727096431.33271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096431.34141: done with get_vars() 24134 1727096431.34155: variable 'ansible_search_path' from source: unknown 24134 1727096431.34156: variable 'ansible_search_path' from source: unknown 24134 1727096431.34185: we have included files to process 24134 1727096431.34186: generating all_blocks data 24134 1727096431.34187: done generating all_blocks data 24134 1727096431.34187: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24134 1727096431.34188: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24134 1727096431.34189: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24134 1727096431.34857: done processing included file 24134 1727096431.34859: iterating over new_blocks loaded from include file 24134 1727096431.34861: in VariableManager get_vars() 24134 1727096431.34874: done with get_vars() 24134 1727096431.34876: filtering new block on tags 24134 1727096431.34891: done filtering new block on tags 24134 1727096431.34893: in VariableManager get_vars() 24134 1727096431.34900: done with get_vars() 24134 1727096431.34901: filtering new block on tags 24134 1727096431.34913: done filtering new block on tags 24134 1727096431.34914: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 24134 1727096431.34919: extending task lists for all hosts with included blocks 24134 1727096431.34979: done extending task lists 24134 1727096431.34980: done processing included files 24134 1727096431.34981: results queue empty 24134 
1727096431.34981: checking for any_errors_fatal 24134 1727096431.34983: done checking for any_errors_fatal 24134 1727096431.34984: checking for max_fail_percentage 24134 1727096431.34984: done checking for max_fail_percentage 24134 1727096431.34985: checking to see if all hosts have failed and the running result is not ok 24134 1727096431.34985: done checking to see if all hosts have failed 24134 1727096431.34986: getting the remaining hosts for this loop 24134 1727096431.34987: done getting the remaining hosts for this loop 24134 1727096431.34988: getting the next task for host managed_node1 24134 1727096431.34991: done getting next task for host managed_node1 24134 1727096431.34992: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 24134 1727096431.34994: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096431.34995: getting variables 24134 1727096431.34996: in VariableManager get_vars() 24134 1727096431.35032: Calling all_inventory to load vars for managed_node1 24134 1727096431.35034: Calling groups_inventory to load vars for managed_node1 24134 1727096431.35036: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096431.35040: Calling all_plugins_play to load vars for managed_node1 24134 1727096431.35041: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096431.35043: Calling groups_plugins_play to load vars for managed_node1 24134 1727096431.35664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096431.36524: done with get_vars() 24134 1727096431.36538: done getting variables 24134 1727096431.36565: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 09:00:31 -0400 (0:00:00.053) 0:00:35.579 ****** 24134 1727096431.36592: entering _queue_task() for managed_node1/set_fact 24134 1727096431.36851: worker is 1 (out of 1 available) 24134 1727096431.36865: exiting _queue_task() for managed_node1/set_fact 24134 1727096431.36881: done queuing things up, now waiting for results queue to drain 24134 1727096431.36883: waiting for pending results... 
24134 1727096431.37049: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 24134 1727096431.37131: in run() - task 0afff68d-5257-1673-d3fc-00000000062a 24134 1727096431.37144: variable 'ansible_search_path' from source: unknown 24134 1727096431.37147: variable 'ansible_search_path' from source: unknown 24134 1727096431.37177: calling self._execute() 24134 1727096431.37245: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.37249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.37258: variable 'omit' from source: magic vars 24134 1727096431.37527: variable 'ansible_distribution_major_version' from source: facts 24134 1727096431.37537: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096431.37543: variable 'omit' from source: magic vars 24134 1727096431.37576: variable 'omit' from source: magic vars 24134 1727096431.37601: variable 'omit' from source: magic vars 24134 1727096431.37631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096431.37663: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096431.37677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096431.37691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096431.37701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096431.37724: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096431.37727: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.37729: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 24134 1727096431.37802: Set connection var ansible_shell_executable to /bin/sh 24134 1727096431.37806: Set connection var ansible_pipelining to False 24134 1727096431.37812: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096431.37819: Set connection var ansible_timeout to 10 24134 1727096431.37822: Set connection var ansible_connection to ssh 24134 1727096431.37824: Set connection var ansible_shell_type to sh 24134 1727096431.37841: variable 'ansible_shell_executable' from source: unknown 24134 1727096431.37844: variable 'ansible_connection' from source: unknown 24134 1727096431.37847: variable 'ansible_module_compression' from source: unknown 24134 1727096431.37850: variable 'ansible_shell_type' from source: unknown 24134 1727096431.37852: variable 'ansible_shell_executable' from source: unknown 24134 1727096431.37854: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.37857: variable 'ansible_pipelining' from source: unknown 24134 1727096431.37859: variable 'ansible_timeout' from source: unknown 24134 1727096431.37863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.37964: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096431.37976: variable 'omit' from source: magic vars 24134 1727096431.37984: starting attempt loop 24134 1727096431.37989: running the handler 24134 1727096431.37997: handler run complete 24134 1727096431.38005: attempt loop complete, returning result 24134 1727096431.38008: _execute() done 24134 1727096431.38011: dumping result to json 24134 1727096431.38013: done dumping result, returning 24134 1727096431.38019: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-1673-d3fc-00000000062a] 24134 1727096431.38024: sending task result for task 0afff68d-5257-1673-d3fc-00000000062a 24134 1727096431.38100: done sending task result for task 0afff68d-5257-1673-d3fc-00000000062a 24134 1727096431.38103: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 24134 1727096431.38154: no more pending results, returning what we have 24134 1727096431.38156: results queue empty 24134 1727096431.38157: checking for any_errors_fatal 24134 1727096431.38159: done checking for any_errors_fatal 24134 1727096431.38159: checking for max_fail_percentage 24134 1727096431.38161: done checking for max_fail_percentage 24134 1727096431.38162: checking to see if all hosts have failed and the running result is not ok 24134 1727096431.38162: done checking to see if all hosts have failed 24134 1727096431.38163: getting the remaining hosts for this loop 24134 1727096431.38164: done getting the remaining hosts for this loop 24134 1727096431.38172: getting the next task for host managed_node1 24134 1727096431.38178: done getting next task for host managed_node1 24134 1727096431.38181: ^ task is: TASK: Stat profile file 24134 1727096431.38185: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096431.38189: getting variables 24134 1727096431.38190: in VariableManager get_vars() 24134 1727096431.38225: Calling all_inventory to load vars for managed_node1 24134 1727096431.38228: Calling groups_inventory to load vars for managed_node1 24134 1727096431.38231: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096431.38240: Calling all_plugins_play to load vars for managed_node1 24134 1727096431.38243: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096431.38246: Calling groups_plugins_play to load vars for managed_node1 24134 1727096431.42306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096431.43165: done with get_vars() 24134 1727096431.43185: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 09:00:31 -0400 (0:00:00.066) 0:00:35.645 ****** 24134 1727096431.43240: entering _queue_task() for managed_node1/stat 24134 1727096431.43504: worker is 1 (out of 1 available) 24134 1727096431.43517: exiting _queue_task() for managed_node1/stat 24134 1727096431.43529: done queuing things up, now waiting for results queue to drain 24134 1727096431.43531: waiting for pending results... 
24134 1727096431.43707: running TaskExecutor() for managed_node1/TASK: Stat profile file 24134 1727096431.43793: in run() - task 0afff68d-5257-1673-d3fc-00000000062b 24134 1727096431.43805: variable 'ansible_search_path' from source: unknown 24134 1727096431.43810: variable 'ansible_search_path' from source: unknown 24134 1727096431.43837: calling self._execute() 24134 1727096431.43907: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.43911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.43922: variable 'omit' from source: magic vars 24134 1727096431.44207: variable 'ansible_distribution_major_version' from source: facts 24134 1727096431.44215: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096431.44221: variable 'omit' from source: magic vars 24134 1727096431.44258: variable 'omit' from source: magic vars 24134 1727096431.44327: variable 'profile' from source: include params 24134 1727096431.44331: variable 'interface' from source: set_fact 24134 1727096431.44381: variable 'interface' from source: set_fact 24134 1727096431.44396: variable 'omit' from source: magic vars 24134 1727096431.44430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096431.44456: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096431.44520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096431.44525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096431.44528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096431.44531: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 
1727096431.44534: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.44536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.44591: Set connection var ansible_shell_executable to /bin/sh 24134 1727096431.44595: Set connection var ansible_pipelining to False 24134 1727096431.44601: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096431.44609: Set connection var ansible_timeout to 10 24134 1727096431.44612: Set connection var ansible_connection to ssh 24134 1727096431.44615: Set connection var ansible_shell_type to sh 24134 1727096431.44631: variable 'ansible_shell_executable' from source: unknown 24134 1727096431.44635: variable 'ansible_connection' from source: unknown 24134 1727096431.44638: variable 'ansible_module_compression' from source: unknown 24134 1727096431.44641: variable 'ansible_shell_type' from source: unknown 24134 1727096431.44643: variable 'ansible_shell_executable' from source: unknown 24134 1727096431.44648: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.44650: variable 'ansible_pipelining' from source: unknown 24134 1727096431.44653: variable 'ansible_timeout' from source: unknown 24134 1727096431.44655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.44797: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096431.44807: variable 'omit' from source: magic vars 24134 1727096431.44812: starting attempt loop 24134 1727096431.44815: running the handler 24134 1727096431.44827: _low_level_execute_command(): starting 24134 1727096431.44833: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096431.45354: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096431.45358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.45362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096431.45364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.45417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096431.45420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096431.45427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096431.45500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096431.47217: stdout chunk (state=3): >>>/root <<< 24134 1727096431.47320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096431.47347: stderr chunk (state=3): >>><<< 24134 1727096431.47351: stdout chunk (state=3): >>><<< 24134 1727096431.47377: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096431.47391: _low_level_execute_command(): starting 24134 1727096431.47395: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661 `" && echo ansible-tmp-1727096431.4737594-25778-248593100714661="` echo /root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661 `" ) && sleep 0' 24134 1727096431.47850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096431.47853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096431.47863: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096431.47865: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096431.47869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.47924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096431.47927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096431.47928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096431.47997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096431.49972: stdout chunk (state=3): >>>ansible-tmp-1727096431.4737594-25778-248593100714661=/root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661 <<< 24134 1727096431.50087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096431.50110: stderr chunk (state=3): >>><<< 24134 1727096431.50114: stdout chunk (state=3): >>><<< 24134 1727096431.50126: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096431.4737594-25778-248593100714661=/root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096431.50170: variable 'ansible_module_compression' from source: unknown 24134 1727096431.50216: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24134 1727096431.50249: variable 'ansible_facts' from source: unknown 24134 1727096431.50305: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/AnsiballZ_stat.py 24134 1727096431.50404: Sending initial data 24134 1727096431.50408: Sent initial data (153 bytes) 24134 1727096431.50840: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096431.50845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.50848: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096431.50850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096431.50852: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.50897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096431.50900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096431.50975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096431.52603: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096431.52677: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096431.52748: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpd1r9xjd2 /root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/AnsiballZ_stat.py <<< 24134 1727096431.52751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/AnsiballZ_stat.py" <<< 24134 1727096431.52824: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpd1r9xjd2" to remote "/root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/AnsiballZ_stat.py" <<< 24134 1727096431.54028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096431.54032: stderr chunk (state=3): >>><<< 24134 1727096431.54035: stdout chunk (state=3): >>><<< 24134 1727096431.54276: done transferring module to remote 24134 1727096431.54279: _low_level_execute_command(): starting 24134 1727096431.54282: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/ /root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/AnsiballZ_stat.py && sleep 0' 24134 1727096431.55393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096431.55487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096431.55529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096431.55594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096431.57475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096431.57586: stderr chunk (state=3): >>><<< 24134 1727096431.57600: stdout chunk (state=3): >>><<< 24134 1727096431.57657: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096431.57669: _low_level_execute_command(): starting 24134 1727096431.57837: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/AnsiballZ_stat.py && sleep 0' 24134 1727096431.59285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.59498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096431.59510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096431.60176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096431.75689: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, 
"get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24134 1727096431.77188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096431.77200: stdout chunk (state=3): >>><<< 24134 1727096431.77376: stderr chunk (state=3): >>><<< 24134 1727096431.77380: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
24134 1727096431.77383: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096431.77386: _low_level_execute_command(): starting 24134 1727096431.77388: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096431.4737594-25778-248593100714661/ > /dev/null 2>&1 && sleep 0' 24134 1727096431.78284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096431.78317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.78320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096431.78323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096431.78325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.78409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096431.78505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096431.80503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096431.80513: stdout chunk (state=3): >>><<< 24134 1727096431.80524: stderr chunk (state=3): >>><<< 24134 1727096431.80543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
24134 1727096431.80556: handler run complete
24134 1727096431.80824: attempt loop complete, returning result
24134 1727096431.80827: _execute() done
24134 1727096431.80830: dumping result to json
24134 1727096431.80832: done dumping result, returning
24134 1727096431.80834: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0afff68d-5257-1673-d3fc-00000000062b]
24134 1727096431.80836: sending task result for task 0afff68d-5257-1673-d3fc-00000000062b
24134 1727096431.80921: done sending task result for task 0afff68d-5257-1673-d3fc-00000000062b
ok: [managed_node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
24134 1727096431.81105: no more pending results, returning what we have
24134 1727096431.81109: results queue empty
24134 1727096431.81110: checking for any_errors_fatal
24134 1727096431.81118: done checking for any_errors_fatal
24134 1727096431.81119: checking for max_fail_percentage
24134 1727096431.81121: done checking for max_fail_percentage
24134 1727096431.81122: checking to see if all hosts have failed and the running result is not ok
24134 1727096431.81123: done checking to see if all hosts have failed
24134 1727096431.81124: getting the remaining hosts for this loop
24134 1727096431.81126: done getting the remaining hosts for this loop
24134 1727096431.81130: getting the next task for host managed_node1
24134 1727096431.81137: done getting next task for host managed_node1
24134 1727096431.81140: ^ task is: TASK: Set NM profile exist flag based on the profile files
24134 1727096431.81145: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096431.81150: getting variables
24134 1727096431.81152: in VariableManager get_vars()
24134 1727096431.81189: Calling all_inventory to load vars for managed_node1
24134 1727096431.81193: Calling groups_inventory to load vars for managed_node1
24134 1727096431.81197: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096431.81209: Calling all_plugins_play to load vars for managed_node1
24134 1727096431.81213: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096431.81216: Calling groups_plugins_play to load vars for managed_node1
24134 1727096431.82114: WORKER PROCESS EXITING
24134 1727096431.84247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096431.86337: done with get_vars()
24134 1727096431.86366: done getting variables
24134 1727096431.86430: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag based on the profile files] ********************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17
Monday 23 September 2024 09:00:31 -0400 (0:00:00.432) 0:00:36.078 ******
24134 1727096431.86472: entering _queue_task() for managed_node1/set_fact
24134 1727096431.87035: worker is 1 (out of 1 available)
24134 1727096431.87048: exiting _queue_task() for managed_node1/set_fact
24134 1727096431.87060: done queuing things up, now waiting for results queue to drain
24134 1727096431.87061: waiting for pending results...
24134 1727096431.87450: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files
24134 1727096431.87688: in run() - task 0afff68d-5257-1673-d3fc-00000000062c
24134 1727096431.87803: variable 'ansible_search_path' from source: unknown
24134 1727096431.87813: variable 'ansible_search_path' from source: unknown
24134 1727096431.87860: calling self._execute()
24134 1727096431.87971: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096431.87988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096431.88008: variable 'omit' from source: magic vars
24134 1727096431.88389: variable 'ansible_distribution_major_version' from source: facts
24134 1727096431.88407: Evaluated conditional (ansible_distribution_major_version != '6'): True
24134 1727096431.88536: variable 'profile_stat' from source: set_fact
24134 1727096431.88559: Evaluated conditional (profile_stat.stat.exists): False
24134 1727096431.88568: when evaluation is False, skipping this task
24134 1727096431.88578: _execute() done
24134 1727096431.88587: dumping result to json
24134 1727096431.88595: done dumping result, returning
24134 1727096431.88654: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-1673-d3fc-00000000062c]
24134 1727096431.88658: sending task result for task 0afff68d-5257-1673-d3fc-00000000062c
24134 1727096431.88721: done sending task result for task 0afff68d-5257-1673-d3fc-00000000062c
24134 1727096431.88724: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
24134 1727096431.88805: no more pending results, returning what we have
24134 1727096431.88809: results queue empty
24134 1727096431.88810: checking for any_errors_fatal
24134 1727096431.88822: done checking for any_errors_fatal
24134 1727096431.88822: checking for max_fail_percentage
24134 1727096431.88824: done checking for max_fail_percentage
24134 1727096431.88825: checking to see if all hosts have failed and the running result is not ok
24134 1727096431.88826: done checking to see if all hosts have failed
24134 1727096431.88826: getting the remaining hosts for this loop
24134 1727096431.88828: done getting the remaining hosts for this loop
24134 1727096431.88831: getting the next task for host managed_node1
24134 1727096431.88837: done getting next task for host managed_node1
24134 1727096431.88839: ^ task is: TASK: Get NM profile info
24134 1727096431.88843: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096431.88847: getting variables
24134 1727096431.88849: in VariableManager get_vars()
24134 1727096431.88891: Calling all_inventory to load vars for managed_node1
24134 1727096431.88894: Calling groups_inventory to load vars for managed_node1
24134 1727096431.88897: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096431.88906: Calling all_plugins_play to load vars for managed_node1
24134 1727096431.88909: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096431.88911: Calling groups_plugins_play to load vars for managed_node1
24134 1727096431.90598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096431.92201: done with get_vars()
24134 1727096431.92226: done getting variables
24134 1727096431.92296: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get NM profile info] *****************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Monday 23 September 2024 09:00:31 -0400 (0:00:00.058) 0:00:36.136 ******
24134 1727096431.92329: entering _queue_task() for managed_node1/shell
24134 1727096431.92616: worker is 1 (out of 1 available)
24134 1727096431.92628: exiting _queue_task() for managed_node1/shell
24134 1727096431.92640: done queuing things up, now waiting for results queue to drain
24134 1727096431.92641: waiting for pending results...
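The skip recorded above comes from Ansible rendering the task's `when:` expression (`profile_stat.stat.exists`) against the fact registered by the earlier stat task and truth-testing the result. A rough stand-in for that evaluation, assuming the variable layout seen in the log; both helpers below are hypothetical simplifications (Ansible actually renders the expression through Jinja2, not `eval`):

```python
from types import SimpleNamespace


def to_ns(obj):
    """Recursively wrap dicts so dotted access works, mimicking
    Jinja2's attribute-style lookup on registered results."""
    if isinstance(obj, dict):
        return SimpleNamespace(**{k: to_ns(v) for k, v in obj.items()})
    return obj


def evaluate_when(condition: str, variables: dict) -> bool:
    """Hypothetical stand-in for Ansible's conditional evaluation:
    resolve the expression against the host's variables, then
    truth-test it to decide whether the task runs."""
    ns = {k: to_ns(v) for k, v in variables.items()}
    return bool(eval(condition, {"__builtins__": {}}, ns))


# The fact registered by the "Stat profile file" task in this log:
facts = {"profile_stat": {"changed": False, "stat": {"exists": False}}}
evaluate_when("profile_stat.stat.exists", facts)  # False -> task is skipped
```

When the conditional resolves false, the executor short-circuits before any connection work, which is why no SSH chunks appear for this task, only the `skipping:` result.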
24134 1727096431.92994: running TaskExecutor() for managed_node1/TASK: Get NM profile info 24134 1727096431.93041: in run() - task 0afff68d-5257-1673-d3fc-00000000062d 24134 1727096431.93066: variable 'ansible_search_path' from source: unknown 24134 1727096431.93078: variable 'ansible_search_path' from source: unknown 24134 1727096431.93122: calling self._execute() 24134 1727096431.93219: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.93230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.93273: variable 'omit' from source: magic vars 24134 1727096431.93708: variable 'ansible_distribution_major_version' from source: facts 24134 1727096431.93720: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096431.93726: variable 'omit' from source: magic vars 24134 1727096431.93785: variable 'omit' from source: magic vars 24134 1727096431.93956: variable 'profile' from source: include params 24134 1727096431.93960: variable 'interface' from source: set_fact 24134 1727096431.93971: variable 'interface' from source: set_fact 24134 1727096431.93994: variable 'omit' from source: magic vars 24134 1727096431.94034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096431.94081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096431.94099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096431.94116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096431.94198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096431.94201: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 
1727096431.94204: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.94207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.94274: Set connection var ansible_shell_executable to /bin/sh 24134 1727096431.94282: Set connection var ansible_pipelining to False 24134 1727096431.94296: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096431.94305: Set connection var ansible_timeout to 10 24134 1727096431.94308: Set connection var ansible_connection to ssh 24134 1727096431.94310: Set connection var ansible_shell_type to sh 24134 1727096431.94334: variable 'ansible_shell_executable' from source: unknown 24134 1727096431.94337: variable 'ansible_connection' from source: unknown 24134 1727096431.94339: variable 'ansible_module_compression' from source: unknown 24134 1727096431.94342: variable 'ansible_shell_type' from source: unknown 24134 1727096431.94344: variable 'ansible_shell_executable' from source: unknown 24134 1727096431.94346: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096431.94348: variable 'ansible_pipelining' from source: unknown 24134 1727096431.94353: variable 'ansible_timeout' from source: unknown 24134 1727096431.94357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096431.94518: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096431.94673: variable 'omit' from source: magic vars 24134 1727096431.94677: starting attempt loop 24134 1727096431.94679: running the handler 24134 1727096431.94682: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096431.94685: _low_level_execute_command(): starting 24134 1727096431.94687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096431.95323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096431.95388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096431.95438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096431.95453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096431.95476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096431.95585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096431.97318: stdout chunk (state=3): >>>/root <<< 24134 1727096431.97471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096431.97474: stdout chunk 
(state=3): >>><<< 24134 1727096431.97477: stderr chunk (state=3): >>><<< 24134 1727096431.97497: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096431.97587: _low_level_execute_command(): starting 24134 1727096431.97591: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199 `" && echo ansible-tmp-1727096431.9750388-25810-219660854388199="` echo /root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199 `" ) && sleep 0' 24134 1727096431.98230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096431.98273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096431.98373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096432.00378: stdout chunk (state=3): >>>ansible-tmp-1727096431.9750388-25810-219660854388199=/root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199 <<< 24134 1727096432.00487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096432.00521: stderr chunk (state=3): >>><<< 24134 1727096432.00523: stdout chunk (state=3): >>><<< 24134 1727096432.00535: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096431.9750388-25810-219660854388199=/root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096432.00576: variable 'ansible_module_compression' from source: unknown 24134 1727096432.00609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096432.00643: variable 'ansible_facts' from source: unknown 24134 1727096432.00697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/AnsiballZ_command.py 24134 1727096432.00798: Sending initial data 24134 1727096432.00801: Sent initial data (156 bytes) 24134 1727096432.01225: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096432.01229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096432.01231: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096432.01233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096432.01278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096432.01281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096432.01354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096432.03013: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24134 1727096432.03019: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096432.03082: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096432.03142: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpccmqi1js /root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/AnsiballZ_command.py <<< 24134 1727096432.03148: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/AnsiballZ_command.py" <<< 24134 1727096432.03211: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpccmqi1js" to remote "/root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/AnsiballZ_command.py" <<< 24134 1727096432.03214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/AnsiballZ_command.py" <<< 24134 1727096432.03834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096432.03874: stderr chunk (state=3): >>><<< 24134 1727096432.03878: stdout chunk (state=3): >>><<< 24134 1727096432.03909: done transferring module to remote 24134 1727096432.03919: _low_level_execute_command(): starting 24134 1727096432.03923: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/ /root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/AnsiballZ_command.py && sleep 0' 24134 1727096432.04508: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096432.04531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096432.04629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096432.06506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096432.06535: stderr chunk (state=3): >>><<< 24134 1727096432.06538: stdout chunk (state=3): >>><<< 24134 1727096432.06549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096432.06557: _low_level_execute_command(): starting 24134 1727096432.06559: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/AnsiballZ_command.py && sleep 0' 24134 1727096432.07234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096432.07239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096432.07312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096432.24724: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": 
"2024-09-23 09:00:32.229338", "end": "2024-09-23 09:00:32.245646", "delta": "0:00:00.016308", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096432.26291: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.125 closed. <<< 24134 1727096432.26316: stderr chunk (state=3): >>><<< 24134 1727096432.26319: stdout chunk (state=3): >>><<< 24134 1727096432.26336: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-23 09:00:32.229338", "end": "2024-09-23 09:00:32.245646", "delta": "0:00:00.016308", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.125 closed. 24134 1727096432.26364: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096432.26375: _low_level_execute_command(): starting 24134 1727096432.26380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096431.9750388-25810-219660854388199/ > /dev/null 2>&1 && sleep 0' 24134 1727096432.26846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096432.26849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096432.26852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 24134 1727096432.26854: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096432.26856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096432.26909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096432.26914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096432.26916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096432.26994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096432.28889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096432.28912: stderr chunk (state=3): >>><<< 24134 1727096432.28916: stdout chunk (state=3): >>><<< 24134 1727096432.28929: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096432.28936: handler run complete 24134 1727096432.28951: Evaluated conditional (False): False 24134 1727096432.28959: attempt loop complete, returning result 24134 1727096432.28962: _execute() done 24134 1727096432.28965: dumping result to json 24134 1727096432.28973: done dumping result, returning 24134 1727096432.28980: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0afff68d-5257-1673-d3fc-00000000062d] 24134 1727096432.28984: sending task result for task 0afff68d-5257-1673-d3fc-00000000062d 24134 1727096432.29081: done sending task result for task 0afff68d-5257-1673-d3fc-00000000062d 24134 1727096432.29084: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.016308", "end": "2024-09-23 09:00:32.245646", "rc": 1, "start": "2024-09-23 09:00:32.229338" } MSG: non-zero return code ...ignoring 24134 1727096432.29174: no more pending results, returning what we have 24134 1727096432.29178: results queue empty 24134 1727096432.29179: checking for any_errors_fatal 24134 1727096432.29185: done checking for any_errors_fatal 24134 1727096432.29186: checking for max_fail_percentage 24134 1727096432.29188: done checking for max_fail_percentage 24134 1727096432.29189: checking to see if all hosts have failed and the running result is not ok 24134 1727096432.29190: done checking to see if all hosts have failed 24134 1727096432.29190: getting the remaining hosts for this loop 24134 1727096432.29191: done getting the remaining hosts for this loop 24134 1727096432.29195: getting the next task for host managed_node1 24134 1727096432.29201: done getting next task for host managed_node1 24134 1727096432.29203: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 24134 1727096432.29206: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
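The `rc=1` recorded above comes from the final `grep` in the pipeline, not from `nmcli` itself: `grep` exits 1 when it matches nothing, and under plain `/bin/sh` (no `pipefail`) the pipeline's status is the last command's. A minimal sketch of that behavior, using the profile name `ethtest0` from the log; the sample `nmcli` output line is invented for illustration:

```shell
# grep exits 0 on a match, 1 on no match; the shell reports the last
# command's exit status as the pipeline's rc, which Ansible records as "rc".
printf 'NAME      FILENAME\nethtest0  /run/NetworkManager/system-connections/ethtest0.nmconnection\n' \
  | grep ethtest0 | grep /etc
echo "rc=$?"
# "ethtest0" matches, but the filename is under /run, not /etc,
# so the second grep exits 1 and the pipeline rc is 1.
```

This is why the task reports `failed: true` with `msg: non-zero return code` even though nothing actually errored; the `...ignoring` marker shows the playbook tolerates it and inspects the registered rc later instead.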
(None), did rescue? False, did start at task? False 24134 1727096432.29209: getting variables 24134 1727096432.29211: in VariableManager get_vars() 24134 1727096432.29239: Calling all_inventory to load vars for managed_node1 24134 1727096432.29241: Calling groups_inventory to load vars for managed_node1 24134 1727096432.29244: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096432.29254: Calling all_plugins_play to load vars for managed_node1 24134 1727096432.29256: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096432.29258: Calling groups_plugins_play to load vars for managed_node1 24134 1727096432.30307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096432.32030: done with get_vars() 24134 1727096432.32055: done getting variables 24134 1727096432.32210: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 09:00:32 -0400 (0:00:00.399) 0:00:36.536 ****** 24134 1727096432.32273: entering _queue_task() for managed_node1/set_fact 24134 1727096432.32785: worker is 1 (out of 1 available) 24134 1727096432.32797: exiting _queue_task() for managed_node1/set_fact 24134 1727096432.32811: done queuing things up, now waiting for results queue to drain 24134 1727096432.32812: waiting for pending results... 
24134 1727096432.33196: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
24134 1727096432.33230: in run() - task 0afff68d-5257-1673-d3fc-00000000062e
24134 1727096432.33253: variable 'ansible_search_path' from source: unknown
24134 1727096432.33260: variable 'ansible_search_path' from source: unknown
24134 1727096432.33313: calling self._execute()
24134 1727096432.33421: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096432.33432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096432.33447: variable 'omit' from source: magic vars
24134 1727096432.33857: variable 'ansible_distribution_major_version' from source: facts
24134 1727096432.33884: Evaluated conditional (ansible_distribution_major_version != '6'): True
24134 1727096432.34028: variable 'nm_profile_exists' from source: set_fact
24134 1727096432.34047: Evaluated conditional (nm_profile_exists.rc == 0): False
24134 1727096432.34060: when evaluation is False, skipping this task
24134 1727096432.34194: _execute() done
24134 1727096432.34197: dumping result to json
24134 1727096432.34199: done dumping result, returning
24134 1727096432.34202: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-1673-d3fc-00000000062e]
24134 1727096432.34204: sending task result for task 0afff68d-5257-1673-d3fc-00000000062e
24134 1727096432.34272: done sending task result for task 0afff68d-5257-1673-d3fc-00000000062e
24134 1727096432.34276: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "nm_profile_exists.rc == 0",
    "skip_reason": "Conditional result was False"
}
24134 1727096432.34419: no more pending results, returning what we have
24134 1727096432.34423: results queue empty
24134 1727096432.34424: checking for any_errors_fatal
24134 1727096432.34436: done checking for any_errors_fatal
24134 1727096432.34436: checking for max_fail_percentage
24134 1727096432.34438: done checking for max_fail_percentage
24134 1727096432.34439: checking to see if all hosts have failed and the running result is not ok
24134 1727096432.34440: done checking to see if all hosts have failed
24134 1727096432.34441: getting the remaining hosts for this loop
24134 1727096432.34442: done getting the remaining hosts for this loop
24134 1727096432.34446: getting the next task for host managed_node1
24134 1727096432.34456: done getting next task for host managed_node1
24134 1727096432.34458: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
24134 1727096432.34462: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096432.34467: getting variables
24134 1727096432.34473: in VariableManager get_vars()
24134 1727096432.34504: Calling all_inventory to load vars for managed_node1
24134 1727096432.34508: Calling groups_inventory to load vars for managed_node1
24134 1727096432.34511: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096432.34524: Calling all_plugins_play to load vars for managed_node1
24134 1727096432.34527: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096432.34530: Calling groups_plugins_play to load vars for managed_node1
24134 1727096432.37633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096432.40231: done with get_vars()
24134 1727096432.40259: done getting variables
24134 1727096432.40323: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
24134 1727096432.40446: variable 'profile' from source: include params
24134 1727096432.40450: variable 'interface' from source: set_fact
24134 1727096432.40516: variable 'interface' from source: set_fact

TASK [Get the ansible_managed comment in ifcfg-ethtest0] ***********************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Monday 23 September 2024  09:00:32 -0400 (0:00:00.082)       0:00:36.618 ******
24134 1727096432.40549: entering _queue_task() for managed_node1/command
24134 1727096432.40990: worker is 1 (out of 1 available)
24134 1727096432.41001: exiting _queue_task() for managed_node1/command
24134 1727096432.41014: done queuing things up, now waiting for results queue to drain
24134 1727096432.41015: waiting for pending results...
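The entries above show the pattern this test file repeats: each task in get_profile_stat.yml carries a `when:` guard, and the engine logs `Evaluated conditional (...)` before deciding whether to run or skip. A minimal sketch of such a guarded task follows — the task name and the `profile_stat.stat.exists` guard appear in the log, but the command, register name, and ifcfg path are hypothetical illustrations, not the actual contents of get_profile_stat.yml:

```yaml
# Illustrative sketch only -- not the real get_profile_stat.yml.
# The task name and when: guard come from the log above; the command
# and register name are assumed for illustration.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: lsr_net_profile_ansible_managed_output
  when: profile_stat.stat.exists
```

When the guard evaluates false, as here, `_execute()` short-circuits and the `skipping:` result is emitted without the module ever being sent to the host.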
24134 1727096432.41533: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0
24134 1727096432.41538: in run() - task 0afff68d-5257-1673-d3fc-000000000630
24134 1727096432.41542: variable 'ansible_search_path' from source: unknown
24134 1727096432.41545: variable 'ansible_search_path' from source: unknown
24134 1727096432.41624: calling self._execute()
24134 1727096432.41689: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096432.41700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096432.41712: variable 'omit' from source: magic vars
24134 1727096432.42074: variable 'ansible_distribution_major_version' from source: facts
24134 1727096432.42091: Evaluated conditional (ansible_distribution_major_version != '6'): True
24134 1727096432.42219: variable 'profile_stat' from source: set_fact
24134 1727096432.42239: Evaluated conditional (profile_stat.stat.exists): False
24134 1727096432.42247: when evaluation is False, skipping this task
24134 1727096432.42254: _execute() done
24134 1727096432.42275: dumping result to json
24134 1727096432.42278: done dumping result, returning
24134 1727096432.42283: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [0afff68d-5257-1673-d3fc-000000000630]
24134 1727096432.42375: sending task result for task 0afff68d-5257-1673-d3fc-000000000630
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
24134 1727096432.42496: no more pending results, returning what we have
24134 1727096432.42499: results queue empty
24134 1727096432.42501: checking for any_errors_fatal
24134 1727096432.42506: done checking for any_errors_fatal
24134 1727096432.42507: checking for max_fail_percentage
24134 1727096432.42509: done checking for max_fail_percentage
24134 1727096432.42510: checking to see if all hosts have failed and the running result is not ok
24134 1727096432.42511: done checking to see if all hosts have failed
24134 1727096432.42512: getting the remaining hosts for this loop
24134 1727096432.42513: done getting the remaining hosts for this loop
24134 1727096432.42517: getting the next task for host managed_node1
24134 1727096432.42524: done getting next task for host managed_node1
24134 1727096432.42526: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
24134 1727096432.42530: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096432.42535: getting variables
24134 1727096432.42536: in VariableManager get_vars()
24134 1727096432.42566: Calling all_inventory to load vars for managed_node1
24134 1727096432.42573: Calling groups_inventory to load vars for managed_node1
24134 1727096432.42577: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096432.42590: Calling all_plugins_play to load vars for managed_node1
24134 1727096432.42594: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096432.42596: Calling groups_plugins_play to load vars for managed_node1
24134 1727096432.43183: done sending task result for task 0afff68d-5257-1673-d3fc-000000000630
24134 1727096432.43186: WORKER PROCESS EXITING
24134 1727096432.45662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096432.48853: done with get_vars()
24134 1727096432.48882: done getting variables
24134 1727096432.48950: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
24134 1727096432.49266: variable 'profile' from source: include params
24134 1727096432.49274: variable 'interface' from source: set_fact
24134 1727096432.49331: variable 'interface' from source: set_fact

TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ********************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Monday 23 September 2024  09:00:32 -0400 (0:00:00.088)       0:00:36.707 ******
24134 1727096432.49363: entering _queue_task() for managed_node1/set_fact
24134 1727096432.50099: worker is 1 (out of 1 available)
24134 1727096432.50111: exiting _queue_task() for managed_node1/set_fact
24134 1727096432.50123: done queuing things up, now waiting for results queue to drain
24134 1727096432.50124: waiting for pending results...
24134 1727096432.50532: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0
24134 1727096432.50777: in run() - task 0afff68d-5257-1673-d3fc-000000000631
24134 1727096432.50910: variable 'ansible_search_path' from source: unknown
24134 1727096432.50914: variable 'ansible_search_path' from source: unknown
24134 1727096432.50926: calling self._execute()
24134 1727096432.51173: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096432.51177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096432.51180: variable 'omit' from source: magic vars
24134 1727096432.51911: variable 'ansible_distribution_major_version' from source: facts
24134 1727096432.51934: Evaluated conditional (ansible_distribution_major_version != '6'): True
24134 1727096432.52165: variable 'profile_stat' from source: set_fact
24134 1727096432.52227: Evaluated conditional (profile_stat.stat.exists): False
24134 1727096432.52375: when evaluation is False, skipping this task
24134 1727096432.52379: _execute() done
24134 1727096432.52382: dumping result to json
24134 1727096432.52385: done dumping result, returning
24134 1727096432.52388: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [0afff68d-5257-1673-d3fc-000000000631]
24134 1727096432.52390: sending task result for task 0afff68d-5257-1673-d3fc-000000000631
24134 1727096432.52504: done sending task result for task 0afff68d-5257-1673-d3fc-000000000631
24134 1727096432.52508: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
24134 1727096432.52562: no more pending results, returning what we have
24134 1727096432.52566: results queue empty
24134 1727096432.52566: checking for any_errors_fatal
24134 1727096432.52579: done checking for any_errors_fatal
24134 1727096432.52580: checking for max_fail_percentage
24134 1727096432.52582: done checking for max_fail_percentage
24134 1727096432.52583: checking to see if all hosts have failed and the running result is not ok
24134 1727096432.52584: done checking to see if all hosts have failed
24134 1727096432.52584: getting the remaining hosts for this loop
24134 1727096432.52586: done getting the remaining hosts for this loop
24134 1727096432.52595: getting the next task for host managed_node1
24134 1727096432.52602: done getting next task for host managed_node1
24134 1727096432.52605: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
24134 1727096432.52609: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096432.52613: getting variables
24134 1727096432.52615: in VariableManager get_vars()
24134 1727096432.52646: Calling all_inventory to load vars for managed_node1
24134 1727096432.52648: Calling groups_inventory to load vars for managed_node1
24134 1727096432.52652: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096432.52664: Calling all_plugins_play to load vars for managed_node1
24134 1727096432.52670: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096432.52673: Calling groups_plugins_play to load vars for managed_node1
24134 1727096432.54253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096432.55973: done with get_vars()
24134 1727096432.55996: done getting variables
24134 1727096432.56064: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
24134 1727096432.56182: variable 'profile' from source: include params
24134 1727096432.56186: variable 'interface' from source: set_fact
24134 1727096432.56252: variable 'interface' from source: set_fact

TASK [Get the fingerprint comment in ifcfg-ethtest0] ***************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Monday 23 September 2024  09:00:32 -0400 (0:00:00.069)       0:00:36.776 ******
24134 1727096432.56284: entering _queue_task() for managed_node1/command
24134 1727096432.56704: worker is 1 (out of 1 available)
24134 1727096432.56717: exiting _queue_task() for managed_node1/command
24134 1727096432.56728: done queuing things up, now waiting for results queue to drain
24134 1727096432.56729: waiting for pending results...
24134 1727096432.57384: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest0
24134 1727096432.57388: in run() - task 0afff68d-5257-1673-d3fc-000000000632
24134 1727096432.57392: variable 'ansible_search_path' from source: unknown
24134 1727096432.57401: variable 'ansible_search_path' from source: unknown
24134 1727096432.57516: calling self._execute()
24134 1727096432.57729: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096432.57739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096432.57754: variable 'omit' from source: magic vars
24134 1727096432.58330: variable 'ansible_distribution_major_version' from source: facts
24134 1727096432.58349: Evaluated conditional (ansible_distribution_major_version != '6'): True
24134 1727096432.58482: variable 'profile_stat' from source: set_fact
24134 1727096432.58501: Evaluated conditional (profile_stat.stat.exists): False
24134 1727096432.58507: when evaluation is False, skipping this task
24134 1727096432.58514: _execute() done
24134 1727096432.58521: dumping result to json
24134 1727096432.58527: done dumping result, returning
24134 1727096432.58539: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 [0afff68d-5257-1673-d3fc-000000000632]
24134 1727096432.58554: sending task result for task 0afff68d-5257-1673-d3fc-000000000632
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
24134 1727096432.58745: no more pending results, returning what we have
24134 1727096432.58749: results queue empty
24134 1727096432.58749: checking for any_errors_fatal
24134 1727096432.58755: done checking for any_errors_fatal
24134 1727096432.58756: checking for max_fail_percentage
24134 1727096432.58758: done checking for max_fail_percentage
24134 1727096432.58759: checking to see if all hosts have failed and the running result is not ok
24134 1727096432.58760: done checking to see if all hosts have failed
24134 1727096432.58761: getting the remaining hosts for this loop
24134 1727096432.58762: done getting the remaining hosts for this loop
24134 1727096432.58766: getting the next task for host managed_node1
24134 1727096432.58774: done getting next task for host managed_node1
24134 1727096432.58777: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
24134 1727096432.58781: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096432.58786: getting variables
24134 1727096432.58787: in VariableManager get_vars()
24134 1727096432.58888: Calling all_inventory to load vars for managed_node1
24134 1727096432.58891: Calling groups_inventory to load vars for managed_node1
24134 1727096432.58896: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096432.58909: Calling all_plugins_play to load vars for managed_node1
24134 1727096432.58917: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096432.58921: Calling groups_plugins_play to load vars for managed_node1
24134 1727096432.59480: done sending task result for task 0afff68d-5257-1673-d3fc-000000000632
24134 1727096432.59484: WORKER PROCESS EXITING
24134 1727096432.60914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096432.63036: done with get_vars()
24134 1727096432.63065: done getting variables
24134 1727096432.63131: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
24134 1727096432.63252: variable 'profile' from source: include params
24134 1727096432.63256: variable 'interface' from source: set_fact
24134 1727096432.63327: variable 'interface' from source: set_fact

TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Monday 23 September 2024  09:00:32 -0400 (0:00:00.070)       0:00:36.847 ******
24134 1727096432.63359: entering _queue_task() for managed_node1/set_fact
24134 1727096432.63912: worker is 1 (out of 1 available)
24134 1727096432.63924: exiting _queue_task() for managed_node1/set_fact
24134 1727096432.63935: done queuing things up, now waiting for results queue to drain
24134 1727096432.63936: waiting for pending results...
24134 1727096432.64389: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0
24134 1727096432.64539: in run() - task 0afff68d-5257-1673-d3fc-000000000633
24134 1727096432.64614: variable 'ansible_search_path' from source: unknown
24134 1727096432.64624: variable 'ansible_search_path' from source: unknown
24134 1727096432.64717: calling self._execute()
24134 1727096432.64966: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096432.64987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096432.65002: variable 'omit' from source: magic vars
24134 1727096432.65712: variable 'ansible_distribution_major_version' from source: facts
24134 1727096432.65907: Evaluated conditional (ansible_distribution_major_version != '6'): True
24134 1727096432.66034: variable 'profile_stat' from source: set_fact
24134 1727096432.66055: Evaluated conditional (profile_stat.stat.exists): False
24134 1727096432.66234: when evaluation is False, skipping this task
24134 1727096432.66238: _execute() done
24134 1727096432.66240: dumping result to json
24134 1727096432.66242: done dumping result, returning
24134 1727096432.66246: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [0afff68d-5257-1673-d3fc-000000000633]
24134 1727096432.66248: sending task result for task 0afff68d-5257-1673-d3fc-000000000633
24134 1727096432.66322: done sending task result for task 0afff68d-5257-1673-d3fc-000000000633
24134 1727096432.66325: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
24134 1727096432.66393: no more pending results, returning what we have
24134 1727096432.66398: results queue empty
24134 1727096432.66399: checking for any_errors_fatal
24134 1727096432.66407: done checking for any_errors_fatal
24134 1727096432.66408: checking for max_fail_percentage
24134 1727096432.66410: done checking for max_fail_percentage
24134 1727096432.66411: checking to see if all hosts have failed and the running result is not ok
24134 1727096432.66412: done checking to see if all hosts have failed
24134 1727096432.66413: getting the remaining hosts for this loop
24134 1727096432.66414: done getting the remaining hosts for this loop
24134 1727096432.66418: getting the next task for host managed_node1
24134 1727096432.66427: done getting next task for host managed_node1
24134 1727096432.66430: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}'
24134 1727096432.66433: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096432.66439: getting variables
24134 1727096432.66441: in VariableManager get_vars()
24134 1727096432.66478: Calling all_inventory to load vars for managed_node1
24134 1727096432.66481: Calling groups_inventory to load vars for managed_node1
24134 1727096432.66485: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096432.66499: Calling all_plugins_play to load vars for managed_node1
24134 1727096432.66502: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096432.66505: Calling groups_plugins_play to load vars for managed_node1
24134 1727096432.68314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096432.72091: done with get_vars()
24134 1727096432.72115: done getting variables
24134 1727096432.72190: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
24134 1727096432.72314: variable 'profile' from source: include params
24134 1727096432.72317: variable 'interface' from source: set_fact
24134 1727096432.72384: variable 'interface' from source: set_fact

TASK [Assert that the profile is absent - 'ethtest0'] **************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5
Monday 23 September 2024  09:00:32 -0400 (0:00:00.090)       0:00:36.937 ******
24134 1727096432.72415: entering _queue_task() for managed_node1/assert
24134 1727096432.72751: worker is 1 (out of 1 available)
24134 1727096432.72763: exiting _queue_task() for managed_node1/assert
24134 1727096432.72915: done queuing things up, now waiting for results queue to drain
24134 1727096432.72918: waiting for pending results...
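The assert task queued above evaluates the `not lsr_net_profile_exists` condition recorded in the log and reports "All assertions passed". A sketch of that assertion pattern — the task name and condition come from the log, while the `fail_msg` is a hypothetical addition, not necessarily present in assert_profile_absent.yml:

```yaml
# Sketch of the assertion pattern recorded in the log; the fail_msg
# is hypothetical.
- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is unexpectedly present"
```

Because `assert` runs entirely on the controller, the connection variables set up in the log (ansible_connection, ansible_shell_type, and so on) are resolved but no remote command is executed.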
24134 1727096432.73141: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest0'
24134 1727096432.73184: in run() - task 0afff68d-5257-1673-d3fc-000000000613
24134 1727096432.73204: variable 'ansible_search_path' from source: unknown
24134 1727096432.73211: variable 'ansible_search_path' from source: unknown
24134 1727096432.73258: calling self._execute()
24134 1727096432.73366: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096432.73455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096432.73459: variable 'omit' from source: magic vars
24134 1727096432.73787: variable 'ansible_distribution_major_version' from source: facts
24134 1727096432.73813: Evaluated conditional (ansible_distribution_major_version != '6'): True
24134 1727096432.74138: variable 'omit' from source: magic vars
24134 1727096432.74142: variable 'omit' from source: magic vars
24134 1727096432.74294: variable 'profile' from source: include params
24134 1727096432.74370: variable 'interface' from source: set_fact
24134 1727096432.74494: variable 'interface' from source: set_fact
24134 1727096432.74524: variable 'omit' from source: magic vars
24134 1727096432.74877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
24134 1727096432.74881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24134 1727096432.74884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
24134 1727096432.74887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24134 1727096432.74889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24134 1727096432.74916: variable 'inventory_hostname' from source: host vars for 'managed_node1'
24134 1727096432.74925: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096432.74933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096432.75161: Set connection var ansible_shell_executable to /bin/sh
24134 1727096432.75178: Set connection var ansible_pipelining to False
24134 1727096432.75211: Set connection var ansible_module_compression to ZIP_DEFLATED
24134 1727096432.75228: Set connection var ansible_timeout to 10
24134 1727096432.75282: Set connection var ansible_connection to ssh
24134 1727096432.75291: Set connection var ansible_shell_type to sh
24134 1727096432.75324: variable 'ansible_shell_executable' from source: unknown
24134 1727096432.75372: variable 'ansible_connection' from source: unknown
24134 1727096432.75389: variable 'ansible_module_compression' from source: unknown
24134 1727096432.75425: variable 'ansible_shell_type' from source: unknown
24134 1727096432.75529: variable 'ansible_shell_executable' from source: unknown
24134 1727096432.75532: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096432.75534: variable 'ansible_pipelining' from source: unknown
24134 1727096432.75537: variable 'ansible_timeout' from source: unknown
24134 1727096432.75539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096432.75848: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
24134 1727096432.75872: variable 'omit' from source: magic vars
24134 1727096432.75885: starting attempt loop
24134 1727096432.75893: running the handler
24134 1727096432.76026: variable 'lsr_net_profile_exists' from source: set_fact
24134 1727096432.76048: Evaluated conditional (not lsr_net_profile_exists): True
24134 1727096432.76059: handler run complete
24134 1727096432.76085: attempt loop complete, returning result
24134 1727096432.76093: _execute() done
24134 1727096432.76099: dumping result to json
24134 1727096432.76107: done dumping result, returning
24134 1727096432.76119: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest0' [0afff68d-5257-1673-d3fc-000000000613]
24134 1727096432.76129: sending task result for task 0afff68d-5257-1673-d3fc-000000000613
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
24134 1727096432.76357: no more pending results, returning what we have
24134 1727096432.76361: results queue empty
24134 1727096432.76362: checking for any_errors_fatal
24134 1727096432.76371: done checking for any_errors_fatal
24134 1727096432.76372: checking for max_fail_percentage
24134 1727096432.76375: done checking for max_fail_percentage
24134 1727096432.76376: checking to see if all hosts have failed and the running result is not ok
24134 1727096432.76377: done checking to see if all hosts have failed
24134 1727096432.76378: getting the remaining hosts for this loop
24134 1727096432.76379: done getting the remaining hosts for this loop
24134 1727096432.76383: getting the next task for host managed_node1
24134 1727096432.76391: done getting next task for host managed_node1
24134 1727096432.76396: ^ task is: TASK: Include the task 'assert_device_absent.yml'
24134 1727096432.76398: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096432.76404: getting variables
24134 1727096432.76405: in VariableManager get_vars()
24134 1727096432.76438: Calling all_inventory to load vars for managed_node1
24134 1727096432.76441: Calling groups_inventory to load vars for managed_node1
24134 1727096432.76445: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096432.76458: Calling all_plugins_play to load vars for managed_node1
24134 1727096432.76461: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096432.76463: Calling groups_plugins_play to load vars for managed_node1
24134 1727096432.77200: done sending task result for task 0afff68d-5257-1673-d3fc-000000000613
24134 1727096432.77204: WORKER PROCESS EXITING
24134 1727096432.78079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096432.79632: done with get_vars()
24134 1727096432.79655: done getting variables

TASK [Include the task 'assert_device_absent.yml'] *****************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:89
Monday 23 September 2024  09:00:32 -0400 (0:00:00.073)       0:00:37.010 ******
24134 1727096432.79746: entering _queue_task() for managed_node1/include_tasks
24134 1727096432.80056: worker is 1 (out of 1 available)
24134 1727096432.80075: exiting _queue_task() for managed_node1/include_tasks
24134 1727096432.80087: done queuing things up, now waiting for results queue to drain
24134 1727096432.80089: waiting for pending results...
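The include task queued above pulls assert_device_absent.yml into the play; the log then records the file being loaded, filtered on tags, and spliced into the host's task list. A sketch of what such an include looks like at tests_ipv6_disabled.yml:89 — the task name and included file path appear in the log, but the exact relative path and any arguments are assumed:

```yaml
# Sketch of the include recorded in the log; the relative path form
# is assumed for illustration.
- name: Include the task 'assert_device_absent.yml'
  include_tasks: tasks/assert_device_absent.yml
```

Unlike `import_tasks`, `include_tasks` is processed at runtime, which is why the log shows "generating all_blocks data" and "extending task lists for all hosts with included blocks" only after the include task itself has run.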
24134 1727096432.80334: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml'
24134 1727096432.80434: in run() - task 0afff68d-5257-1673-d3fc-00000000009e
24134 1727096432.80454: variable 'ansible_search_path' from source: unknown
24134 1727096432.80496: calling self._execute()
24134 1727096432.80597: variable 'ansible_host' from source: host vars for 'managed_node1'
24134 1727096432.80611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24134 1727096432.80627: variable 'omit' from source: magic vars
24134 1727096432.80979: variable 'ansible_distribution_major_version' from source: facts
24134 1727096432.80996: Evaluated conditional (ansible_distribution_major_version != '6'): True
24134 1727096432.81006: _execute() done
24134 1727096432.81013: dumping result to json
24134 1727096432.81020: done dumping result, returning
24134 1727096432.81029: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [0afff68d-5257-1673-d3fc-00000000009e]
24134 1727096432.81039: sending task result for task 0afff68d-5257-1673-d3fc-00000000009e
24134 1727096432.81173: no more pending results, returning what we have
24134 1727096432.81177: in VariableManager get_vars()
24134 1727096432.81210: Calling all_inventory to load vars for managed_node1
24134 1727096432.81213: Calling groups_inventory to load vars for managed_node1
24134 1727096432.81216: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096432.81227: Calling all_plugins_play to load vars for managed_node1
24134 1727096432.81230: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096432.81232: Calling groups_plugins_play to load vars for managed_node1
24134 1727096432.81882: done sending task result for task 0afff68d-5257-1673-d3fc-00000000009e
24134 1727096432.81886: WORKER PROCESS EXITING
24134 1727096432.82742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096432.84353: done with get_vars()
24134 1727096432.84381: variable 'ansible_search_path' from source: unknown
24134 1727096432.84409: we have included files to process
24134 1727096432.84410: generating all_blocks data
24134 1727096432.84413: done generating all_blocks data
24134 1727096432.84419: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml
24134 1727096432.84420: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml
24134 1727096432.84423: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml
24134 1727096432.84590: in VariableManager get_vars()
24134 1727096432.84607: done with get_vars()
24134 1727096432.84721: done processing included file
24134 1727096432.84723: iterating over new_blocks loaded from include file
24134 1727096432.84724: in VariableManager get_vars()
24134 1727096432.84735: done with get_vars()
24134 1727096432.84737: filtering new block on tags
24134 1727096432.84754: done filtering new block on tags
24134 1727096432.84756: done iterating over new_blocks loaded from include file
included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1
24134 1727096432.84762: extending task lists for all hosts with included blocks
24134 1727096432.85002: done extending task lists
24134 1727096432.85004: done processing included files
24134 1727096432.85004: results queue empty
24134 1727096432.85005: checking for any_errors_fatal
24134 1727096432.85008: done checking for any_errors_fatal
24134 1727096432.85009: checking for max_fail_percentage
24134 1727096432.85010: done checking for max_fail_percentage
24134 1727096432.85011: checking to see if all hosts have failed and the running result is not ok
24134 1727096432.85012: done checking to see if all hosts have failed
24134 1727096432.85013: getting the remaining hosts for this loop
24134 1727096432.85014: done getting the remaining hosts for this loop
24134 1727096432.85017: getting the next task for host managed_node1
24134 1727096432.85020: done getting next task for host managed_node1
24134 1727096432.85023: ^ task is: TASK: Include the task 'get_interface_stat.yml'
24134 1727096432.85025: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 24134 1727096432.85027: getting variables 24134 1727096432.85028: in VariableManager get_vars() 24134 1727096432.85036: Calling all_inventory to load vars for managed_node1 24134 1727096432.85039: Calling groups_inventory to load vars for managed_node1 24134 1727096432.85041: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096432.85048: Calling all_plugins_play to load vars for managed_node1 24134 1727096432.85050: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096432.85053: Calling groups_plugins_play to load vars for managed_node1 24134 1727096432.86238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096432.87941: done with get_vars() 24134 1727096432.87962: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Monday 23 September 2024 09:00:32 -0400 (0:00:00.082) 0:00:37.093 ****** 24134 1727096432.88038: entering _queue_task() for managed_node1/include_tasks 24134 1727096432.88385: worker is 1 (out of 1 available) 24134 1727096432.88397: exiting _queue_task() for managed_node1/include_tasks 24134 1727096432.88409: done queuing things up, now waiting for results queue to drain 24134 1727096432.88410: waiting for pending results... 
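The `^ state is: HOST STATE: block=2, task=6, ... tasks child state? (HOST STATE: ...)` dumps in this log are the repr of Ansible's per-host play-iterator state, and each `include_tasks` level adds one more nested child state. A hypothetical dataclass sketch (field names copied from the dump; this is not the real `ansible.executor.play_iterator.HostState` class) makes the nesting easier to read:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    # Field names mirror the "HOST STATE:" dumps in the log.
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    run_state: int = 1
    fail_state: int = 0
    pending_setup: bool = False
    # Each include_tasks level contributes one nested child state.
    tasks_child_state: Optional["HostState"] = None

# Sketch of the dump above: an outer state at block=2/task=6 with a
# child state (task=1) created for the included assert_device_absent.yml.
outer = HostState(block=2, task=6, tasks_child_state=HostState(task=1))
```

Reading the dumps this way, the second `Get stat for interface` state (child inside a child) simply records that `get_interface_stat.yml` was included from within `assert_device_absent.yml`.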
24134 1727096432.88703: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 24134 1727096432.88827: in run() - task 0afff68d-5257-1673-d3fc-000000000664 24134 1727096432.88845: variable 'ansible_search_path' from source: unknown 24134 1727096432.88852: variable 'ansible_search_path' from source: unknown 24134 1727096432.88900: calling self._execute() 24134 1727096432.89007: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096432.89018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096432.89031: variable 'omit' from source: magic vars 24134 1727096432.89414: variable 'ansible_distribution_major_version' from source: facts 24134 1727096432.89431: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096432.89441: _execute() done 24134 1727096432.89450: dumping result to json 24134 1727096432.89457: done dumping result, returning 24134 1727096432.89466: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-1673-d3fc-000000000664] 24134 1727096432.89481: sending task result for task 0afff68d-5257-1673-d3fc-000000000664 24134 1727096432.89600: no more pending results, returning what we have 24134 1727096432.89605: in VariableManager get_vars() 24134 1727096432.89640: Calling all_inventory to load vars for managed_node1 24134 1727096432.89643: Calling groups_inventory to load vars for managed_node1 24134 1727096432.89646: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096432.89660: Calling all_plugins_play to load vars for managed_node1 24134 1727096432.89663: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096432.89665: Calling groups_plugins_play to load vars for managed_node1 24134 1727096432.90485: done sending task result for task 0afff68d-5257-1673-d3fc-000000000664 24134 1727096432.90489: WORKER PROCESS EXITING 24134 
1727096432.91389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096432.93329: done with get_vars() 24134 1727096432.93357: variable 'ansible_search_path' from source: unknown 24134 1727096432.93359: variable 'ansible_search_path' from source: unknown 24134 1727096432.93511: we have included files to process 24134 1727096432.93513: generating all_blocks data 24134 1727096432.93514: done generating all_blocks data 24134 1727096432.93515: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24134 1727096432.93516: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24134 1727096432.93519: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24134 1727096432.93734: done processing included file 24134 1727096432.93736: iterating over new_blocks loaded from include file 24134 1727096432.93738: in VariableManager get_vars() 24134 1727096432.93749: done with get_vars() 24134 1727096432.93751: filtering new block on tags 24134 1727096432.93764: done filtering new block on tags 24134 1727096432.93766: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 24134 1727096432.93986: extending task lists for all hosts with included blocks 24134 1727096432.94084: done extending task lists 24134 1727096432.94173: done processing included files 24134 1727096432.94174: results queue empty 24134 1727096432.94175: checking for any_errors_fatal 24134 1727096432.94178: done checking for any_errors_fatal 24134 1727096432.94179: checking for max_fail_percentage 24134 1727096432.94180: done checking for 
max_fail_percentage 24134 1727096432.94181: checking to see if all hosts have failed and the running result is not ok 24134 1727096432.94182: done checking to see if all hosts have failed 24134 1727096432.94182: getting the remaining hosts for this loop 24134 1727096432.94183: done getting the remaining hosts for this loop 24134 1727096432.94186: getting the next task for host managed_node1 24134 1727096432.94190: done getting next task for host managed_node1 24134 1727096432.94192: ^ task is: TASK: Get stat for interface {{ interface }} 24134 1727096432.94233: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096432.94236: getting variables 24134 1727096432.94238: in VariableManager get_vars() 24134 1727096432.94247: Calling all_inventory to load vars for managed_node1 24134 1727096432.94249: Calling groups_inventory to load vars for managed_node1 24134 1727096432.94252: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096432.94257: Calling all_plugins_play to load vars for managed_node1 24134 1727096432.94259: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096432.94261: Calling groups_plugins_play to load vars for managed_node1 24134 1727096432.95700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096432.97432: done with get_vars() 24134 1727096432.97454: done getting variables 24134 1727096432.97600: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 09:00:32 -0400 (0:00:00.095) 0:00:37.189 ****** 24134 1727096432.97629: entering _queue_task() for managed_node1/stat 24134 1727096432.97959: worker is 1 (out of 1 available) 24134 1727096432.97975: exiting _queue_task() for managed_node1/stat 24134 1727096432.97986: done queuing things up, now waiting for results queue to drain 24134 1727096432.97987: waiting for pending results... 
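The stat task that runs next reports `{"stat": {"exists": false}}` for `/sys/class/net/ethtest0`: on Linux, a network device is present exactly while the kernel exposes its sysfs entry, so a missing path is how `assert_device_absent.yml` concludes the device is gone. A minimal local equivalent of that check (plain `os.path.exists`, standing in for the `ansible.builtin.stat` module's `exists` field):

```python
import os

def interface_stat(iface, sysfs_root="/sys/class/net"):
    # Same decision the stat module result encodes: the kernel exposes
    # /sys/class/net/<iface> only while the device exists.
    path = os.path.join(sysfs_root, iface)
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}
```

For a removed test interface, `interface_stat("ethtest0")` yields `exists: False`, which is the condition the including playbook asserts on.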
24134 1727096432.98260: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 24134 1727096432.98389: in run() - task 0afff68d-5257-1673-d3fc-000000000687 24134 1727096432.98409: variable 'ansible_search_path' from source: unknown 24134 1727096432.98416: variable 'ansible_search_path' from source: unknown 24134 1727096432.98452: calling self._execute() 24134 1727096432.98552: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096432.98564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096432.98583: variable 'omit' from source: magic vars 24134 1727096432.98973: variable 'ansible_distribution_major_version' from source: facts 24134 1727096432.98991: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096432.99001: variable 'omit' from source: magic vars 24134 1727096432.99050: variable 'omit' from source: magic vars 24134 1727096432.99156: variable 'interface' from source: set_fact 24134 1727096432.99183: variable 'omit' from source: magic vars 24134 1727096432.99226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096432.99274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096432.99299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096432.99320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096432.99337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096432.99380: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096432.99389: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096432.99396: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096432.99505: Set connection var ansible_shell_executable to /bin/sh 24134 1727096432.99515: Set connection var ansible_pipelining to False 24134 1727096432.99524: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096432.99537: Set connection var ansible_timeout to 10 24134 1727096432.99543: Set connection var ansible_connection to ssh 24134 1727096432.99549: Set connection var ansible_shell_type to sh 24134 1727096432.99582: variable 'ansible_shell_executable' from source: unknown 24134 1727096432.99590: variable 'ansible_connection' from source: unknown 24134 1727096432.99596: variable 'ansible_module_compression' from source: unknown 24134 1727096432.99602: variable 'ansible_shell_type' from source: unknown 24134 1727096432.99609: variable 'ansible_shell_executable' from source: unknown 24134 1727096432.99685: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096432.99688: variable 'ansible_pipelining' from source: unknown 24134 1727096432.99691: variable 'ansible_timeout' from source: unknown 24134 1727096432.99693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096432.99830: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24134 1727096432.99849: variable 'omit' from source: magic vars 24134 1727096432.99860: starting attempt loop 24134 1727096432.99866: running the handler 24134 1727096432.99889: _low_level_execute_command(): starting 24134 1727096432.99905: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096433.00623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096433.00641: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 24134 1727096433.00656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096433.00679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096433.00696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096433.00708: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096433.00720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.00736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096433.00759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096433.00842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096433.00866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.00886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.00987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.02743: stdout chunk (state=3): >>>/root <<< 24134 1727096433.02930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.02933: stdout chunk (state=3): >>><<< 24134 1727096433.02936: stderr chunk (state=3): >>><<< 24134 1727096433.02940: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096433.02943: _low_level_execute_command(): starting 24134 1727096433.02976: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429 `" && echo ansible-tmp-1727096433.0290542-25856-84873216584429="` echo /root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429 `" ) && sleep 0' 24134 1727096433.04253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096433.04296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096433.04307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 
1727096433.04326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096433.04339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096433.04535: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096433.04574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.04609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096433.04650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.04787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.06780: stdout chunk (state=3): >>>ansible-tmp-1727096433.0290542-25856-84873216584429=/root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429 <<< 24134 1727096433.06922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.06925: stdout chunk (state=3): >>><<< 24134 1727096433.06933: stderr chunk (state=3): >>><<< 24134 1727096433.06952: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096433.0290542-25856-84873216584429=/root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096433.07001: variable 'ansible_module_compression' from source: unknown 24134 1727096433.07058: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24134 1727096433.07199: variable 'ansible_facts' from source: unknown 24134 1727096433.07383: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/AnsiballZ_stat.py 24134 1727096433.07688: Sending initial data 24134 1727096433.07692: Sent initial data (152 bytes) 24134 1727096433.09152: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096433.09155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 24134 1727096433.09158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.09160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096433.09162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096433.09164: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.09260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096433.09266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.09309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.09481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.11046: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096433.11131: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096433.11307: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpv2vfxinj /root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/AnsiballZ_stat.py <<< 24134 1727096433.11311: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/AnsiballZ_stat.py" <<< 24134 1727096433.11390: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpv2vfxinj" to remote "/root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/AnsiballZ_stat.py" <<< 24134 1727096433.13148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.13151: stderr chunk (state=3): >>><<< 24134 1727096433.13153: stdout chunk (state=3): >>><<< 24134 1727096433.13155: done transferring module to remote 24134 1727096433.13157: _low_level_execute_command(): starting 24134 1727096433.13159: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/ /root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/AnsiballZ_stat.py && sleep 0' 24134 1727096433.14120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096433.14243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096433.14284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096433.14450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.14578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.14663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.16640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.16643: stdout chunk (state=3): >>><<< 24134 1727096433.16647: stderr chunk (state=3): >>><<< 24134 1727096433.16851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096433.16855: _low_level_execute_command(): starting 24134 1727096433.16857: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/AnsiballZ_stat.py && sleep 0' 24134 1727096433.18200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.18238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096433.18251: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.18270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.18505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.34140: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24134 1727096433.35663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096433.35671: stdout chunk (state=3): >>><<< 24134 1727096433.35681: stderr chunk (state=3): >>><<< 24134 1727096433.35699: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096433.35727: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096433.35736: _low_level_execute_command(): starting 24134 1727096433.35742: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096433.0290542-25856-84873216584429/ > /dev/null 2>&1 && sleep 0' 24134 1727096433.36349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096433.36376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096433.36379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096433.36384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096433.36475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
<<< 24134 1727096433.36479: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096433.36481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.36499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096433.36511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.36520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.36609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.38557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.38561: stdout chunk (state=3): >>><<< 24134 1727096433.38675: stderr chunk (state=3): >>><<< 24134 1727096433.38679: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096433.38686: handler run complete 24134 1727096433.38688: attempt loop complete, returning result 24134 1727096433.38691: _execute() done 24134 1727096433.38693: dumping result to json 24134 1727096433.38695: done dumping result, returning 24134 1727096433.38697: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 [0afff68d-5257-1673-d3fc-000000000687] 24134 1727096433.38699: sending task result for task 0afff68d-5257-1673-d3fc-000000000687 24134 1727096433.38759: done sending task result for task 0afff68d-5257-1673-d3fc-000000000687 24134 1727096433.38762: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 24134 1727096433.38854: no more pending results, returning what we have 24134 1727096433.38858: results queue empty 24134 1727096433.38859: checking for any_errors_fatal 24134 1727096433.38861: done checking for any_errors_fatal 24134 1727096433.38862: checking for max_fail_percentage 24134 1727096433.38864: done checking for max_fail_percentage 24134 1727096433.38865: checking to see if all hosts have failed and the running result is not ok 24134 1727096433.38866: done checking to see if all hosts have failed 24134 1727096433.38866: getting the remaining hosts for this loop 24134 1727096433.38871: done getting the remaining hosts for this loop 24134 
1727096433.38875: getting the next task for host managed_node1 24134 1727096433.38885: done getting next task for host managed_node1 24134 1727096433.38888: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 24134 1727096433.38891: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096433.38898: getting variables 24134 1727096433.38900: in VariableManager get_vars() 24134 1727096433.38934: Calling all_inventory to load vars for managed_node1 24134 1727096433.38937: Calling groups_inventory to load vars for managed_node1 24134 1727096433.38941: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096433.38954: Calling all_plugins_play to load vars for managed_node1 24134 1727096433.38958: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096433.38961: Calling groups_plugins_play to load vars for managed_node1 24134 1727096433.40681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096433.42490: done with get_vars() 24134 1727096433.42513: done getting variables 24134 1727096433.42608: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24134 1727096433.42749: 
variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Monday 23 September 2024 09:00:33 -0400 (0:00:00.451) 0:00:37.641 ****** 24134 1727096433.42806: entering _queue_task() for managed_node1/assert 24134 1727096433.43165: worker is 1 (out of 1 available) 24134 1727096433.43184: exiting _queue_task() for managed_node1/assert 24134 1727096433.43204: done queuing things up, now waiting for results queue to drain 24134 1727096433.43206: waiting for pending results... 24134 1727096433.43466: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest0' 24134 1727096433.43663: in run() - task 0afff68d-5257-1673-d3fc-000000000665 24134 1727096433.43671: variable 'ansible_search_path' from source: unknown 24134 1727096433.43675: variable 'ansible_search_path' from source: unknown 24134 1727096433.43678: calling self._execute() 24134 1727096433.43780: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096433.43821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096433.43826: variable 'omit' from source: magic vars 24134 1727096433.44287: variable 'ansible_distribution_major_version' from source: facts 24134 1727096433.44291: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096433.44294: variable 'omit' from source: magic vars 24134 1727096433.44307: variable 'omit' from source: magic vars 24134 1727096433.44414: variable 'interface' from source: set_fact 24134 1727096433.44437: variable 'omit' from source: magic vars 24134 1727096433.44487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096433.44528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 24134 1727096433.44554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096433.44583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096433.44612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096433.44676: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096433.44679: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096433.44681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096433.44784: Set connection var ansible_shell_executable to /bin/sh 24134 1727096433.44796: Set connection var ansible_pipelining to False 24134 1727096433.44875: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096433.44878: Set connection var ansible_timeout to 10 24134 1727096433.44881: Set connection var ansible_connection to ssh 24134 1727096433.44883: Set connection var ansible_shell_type to sh 24134 1727096433.44885: variable 'ansible_shell_executable' from source: unknown 24134 1727096433.44888: variable 'ansible_connection' from source: unknown 24134 1727096433.44890: variable 'ansible_module_compression' from source: unknown 24134 1727096433.44892: variable 'ansible_shell_type' from source: unknown 24134 1727096433.44894: variable 'ansible_shell_executable' from source: unknown 24134 1727096433.44897: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096433.44899: variable 'ansible_pipelining' from source: unknown 24134 1727096433.44901: variable 'ansible_timeout' from source: unknown 24134 1727096433.44903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096433.45041: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096433.45057: variable 'omit' from source: magic vars 24134 1727096433.45073: starting attempt loop 24134 1727096433.45083: running the handler 24134 1727096433.45243: variable 'interface_stat' from source: set_fact 24134 1727096433.45258: Evaluated conditional (not interface_stat.stat.exists): True 24134 1727096433.45272: handler run complete 24134 1727096433.45375: attempt loop complete, returning result 24134 1727096433.45378: _execute() done 24134 1727096433.45380: dumping result to json 24134 1727096433.45382: done dumping result, returning 24134 1727096433.45384: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest0' [0afff68d-5257-1673-d3fc-000000000665] 24134 1727096433.45390: sending task result for task 0afff68d-5257-1673-d3fc-000000000665 24134 1727096433.45452: done sending task result for task 0afff68d-5257-1673-d3fc-000000000665 24134 1727096433.45455: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 24134 1727096433.45538: no more pending results, returning what we have 24134 1727096433.45541: results queue empty 24134 1727096433.45542: checking for any_errors_fatal 24134 1727096433.45550: done checking for any_errors_fatal 24134 1727096433.45551: checking for max_fail_percentage 24134 1727096433.45552: done checking for max_fail_percentage 24134 1727096433.45553: checking to see if all hosts have failed and the running result is not ok 24134 1727096433.45554: done checking to see if all hosts have failed 24134 1727096433.45555: getting the remaining hosts for this loop 24134 1727096433.45556: done getting the remaining hosts for this loop 24134 1727096433.45559: getting the next task 
for host managed_node1 24134 1727096433.45566: done getting next task for host managed_node1 24134 1727096433.45572: ^ task is: TASK: Verify network state restored to default 24134 1727096433.45574: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24134 1727096433.45578: getting variables 24134 1727096433.45580: in VariableManager get_vars() 24134 1727096433.45609: Calling all_inventory to load vars for managed_node1 24134 1727096433.45611: Calling groups_inventory to load vars for managed_node1 24134 1727096433.45615: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096433.45624: Calling all_plugins_play to load vars for managed_node1 24134 1727096433.45627: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096433.45630: Calling groups_plugins_play to load vars for managed_node1 24134 1727096433.47282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096433.48954: done with get_vars() 24134 1727096433.48980: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:91 Monday 23 September 2024 09:00:33 -0400 (0:00:00.062) 0:00:37.704 ****** 24134 1727096433.49066: entering _queue_task() for managed_node1/include_tasks 24134 1727096433.49356: worker is 1 (out of 1 available) 24134 1727096433.49474: exiting _queue_task() for managed_node1/include_tasks 24134 1727096433.49485: done queuing things up, now waiting for results queue to drain 24134 1727096433.49486: waiting for pending results... 
24134 1727096433.49687: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 24134 1727096433.49761: in run() - task 0afff68d-5257-1673-d3fc-00000000009f 24134 1727096433.49975: variable 'ansible_search_path' from source: unknown 24134 1727096433.49979: calling self._execute() 24134 1727096433.49982: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096433.49984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096433.49987: variable 'omit' from source: magic vars 24134 1727096433.50298: variable 'ansible_distribution_major_version' from source: facts 24134 1727096433.50318: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096433.50329: _execute() done 24134 1727096433.50336: dumping result to json 24134 1727096433.50344: done dumping result, returning 24134 1727096433.50353: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0afff68d-5257-1673-d3fc-00000000009f] 24134 1727096433.50361: sending task result for task 0afff68d-5257-1673-d3fc-00000000009f 24134 1727096433.50481: no more pending results, returning what we have 24134 1727096433.50486: in VariableManager get_vars() 24134 1727096433.50519: Calling all_inventory to load vars for managed_node1 24134 1727096433.50522: Calling groups_inventory to load vars for managed_node1 24134 1727096433.50525: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096433.50539: Calling all_plugins_play to load vars for managed_node1 24134 1727096433.50542: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096433.50545: Calling groups_plugins_play to load vars for managed_node1 24134 1727096433.51383: done sending task result for task 0afff68d-5257-1673-d3fc-00000000009f 24134 1727096433.51386: WORKER PROCESS EXITING 24134 1727096433.52097: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096433.53776: done with get_vars() 24134 1727096433.53798: variable 'ansible_search_path' from source: unknown 24134 1727096433.53813: we have included files to process 24134 1727096433.53815: generating all_blocks data 24134 1727096433.53816: done generating all_blocks data 24134 1727096433.53819: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24134 1727096433.53820: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24134 1727096433.53823: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24134 1727096433.54228: done processing included file 24134 1727096433.54230: iterating over new_blocks loaded from include file 24134 1727096433.54231: in VariableManager get_vars() 24134 1727096433.54243: done with get_vars() 24134 1727096433.54246: filtering new block on tags 24134 1727096433.54263: done filtering new block on tags 24134 1727096433.54266: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 24134 1727096433.54275: extending task lists for all hosts with included blocks 24134 1727096433.54566: done extending task lists 24134 1727096433.54572: done processing included files 24134 1727096433.54573: results queue empty 24134 1727096433.54573: checking for any_errors_fatal 24134 1727096433.54577: done checking for any_errors_fatal 24134 1727096433.54577: checking for max_fail_percentage 24134 1727096433.54578: done checking for max_fail_percentage 24134 1727096433.54579: checking to see if all hosts have failed and the running 
result is not ok 24134 1727096433.54580: done checking to see if all hosts have failed 24134 1727096433.54581: getting the remaining hosts for this loop 24134 1727096433.54582: done getting the remaining hosts for this loop 24134 1727096433.54585: getting the next task for host managed_node1 24134 1727096433.54588: done getting next task for host managed_node1 24134 1727096433.54590: ^ task is: TASK: Check routes and DNS 24134 1727096433.54592: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24134 1727096433.54594: getting variables 24134 1727096433.54595: in VariableManager get_vars() 24134 1727096433.54602: Calling all_inventory to load vars for managed_node1 24134 1727096433.54604: Calling groups_inventory to load vars for managed_node1 24134 1727096433.54606: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096433.54611: Calling all_plugins_play to load vars for managed_node1 24134 1727096433.54613: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096433.54616: Calling groups_plugins_play to load vars for managed_node1 24134 1727096433.55908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096433.57562: done with get_vars() 24134 1727096433.57591: done getting variables 24134 1727096433.57637: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 09:00:33 -0400 (0:00:00.085) 0:00:37.790 ****** 24134 1727096433.57666: entering _queue_task() for managed_node1/shell 24134 1727096433.58023: worker is 1 (out of 1 available) 24134 1727096433.58035: exiting _queue_task() for managed_node1/shell 24134 1727096433.58046: done queuing things up, now waiting for results queue to drain 24134 1727096433.58048: waiting for pending results... 
24134 1727096433.58311: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 24134 1727096433.58401: in run() - task 0afff68d-5257-1673-d3fc-00000000069f 24134 1727096433.58415: variable 'ansible_search_path' from source: unknown 24134 1727096433.58419: variable 'ansible_search_path' from source: unknown 24134 1727096433.58451: calling self._execute() 24134 1727096433.58546: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096433.58550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096433.58562: variable 'omit' from source: magic vars 24134 1727096433.58926: variable 'ansible_distribution_major_version' from source: facts 24134 1727096433.58936: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096433.58943: variable 'omit' from source: magic vars 24134 1727096433.58980: variable 'omit' from source: magic vars 24134 1727096433.59014: variable 'omit' from source: magic vars 24134 1727096433.59059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096433.59097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096433.59116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096433.59140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096433.59151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096433.59181: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096433.59184: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096433.59187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096433.59291: 
Set connection var ansible_shell_executable to /bin/sh 24134 1727096433.59364: Set connection var ansible_pipelining to False 24134 1727096433.59371: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096433.59374: Set connection var ansible_timeout to 10 24134 1727096433.59377: Set connection var ansible_connection to ssh 24134 1727096433.59379: Set connection var ansible_shell_type to sh 24134 1727096433.59381: variable 'ansible_shell_executable' from source: unknown 24134 1727096433.59384: variable 'ansible_connection' from source: unknown 24134 1727096433.59386: variable 'ansible_module_compression' from source: unknown 24134 1727096433.59388: variable 'ansible_shell_type' from source: unknown 24134 1727096433.59390: variable 'ansible_shell_executable' from source: unknown 24134 1727096433.59392: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096433.59394: variable 'ansible_pipelining' from source: unknown 24134 1727096433.59396: variable 'ansible_timeout' from source: unknown 24134 1727096433.59398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096433.59499: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096433.59509: variable 'omit' from source: magic vars 24134 1727096433.59514: starting attempt loop 24134 1727096433.59518: running the handler 24134 1727096433.59529: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096433.59548: 
_low_level_execute_command(): starting 24134 1727096433.59584: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096433.60259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096433.60274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096433.60283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096433.60346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.60383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096433.60454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.60459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.60542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.62250: stdout chunk (state=3): >>>/root <<< 24134 1727096433.62414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.62418: stdout chunk (state=3): >>><<< 24134 1727096433.62420: stderr chunk (state=3): >>><<< 24134 1727096433.62441: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096433.62537: _low_level_execute_command(): starting 24134 1727096433.62540: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043 `" && echo ansible-tmp-1727096433.624471-25881-89372454504043="` echo /root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043 `" ) && sleep 0' 24134 1727096433.63087: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096433.63108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096433.63136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 
1727096433.63187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.63266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.63294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.63400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.65421: stdout chunk (state=3): >>>ansible-tmp-1727096433.624471-25881-89372454504043=/root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043 <<< 24134 1727096433.65595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.65598: stdout chunk (state=3): >>><<< 24134 1727096433.65601: stderr chunk (state=3): >>><<< 24134 1727096433.65775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096433.624471-25881-89372454504043=/root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096433.65778: variable 'ansible_module_compression' from source: unknown 24134 1727096433.65781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096433.65783: variable 'ansible_facts' from source: unknown 24134 1727096433.65854: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/AnsiballZ_command.py 24134 1727096433.66025: Sending initial data 24134 1727096433.66034: Sent initial data (154 bytes) 24134 1727096433.66777: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096433.66781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096433.66824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.66944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.68546: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 24134 1727096433.68551: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096433.68619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24134 1727096433.68695: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpxser42wg /root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/AnsiballZ_command.py <<< 24134 1727096433.68698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/AnsiballZ_command.py" <<< 24134 1727096433.68760: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpxser42wg" to remote "/root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/AnsiballZ_command.py" <<< 24134 1727096433.69634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.69638: stdout chunk (state=3): >>><<< 24134 1727096433.69640: stderr chunk (state=3): >>><<< 24134 1727096433.69740: done transferring module to remote 24134 1727096433.69747: _low_level_execute_command(): starting 24134 1727096433.69754: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/ /root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/AnsiballZ_command.py && sleep 0' 24134 1727096433.70581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096433.70600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096433.70723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096433.70755: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.70814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.70886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.72961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.72965: stdout chunk (state=3): >>><<< 24134 1727096433.72970: stderr chunk (state=3): >>><<< 24134 1727096433.73173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096433.73181: _low_level_execute_command(): starting 24134 1727096433.73185: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/AnsiballZ_command.py && sleep 0' 24134 1727096433.74214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096433.74217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.74220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096433.74222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 24134 1727096433.74224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.74388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.74517: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.74600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.91244: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:ff:ac:3f:90:f5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.125/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3028sec preferred_lft 3028sec\n inet6 fe80::10ff:acff:fe3f:90f5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.125 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.125 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 09:00:33.901792", "end": "2024-09-23 09:00:33.910775", "delta": "0:00:00.008983", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096433.92929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 24134 1727096433.92956: stderr chunk (state=3): >>><<< 24134 1727096433.92962: stdout chunk (state=3): >>><<< 24134 1727096433.92987: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:ff:ac:3f:90:f5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.125/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3028sec preferred_lft 3028sec\n inet6 fe80::10ff:acff:fe3f:90f5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.125 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.125 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 09:00:33.901792", "end": "2024-09-23 09:00:33.910775", "delta": "0:00:00.008983", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP 
ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
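An aside on the payload just shown: the remote `AnsiballZ_command.py` wrapper prints its entire result as one JSON object on stdout, and the controller parses that to decide whether the task succeeded. A minimal sketch of that parse step, using an illustrative payload with the same shape as (but abbreviated from) the real one above; this is not Ansible's actual parsing code:

```python
import json

# Illustrative payload shaped like the AnsiballZ_command result in the log
# above; the real payload also carries the full "ip a" / "ip route" output.
raw = '{"changed": true, "rc": 0, "stdout": "IP", "stderr": "", "delta": "0:00:00.008983"}'

result = json.loads(raw)

# A zero return code and empty stderr is what lets the task report "ok".
ok = result["rc"] == 0 and not result["stderr"]
print(ok)  # prints True
```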
24134 1727096433.93025: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096433.93035: _low_level_execute_command(): starting 24134 1727096433.93049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096433.624471-25881-89372454504043/ > /dev/null 2>&1 && sleep 0' 24134 1727096433.93772: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096433.93861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096433.93888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096433.93919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096433.94085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096433.95959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096433.95986: stderr chunk (state=3): >>><<< 24134 1727096433.95990: stdout chunk (state=3): >>><<< 24134 1727096433.96002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096433.96009: handler run complete 24134 1727096433.96030: Evaluated conditional (False): False 24134 1727096433.96040: attempt loop complete, returning result 24134 1727096433.96043: _execute() done 24134 1727096433.96045: dumping result to json 24134 1727096433.96051: done dumping result, returning 24134 1727096433.96058: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0afff68d-5257-1673-d3fc-00000000069f] 24134 1727096433.96063: sending task result for task 0afff68d-5257-1673-d3fc-00000000069f 24134 1727096433.96160: done sending task result for task 0afff68d-5257-1673-d3fc-00000000069f 24134 1727096433.96163: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n",
    "delta": "0:00:00.008983",
    "end": "2024-09-23 09:00:33.910775",
    "rc": 0,
    "start": "2024-09-23 09:00:33.901792"
}

STDOUT:

IP
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute
       valid_lft forever preferred_lft forever
2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000
    link/ether 12:ff:ac:3f:90:f5 brd ff:ff:ff:ff:ff:ff
    altname enX0
    inet 10.31.11.125/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0
       valid_lft 3028sec preferred_lft 3028sec
    inet6 fe80::10ff:acff:fe3f:90f5/64 scope link noprefixroute
       valid_lft forever preferred_lft forever
IP ROUTE
default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.125 metric 100
10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.125 metric 100
IP -6 ROUTE
fe80::/64 dev eth0 proto kernel metric 1024 pref medium
RESOLV
# Generated by NetworkManager
search us-east-1.aws.redhat.com
nameserver 10.29.169.13
nameserver 10.29.170.12
nameserver 10.2.32.1
24134 1727096433.96230: no more pending results, returning what we have 24134 1727096433.96233: results queue empty 24134 1727096433.96234: checking for any_errors_fatal 24134 1727096433.96236: done checking for any_errors_fatal 24134 1727096433.96236: checking for max_fail_percentage 24134 1727096433.96238: done checking for max_fail_percentage 24134 1727096433.96239: checking to see if all hosts have failed and the running result is not ok 24134 1727096433.96240: done checking to see if all hosts have failed 24134 1727096433.96241: getting the remaining hosts for this loop 24134 1727096433.96242: done getting the remaining hosts for this loop 24134 1727096433.96246: getting the next task for host managed_node1 24134 1727096433.96252: done getting next task for host managed_node1 24134 1727096433.96255: ^ task is: TASK: Verify DNS and network connectivity 24134 1727096433.96258: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 24134 1727096433.96261: getting variables 24134 1727096433.96263: in VariableManager get_vars() 24134 1727096433.96303: Calling all_inventory to load vars for managed_node1 24134 1727096433.96306: Calling groups_inventory to load vars for managed_node1 24134 1727096433.96309: Calling all_plugins_inventory to load vars for managed_node1 24134 1727096433.96324: Calling all_plugins_play to load vars for managed_node1 24134 1727096433.96326: Calling groups_plugins_inventory to load vars for managed_node1 24134 1727096433.96329: Calling groups_plugins_play to load vars for managed_node1 24134 1727096433.97613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24134 1727096433.99424: done with get_vars() 24134 1727096433.99447: done getting variables 24134 1727096433.99493: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Verify DNS and network connectivity] *************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Monday 23 September 2024 09:00:33 -0400 (0:00:00.418) 0:00:38.208 ******

24134 1727096433.99519: entering _queue_task() for managed_node1/shell 24134 1727096433.99777: worker is 1 (out of 1 available) 24134 1727096433.99790: exiting _queue_task() for managed_node1/shell 24134 1727096433.99803: done queuing things up, now waiting for results queue to drain 24134 1727096433.99804: waiting for pending results...
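The "Check routes and DNS" task that just came back ok ends its script (visible verbatim in the `cmd` field above) with a fallback: cat `/etc/resolv.conf` if it exists, otherwise report its absence. The same check can be sketched in Python, pointed at a scratch directory instead of `/etc` so the demo is side-effect free; the path is illustrative, not taken from the playbook:

```python
import os
import tempfile

# Same existence check the task script performs for /etc/resolv.conf,
# but against a throwaway directory (illustrative stand-in for /etc).
with tempfile.TemporaryDirectory() as d:
    resolv = os.path.join(d, "resolv.conf")
    if os.path.isfile(resolv):
        with open(resolv) as f:
            print(f.read())
    else:
        print("NO resolv.conf")  # prints this: the scratch dir is empty
```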
24134 1727096433.99988: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 24134 1727096434.00058: in run() - task 0afff68d-5257-1673-d3fc-0000000006a0 24134 1727096434.00065: variable 'ansible_search_path' from source: unknown 24134 1727096434.00070: variable 'ansible_search_path' from source: unknown 24134 1727096434.00099: calling self._execute() 24134 1727096434.00176: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096434.00181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096434.00190: variable 'omit' from source: magic vars 24134 1727096434.00459: variable 'ansible_distribution_major_version' from source: facts 24134 1727096434.00471: Evaluated conditional (ansible_distribution_major_version != '6'): True 24134 1727096434.00561: variable 'ansible_facts' from source: unknown 24134 1727096434.01427: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 24134 1727096434.01431: variable 'omit' from source: magic vars 24134 1727096434.01472: variable 'omit' from source: magic vars 24134 1727096434.01514: variable 'omit' from source: magic vars 24134 1727096434.01543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24134 1727096434.01589: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24134 1727096434.01633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24134 1727096434.01637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096434.01725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24134 1727096434.01728: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24134 1727096434.01731: variable 
'ansible_host' from source: host vars for 'managed_node1' 24134 1727096434.01733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096434.01792: Set connection var ansible_shell_executable to /bin/sh 24134 1727096434.01796: Set connection var ansible_pipelining to False 24134 1727096434.01798: Set connection var ansible_module_compression to ZIP_DEFLATED 24134 1727096434.01801: Set connection var ansible_timeout to 10 24134 1727096434.01803: Set connection var ansible_connection to ssh 24134 1727096434.01805: Set connection var ansible_shell_type to sh 24134 1727096434.01831: variable 'ansible_shell_executable' from source: unknown 24134 1727096434.01885: variable 'ansible_connection' from source: unknown 24134 1727096434.01888: variable 'ansible_module_compression' from source: unknown 24134 1727096434.01890: variable 'ansible_shell_type' from source: unknown 24134 1727096434.01892: variable 'ansible_shell_executable' from source: unknown 24134 1727096434.01895: variable 'ansible_host' from source: host vars for 'managed_node1' 24134 1727096434.01897: variable 'ansible_pipelining' from source: unknown 24134 1727096434.01899: variable 'ansible_timeout' from source: unknown 24134 1727096434.01901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24134 1727096434.01973: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096434.01984: variable 'omit' from source: magic vars 24134 1727096434.01989: starting attempt loop 24134 1727096434.01991: running the handler 24134 1727096434.02001: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24134 1727096434.02017: _low_level_execute_command(): starting 24134 1727096434.02023: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24134 1727096434.02531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096434.02536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 24134 1727096434.02540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.02587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096434.02591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096434.02671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096434.04418: stdout chunk (state=3): >>>/root <<< 24134 1727096434.04517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096434.04545: stderr chunk 
(state=3): >>><<< 24134 1727096434.04548: stdout chunk (state=3): >>><<< 24134 1727096434.04575: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096434.04584: _low_level_execute_command(): starting 24134 1727096434.04590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669 `" && echo ansible-tmp-1727096434.045722-25912-233808478142669="` echo /root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669 `" ) && sleep 0' 24134 1727096434.05028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096434.05039: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.05042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096434.05045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.05093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096434.05102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096434.05108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096434.05172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096434.07160: stdout chunk (state=3): >>>ansible-tmp-1727096434.045722-25912-233808478142669=/root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669 <<< 24134 1727096434.07274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096434.07296: stderr chunk (state=3): >>><<< 24134 1727096434.07299: stdout chunk (state=3): >>><<< 24134 1727096434.07314: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096434.045722-25912-233808478142669=/root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096434.07342: variable 'ansible_module_compression' from source: unknown 24134 1727096434.07391: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24134y94e_2op/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24134 1727096434.07423: variable 'ansible_facts' from source: unknown 24134 1727096434.07472: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/AnsiballZ_command.py 24134 1727096434.07579: Sending initial data 24134 1727096434.07583: Sent initial data (155 bytes) 24134 1727096434.08042: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096434.08045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 24134 1727096434.08047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24134 1727096434.08050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096434.08052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.08099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096434.08106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096434.08108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096434.08174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096434.09807: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised 
server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24134 1727096434.09875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24134 1727096434.09937: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24134y94e_2op/tmpit0ginev /root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/AnsiballZ_command.py <<< 24134 1727096434.09943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/AnsiballZ_command.py" <<< 24134 1727096434.10004: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24134y94e_2op/tmpit0ginev" to remote "/root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/AnsiballZ_command.py" <<< 24134 1727096434.10007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/AnsiballZ_command.py" <<< 24134 1727096434.10636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096434.10674: stderr chunk (state=3): >>><<< 24134 1727096434.10678: stdout chunk (state=3): >>><<< 24134 1727096434.10702: done transferring module to remote 24134 1727096434.10711: _low_level_execute_command(): starting 24134 1727096434.10716: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/ /root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/AnsiballZ_command.py && sleep 0' 24134 1727096434.11135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096434.11144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096434.11171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.11178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096434.11180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.11227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096434.11231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096434.11307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096434.13172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096434.13194: stderr chunk (state=3): >>><<< 24134 1727096434.13197: stdout chunk (state=3): >>><<< 24134 1727096434.13212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096434.13215: _low_level_execute_command(): starting 24134 1727096434.13220: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/AnsiballZ_command.py && sleep 0' 24134 1727096434.13630: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096434.13638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 1727096434.13665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.13676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096434.13681: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.13723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096434.13726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096434.13807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096434.76575: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload 
Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1371 0 --:--:-- --:--:-- --:--:-- 1367\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1324 0 --:--:-- --:--:-- --:--:-- 1328", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 09:00:34.295081", "end": "2024-09-23 09:00:34.761092", "delta": "0:00:00.466011", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24134 1727096434.77984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 24134 1727096434.77998: stdout chunk (state=3): >>><<< 24134 1727096434.78021: stderr chunk (state=3): >>><<< 24134 1727096434.78050: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1371 0 --:--:-- --:--:-- --:--:-- 1367\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1324 0 --:--:-- --:--:-- --:--:-- 1328", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor 
host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 09:00:34.295081", "end": "2024-09-23 09:00:34.761092", "delta": "0:00:00.466011", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 24134 1727096434.78177: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24134 1727096434.78181: _low_level_execute_command(): starting 24134 1727096434.78184: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096434.045722-25912-233808478142669/ > /dev/null 2>&1 && sleep 0' 24134 1727096434.78783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24134 1727096434.78798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096434.78822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24134 1727096434.78843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24134 
1727096434.78860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 24134 1727096434.78879: stderr chunk (state=3): >>>debug2: match not found <<< 24134 1727096434.78895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.78913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24134 1727096434.78939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24134 1727096434.78985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24134 1727096434.79048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 24134 1727096434.79078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24134 1727096434.79104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24134 1727096434.79204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24134 1727096434.81139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24134 1727096434.81145: stderr chunk (state=3): >>><<< 24134 1727096434.81148: stdout chunk (state=3): >>><<< 24134 1727096434.81163: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24134 1727096434.81173: handler run complete 24134 1727096434.81190: Evaluated conditional (False): False 24134 1727096434.81198: attempt loop complete, returning result 24134 1727096434.81200: _execute() done 24134 1727096434.81203: dumping result to json 24134 1727096434.81209: done dumping result, returning 24134 1727096434.81216: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0afff68d-5257-1673-d3fc-0000000006a0] 24134 1727096434.81220: sending task result for task 0afff68d-5257-1673-d3fc-0000000006a0 24134 1727096434.81321: done sending task result for task 0afff68d-5257-1673-d3fc-0000000006a0 24134 1727096434.81324: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.466011", "end": "2024-09-23 09:00:34.761092", "rc": 0, "start": "2024-09-23 09:00:34.295081" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1371 0 --:--:-- --:--:-- --:--:-- 1367 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1324 0 --:--:-- --:--:-- --:--:-- 1328 24134 1727096434.81397: no more pending results, returning what we have 24134 1727096434.81400: 
results queue empty 24134 1727096434.81401: checking for any_errors_fatal 24134 1727096434.81409: done checking for any_errors_fatal 24134 1727096434.81414: checking for max_fail_percentage 24134 1727096434.81415: done checking for max_fail_percentage 24134 1727096434.81416: checking to see if all hosts have failed and the running result is not ok 24134 1727096434.81417: done checking to see if all hosts have failed 24134 1727096434.81418: getting the remaining hosts for this loop 24134 1727096434.81419: done getting the remaining hosts for this loop 24134 1727096434.81423: getting the next task for host managed_node1 24134 1727096434.81431: done getting next task for host managed_node1 24134 1727096434.81433: ^ task is: TASK: meta (flush_handlers) 24134 1727096434.81435: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
24134 1727096434.81439: getting variables
24134 1727096434.81440: in VariableManager get_vars()
24134 1727096434.81471: Calling all_inventory to load vars for managed_node1
24134 1727096434.81474: Calling groups_inventory to load vars for managed_node1
24134 1727096434.81477: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096434.81487: Calling all_plugins_play to load vars for managed_node1
24134 1727096434.81490: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096434.81497: Calling groups_plugins_play to load vars for managed_node1
24134 1727096434.86666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096434.87799: done with get_vars()
24134 1727096434.87818: done getting variables
24134 1727096434.87858: in VariableManager get_vars()
24134 1727096434.87864: Calling all_inventory to load vars for managed_node1
24134 1727096434.87866: Calling groups_inventory to load vars for managed_node1
24134 1727096434.87871: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096434.87875: Calling all_plugins_play to load vars for managed_node1
24134 1727096434.87877: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096434.87878: Calling groups_plugins_play to load vars for managed_node1
24134 1727096434.88503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096434.89448: done with get_vars()
24134 1727096434.89466: done queuing things up, now waiting for results queue to drain
24134 1727096434.89471: results queue empty
24134 1727096434.89472: checking for any_errors_fatal
24134 1727096434.89474: done checking for any_errors_fatal
24134 1727096434.89475: checking for max_fail_percentage
24134 1727096434.89475: done checking for max_fail_percentage
24134 1727096434.89476: checking to see if all hosts have failed and the running result is not ok
24134 1727096434.89476: done checking to see if all hosts have failed
24134 1727096434.89477: getting the remaining hosts for this loop
24134 1727096434.89478: done getting the remaining hosts for this loop
24134 1727096434.89480: getting the next task for host managed_node1
24134 1727096434.89482: done getting next task for host managed_node1
24134 1727096434.89483: ^ task is: TASK: meta (flush_handlers)
24134 1727096434.89484: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096434.89486: getting variables
24134 1727096434.89486: in VariableManager get_vars()
24134 1727096434.89492: Calling all_inventory to load vars for managed_node1
24134 1727096434.89493: Calling groups_inventory to load vars for managed_node1
24134 1727096434.89495: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096434.89499: Calling all_plugins_play to load vars for managed_node1
24134 1727096434.89500: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096434.89502: Calling groups_plugins_play to load vars for managed_node1
24134 1727096434.90144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096434.91011: done with get_vars()
24134 1727096434.91025: done getting variables
24134 1727096434.91060: in VariableManager get_vars()
24134 1727096434.91071: Calling all_inventory to load vars for managed_node1
24134 1727096434.91073: Calling groups_inventory to load vars for managed_node1
24134 1727096434.91074: Calling all_plugins_inventory to load vars for managed_node1
24134 1727096434.91078: Calling all_plugins_play to load vars for managed_node1
24134 1727096434.91079: Calling groups_plugins_inventory to load vars for managed_node1
24134 1727096434.91081: Calling groups_plugins_play to load vars for managed_node1
24134 1727096434.91782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24134 1727096434.92653: done with get_vars()
24134 1727096434.92674: done queuing things up, now waiting for results queue to drain
24134 1727096434.92675: results queue empty
24134 1727096434.92676: checking for any_errors_fatal
24134 1727096434.92677: done checking for any_errors_fatal
24134 1727096434.92677: checking for max_fail_percentage
24134 1727096434.92678: done checking for max_fail_percentage
24134 1727096434.92678: checking to see if all hosts have failed and the running result is not ok
24134 1727096434.92679: done checking to see if all hosts have failed
24134 1727096434.92679: getting the remaining hosts for this loop
24134 1727096434.92680: done getting the remaining hosts for this loop
24134 1727096434.92682: getting the next task for host managed_node1
24134 1727096434.92684: done getting next task for host managed_node1
24134 1727096434.92684: ^ task is: None
24134 1727096434.92685: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24134 1727096434.92686: done queuing things up, now waiting for results queue to drain
24134 1727096434.92687: results queue empty
24134 1727096434.92687: checking for any_errors_fatal
24134 1727096434.92687: done checking for any_errors_fatal
24134 1727096434.92688: checking for max_fail_percentage
24134 1727096434.92688: done checking for max_fail_percentage
24134 1727096434.92689: checking to see if all hosts have failed and the running result is not ok
24134 1727096434.92689: done checking to see if all hosts have failed
24134 1727096434.92690: getting the next task for host managed_node1
24134 1727096434.92692: done getting next task for host managed_node1
24134 1727096434.92693: ^ task is: None
24134 1727096434.92694: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1              : ok=75   changed=2    unreachable=0    failed=0    skipped=75   rescued=0    ignored=1

Monday 23 September 2024  09:00:34 -0400 (0:00:00.932)       0:00:39.140 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.10s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.88s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.88s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.71s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6
Create veth interface ethtest0 ------------------------------------------ 1.47s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Install iproute --------------------------------------------------------- 1.24s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 1.17s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.14s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.10s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 1.09s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.00s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80
Gathering Facts --------------------------------------------------------- 0.96s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 0.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Verify DNS and network connectivity ------------------------------------- 0.93s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Check if system is ostree ----------------------------------------------- 0.84s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Gather the minimum subset of ansible_facts required by the network role test --- 0.80s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.77s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.77s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.68s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
24134 1727096434.92759: RUNNING CLEANUP