[WARNING]: Collection community.general does not support Ansible version 2.16.14
[WARNING]: Could not match supplied host pattern, ignoring: ad
ansible-playbook [core 2.16.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-efo
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_dyndns.yml *****************************************************
1 plays in /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml

PLAY [Ensure that the role configures dynamic dns] *****************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20
Friday 14 November 2025 08:53:15 -0500 (0:00:00.028) 0:00:00.028 *******
ok: [managed-node2]

TASK [Setup fake realm] ********************************************************
task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:33
Friday 14 November 2025 08:53:16 -0500 (0:00:01.177) 0:00:01.206 *******
included: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2

TASK [Get role variables] ******************************************************
task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6
Friday 14 November 2025 08:53:16 -0500 (0:00:00.043) 0:00:01.250 *******

TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] ***
task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2
Friday 14 November 2025 08:53:16 -0500 (0:00:00.028) 0:00:01.278 *******
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] ****
task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10
Friday 14 November 2025 08:53:16 -0500 (0:00:00.032) 0:00:01.311 *******
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] ***
task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15
Friday 14 November 2025 08:53:17 -0500 (0:00:00.415) 0:00:01.726 *******
ok: [managed-node2] => {
"ansible_facts": { "__ad_integration_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Friday 14 November 2025 08:53:17 -0500 (0:00:00.021) 0:00:01.748 ******* skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Create a temp file for fake realm cmd] *********************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 Friday 14 November 2025 08:53:17 -0500 (0:00:00.035) 0:00:01.783 ******* changed: [managed-node2] => { "changed": true, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/lsr_0awstxsn_ad_int_realm.py", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Set realm cmd variable for remainder of test] **************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18 Friday 14 November 2025 08:53:17 -0500 (0:00:00.398) 0:00:02.182 ******* ok: [managed-node2] => { "ansible_facts": { "__ad_integration_realm_cmd": "/tmp/lsr_0awstxsn_ad_int_realm.py" }, "changed": false } TASK [Create fake realm cmd] *************************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 Friday 14 November 2025 08:53:17 -0500 (0:00:00.017) 0:00:02.199 ******* changed: [managed-node2] => { "changed": true, "checksum": "30318e4f54519605d60caa5bc62e429287b28973", "dest": "/tmp/lsr_0awstxsn_ad_int_realm.py", "gid": 0, "group": "root", "md5sum": "3ea3ed87c4442dcbe51dfff237c430ed", "mode": "0755", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 1867, "src": "/root/.ansible/tmp/ansible-tmp-1763128397.5978415-8068-259907715813747/source", "state": "file", "uid": 0 } TASK [Check if /etc/sssd exists] *********************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28 Friday 14 November 2025 08:53:18 -0500 (0:00:00.741) 0:00:02.940 ******* ok: [managed-node2] => { "changed": false, "stat": { "atime": 1716968740.483, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1716968740.245, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 993, "gr_name": "sssd", "inode": 7060576, "isblk": 
false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0700", "mtime": 1716968740.245, "nlink": 4, "path": "/etc/sssd", "pw_name": "sssd", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 31, "uid": 996, "version": "3583498373", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": true } } TASK [Install sssd-common for /etc/sssd] *************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33 Friday 14 November 2025 08:53:18 -0500 (0:00:00.320) 0:00:03.261 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __sssd_dir_stat.stat.exists", "skip_reason": "Conditional result was False" } TASK [Remove realm cmd] ******************************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44 Friday 14 November 2025 08:53:18 -0500 (0:00:00.014) 0:00:03.276 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"cleanup\"", "skip_reason": "Conditional result was False" } TASK [Remove sssd-common] ****************************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49 Friday 14 November 2025 08:53:18 -0500 (0:00:00.012) 0:00:03.288 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"cleanup\"", "skip_reason": "Conditional result was False" } TASK [Test - Run the system role with bogus vars] ****************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:39 Friday 14 November 2025 08:53:18 -0500 (0:00:00.011) 0:00:03.300 ******* TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3 Friday 14 November 2025 08:53:18 -0500 (0:00:00.056) 0:00:03.357 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not ad_integration_realm", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing timesync if timesource is set] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:8 Friday 14 November 2025 08:53:18 -0500 (0:00:00.013) 0:00:03.370 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_timesync_source is not none", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure time source is provided if managing timesync] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:15 Friday 14 November 2025 08:53:18 -0500 (0:00:00.016) 0:00:03.387 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing crypto policies if allow_rc4_crypto is set] *** task path: 
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:25 Friday 14 November 2025 08:53:18 -0500 (0:00:00.031) 0:00:03.419 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure manage_crypt_policies is set with crypto_allow_rc4] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:30 Friday 14 November 2025 08:53:18 -0500 (0:00:00.030) 0:00:03.449 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40 Friday 14 November 2025 08:53:18 -0500 (0:00:00.031) 0:00:03.481 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:49 Friday 14 November 2025 08:53:18 -0500 (0:00:00.029) 0:00:03.511 ******* included: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2 Friday 14 November 2025 08:53:18 -0500 (0:00:00.021) 0:00:03.533 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] **** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 Friday 14 November 2025 08:53:18 -0500 (0:00:00.034) 0:00:03.567 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15 Friday 14 November 2025 08:53:18 -0500 (0:00:00.017) 0:00:03.585 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Friday 14 November 2025 08:53:18 -0500 (0:00:00.016) 0:00:03.601 ******* skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] 
=> (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.ad_integration : Ensure required packages are installed] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 Friday 14 November 2025 08:53:19 -0500 (0:00:00.039) 0:00:03.641 ******* changed: [managed-node2] => { "changed": true, "rc": 0, "results": [ "Installed: dejavu-sans-mono-fonts-2.35-7.el8.noarch", "Installed: gsettings-desktop-schemas-3.32.0-6.el8.x86_64", "Installed: realmd-0.17.1-2.el8.x86_64", "Installed: PackageKit-glib-1.1.12-7.el8.x86_64", "Installed: abattis-cantarell-fonts-0.0.25-6.el8.noarch", "Installed: libproxy-0.4.15-5.2.el8.x86_64", "Installed: gdk-pixbuf2-2.36.12-5.el8.x86_64", "Installed: libstemmer-0-10.585svn.el8.x86_64", "Installed: glib-networking-2.56.1-1.1.el8.x86_64", "Installed: libmodman-2.0.1-17.el8.x86_64", "Installed: json-glib-1.4.4-1.el8.x86_64", "Installed: libappstream-glib-0.7.14-3.el8.x86_64", "Installed: fontpackages-filesystem-1.44-22.el8.noarch", "Installed: libsoup-2.62.3-5.el8.x86_64", "Installed: PackageKit-1.1.12-7.el8.x86_64", "Installed: dejavu-fonts-common-2.35-7.el8.noarch" ] } TASK [fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 Friday 14 November 2025 08:53:35 -0500 (0:00:16.571) 0:00:20.212 ******* changed: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "realmd", "name": "realmd", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system.slice sysinit.target systemd-journald.socket dbus.socket basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.freedesktop.realmd", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot 
cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Realm and Domain Configuration", "DevicePolicy": "auto", "Documentation": "man:realm(8) man:realmd.conf(5)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/realmd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "realmd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "realmd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", 
"ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.socket sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "static", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf] **** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Friday 14 November 2025 08:53:36 -0500 (0:00:00.761) 0:00:20.974 ******* Notification for handler Handler for ad_integration to restart services has been saved. 
changed: [managed-node2] => { "changed": true, "checksum": "7e0c9eddf5cee60f782f39e0f445b043ab4bcb61", "dest": "/etc/realmd.conf", "gid": 0, "group": "root", "md5sum": "59e15d6f22a95d67b152af5a634072a8", "mode": "0400", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 181, "src": "/root/.ansible/tmp/ansible-tmp-1763128416.3815036-8345-2093933641912/source", "state": "file", "uid": 0 } TASK [fedora.linux_system_roles.ad_integration : Flush handlers] *************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:75 Friday 14 November 2025 08:53:37 -0500 (0:00:00.665) 0:00:21.640 ******* NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services for managed-node2 META: triggered running handlers for managed-node2 RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:3 Friday 14 November 2025 08:53:37 -0500 (0:00:00.002) 0:00:21.643 ******* skipping: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | default(false)", "item": "realmd", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Add AD server to existing network connection for DNS] ******************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:79 Friday 14 November 2025 08:53:37 -0500 (0:00:00.035) 0:00:21.679 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [Manage timesync] ********************************************************* task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:93 Friday 14 November 2025 08:53:37 -0500 (0:00:00.030) 0:00:21.709 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [Manage crypto policies] ************************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:102 Friday 14 November 2025 08:53:37 -0500 (0:00:00.030) 0:00:21.739 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_crypto_policies | bool", "skip_reason": "Conditional result was False" } TASK [Enable crypto policy allowing RC4 encryption] **************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:114 Friday 14 November 2025 08:53:37 -0500 (0:00:00.030) 0:00:21.770 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists] ****** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:130 Friday 14 November 2025 08:53:37 -0500 (0:00:00.029) 0:00:21.800 ******* skipping: [managed-node2] => { "changed": false, "false_condition": 
"ad_integration_sssd_merge_duplicate_sections | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:135 Friday 14 November 2025 08:53:37 -0500 (0:00:00.029) 0:00:21.830 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 1] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:142 Friday 14 November 2025 08:53:37 -0500 (0:00:00.029) 0:00:21.860 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 2] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:147 Friday 14 November 2025 08:53:37 -0500 (0:00:00.038) 0:00:21.898 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153 Friday 14 November 2025 08:53:37 -0500 (0:00:00.029) 0:00:21.927 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Leave existing joined domain] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:161 Friday 14 November 2025 08:53:37 -0500 (0:00:00.031) 0:00:21.958 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm leave] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:174 Friday 14 November 2025 08:53:37 -0500 (0:00:00.030) 0:00:21.988 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Remove duplicate sections] **** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:179 Friday 14 November 2025 08:53:37 -0500 (0:00:00.029) 0:00:22.018 ******* skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Build Command - Join to a specific Domain Controller] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:191 Friday 14 November 2025 08:53:37 -0500 (0:00:00.031) 0:00:22.049 ******* skipping: [managed-node2] => { "censored": "the output 
has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:205 Friday 14 November 2025 08:53:37 -0500 (0:00:00.016) 0:00:22.066 ******* ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Show the join command for debug] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:219 Friday 14 November 2025 08:53:37 -0500 (0:00:00.038) 0:00:22.104 ******* skipping: [managed-node2] => { "false_condition": "ad_integration_join_to_dc == __ad_integration_sample_dc or ad_integration_realm == __ad_integration_sample_realm or ansible_check_mode" } TASK [fedora.linux_system_roles.ad_integration : Run realm join command] ******* task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231 Friday 14 November 2025 08:53:37 -0500 (0:00:00.012) 0:00:22.117 ******* changed: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258 Friday 14 November 2025 08:53:37 -0500 (0:00:00.456) 0:00:22.573 ******* ok: [managed-node2] => { "changed": false, "stat": { "atime": 1763128417.8795457, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "049911c7517fba993eeb39dc494de8bf33faa685", "ctime": 1763128417.8775456, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 7074497, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1763128417.8775456, "nlink": 1, "path": "/etc/sssd/sssd.conf", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 87, "uid": 0, "version": "2036297279", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263 Friday 14 November 2025 08:53:38 -0500 (0:00:00.375) 0:00:22.949 ******* ok: [managed-node2] => { "changed": false, "content": "W2RvbWFpbi9keW5kbnMtc2FtcGxlLXJlYWxtLmNvbV0KYWRfZG9tYWluID0gZHluZG5zLXNhbXBsZS1yZWFsbS5jb20KaWRfcHJvdmlkZXIgPSBhZAoK", "encoding": "base64", "source": "/etc/sssd/sssd.conf" } TASK [fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections] *** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:269 Friday 14 November 2025 08:53:38 -0500 (0:00:00.518) 0:00:23.468 ******* skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK 
[fedora.linux_system_roles.ad_integration : Configure SSSD settings] ******
task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:284
Friday 14 November 2025 08:53:38 -0500 (0:00:00.016) 0:00:23.517 *******
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates] ***
task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298
Friday 14 November 2025 08:53:38 -0500 (0:00:00.016) 0:00:23.533 *******
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_update', 'value': 'True'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "dyndns_update",
        "value": "True"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1763128418.9990766-8552-25798853612517/AnsiballZ_ini_file.py", line 107, in
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1763128418.9990766-8552-25798853612517/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1763128418.9990766-8552-25798853612517/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_4xf6v8_r/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.10.116 closed.

An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_ttl', 'value': '3600'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_ttl", "value": "3600" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128419.4563928-8552-197377821164809/AnsiballZ_ini_file.py", line 107, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128419.4563928-8552-197377821164809/AnsiballZ_ini_file.py", line 99, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128419.4563928-8552-197377821164809/AnsiballZ_ini_file.py", line 48, in invoke_module run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_mv1m38y0/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.10.116 closed. An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_iface", "value": "TESTING" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128419.7795339-8552-23010174138433/AnsiballZ_ini_file.py", line 107, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128419.7795339-8552-23010174138433/AnsiballZ_ini_file.py", line 99, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128419.7795339-8552-23010174138433/AnsiballZ_ini_file.py", line 48, in invoke_module run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_zvck31e8/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.10.116 closed. An exception occurred during task execution. 
To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_refresh_interval', 'value': '86400'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_refresh_interval", "value": "86400" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128420.0954242-8552-216766679744624/AnsiballZ_ini_file.py", line 107, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128420.0954242-8552-216766679744624/AnsiballZ_ini_file.py", line 99, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128420.0954242-8552-216766679744624/AnsiballZ_ini_file.py", line 48, in invoke_module run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_f_ctwewj/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.10.116 closed. An exception occurred during task execution. To see the full traceback, use -vvv. 
The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_update_ptr', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_update_ptr", "value": "True" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128420.4308088-8552-180843296208800/AnsiballZ_ini_file.py", line 107, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128420.4308088-8552-180843296208800/AnsiballZ_ini_file.py", line 99, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128420.4308088-8552-180843296208800/AnsiballZ_ini_file.py", line 48, in invoke_module run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_j0wx9cwm/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.10.116 closed. An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_force_tcp', 'value': 'False'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_force_tcp", "value": "False" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128420.7792463-8552-175455464375616/AnsiballZ_ini_file.py", line 107, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128420.7792463-8552-175455464375616/AnsiballZ_ini_file.py", line 99, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128420.7792463-8552-175455464375616/AnsiballZ_ini_file.py", line 48, in invoke_module run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_c4_ofhy3/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.10.116 closed. 
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128421.1039686-8552-143524778734644/AnsiballZ_ini_file.py", line 107, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128421.1039686-8552-143524778734644/AnsiballZ_ini_file.py", line 99, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128421.1039686-8552-143524778734644/AnsiballZ_ini_file.py", line 48, in invoke_module run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_jlt64r2b/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.10.116 closed. An exception occurred during task execution. To see the full traceback, use -vvv. 
The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_server", "value": "127.0.0.1" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128421.4309702-8552-52053230924385/AnsiballZ_ini_file.py", line 107, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128421.4309702-8552-52053230924385/AnsiballZ_ini_file.py", line 99, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128421.4309702-8552-52053230924385/AnsiballZ_ini_file.py", line 48, in invoke_module run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_ybions38/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.10.116 closed. An exception occurred during task execution. To see the full traceback, use -vvv. 
The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'ad_hostname', 'value': 'managed-node2.dyndns-sample-realm.com'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "ad_hostname", "value": "managed-node2.dyndns-sample-realm.com" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128421.7468839-8552-166136365557119/AnsiballZ_ini_file.py", line 107, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128421.7468839-8552-166136365557119/AnsiballZ_ini_file.py", line 99, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128421.7468839-8552-166136365557119/AnsiballZ_ini_file.py", line 48, in invoke_module run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_f4l1kedl/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.10.116 closed. TASK [Cleanup fake realm] ****************************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:144 Friday 14 November 2025 08:53:42 -0500 (0:00:03.134) 0:00:26.667 ******* included: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2 TASK [Get role variables] ****************************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6 Friday 14 November 2025 08:53:42 -0500 (0:00:00.053) 0:00:26.721 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Create a temp file for fake realm cmd] *********************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 Friday 14 November 2025 08:53:42 -0500 (0:00:00.012) 0:00:26.733 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Set realm cmd variable for remainder of test] **************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18 Friday 14 November 2025 08:53:42 -0500 (0:00:00.013) 0:00:26.747 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Create fake realm cmd] 
*************************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 Friday 14 November 2025 08:53:42 -0500 (0:00:00.011) 0:00:26.758 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Check if /etc/sssd exists] *********************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28 Friday 14 November 2025 08:53:42 -0500 (0:00:00.011) 0:00:26.770 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Install sssd-common for /etc/sssd] *************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33 Friday 14 November 2025 08:53:42 -0500 (0:00:00.010) 0:00:26.781 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Remove realm cmd] ******************************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44 Friday 14 November 2025 08:53:42 -0500 (0:00:00.011) 0:00:26.793 ******* changed: [managed-node2] => { "changed": true, "path": "/tmp/lsr_0awstxsn_ad_int_realm.py", "state": "absent" } TASK [Remove sssd-common] ****************************************************** task path: /tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49 Friday 14 November 2025 08:53:42 -0500 (0:00:00.454) 0:00:27.247 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__installed_sssd_package is changed", "skip_reason": "Conditional result was False" } PLAY RECAP ********************************************************************* managed-node2 : ok=18 changed=7 unreachable=0 failed=1 skipped=39 rescued=0 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.16.14", "end_time": "2025-11-14T13:53:42.027391+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_update", "value": "True" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:38.897270+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.16.14", "end_time": "2025-11-14T13:53:42.027430+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_ttl", "value": "3600" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:38.897270+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.16.14", "end_time": "2025-11-14T13:53:42.027442+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_iface", "value": "TESTING" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee 
stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:38.897270+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.16.14", "end_time": "2025-11-14T13:53:42.027451+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_refresh_interval", "value": "86400" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:38.897270+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.16.14", "end_time": "2025-11-14T13:53:42.027460+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_update_ptr", "value": "True" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:38.897270+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.16.14", "end_time": "2025-11-14T13:53:42.027467+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_force_tcp", "value": "False" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:38.897270+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.16.14", "end_time": "2025-11-14T13:53:42.027475+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:38.897270+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.16.14", "end_time": "2025-11-14T13:53:42.027483+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_server", "value": "127.0.0.1" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:38.897270+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.16.14", "end_time": "2025-11-14T13:53:42.027490+00:00Z", "host": "managed-node2", "loop_item": { "key": "ad_hostname", "value": "managed-node2.dyndns-sample-realm.com" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:38.897270+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Friday 14 November 2025 08:53:42 -0500 (0:00:00.021) 0:00:27.269 ******* 
===============================================================================
fedora.linux_system_roles.ad_integration : Ensure required packages are installed -- 16.57s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52
fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates --- 3.13s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298
Gathering Facts --------------------------------------------------------- 1.18s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20
fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started --- 0.76s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60
Create fake realm cmd --------------------------------------------------- 0.74s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22
fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf ---- 0.67s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67
fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join --- 0.52s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263
fedora.linux_system_roles.ad_integration : Run realm join command ------- 0.46s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231
Remove realm cmd -------------------------------------------------------- 0.45s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44
fedora.linux_system_roles.ad_integration : Check if system is ostree ---- 0.42s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10
Create a temp file for fake realm cmd ----------------------------------- 0.40s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12
fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join --- 0.38s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258
Check if /etc/sssd exists ----------------------------------------------- 0.32s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28
Test - Run the system role with bogus vars ------------------------------ 0.06s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:39
Cleanup fake realm ------------------------------------------------------ 0.05s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:144
fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections --- 0.05s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:269
Setup fake realm -------------------------------------------------------- 0.04s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:33
fedora.linux_system_roles.ad_integration : Set platform/version specific variables --- 0.04s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19
fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 1 --- 0.04s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:142
fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation --- 0.04s
/tmp/collections-efo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:205

-- Logs begin at Fri 2025-11-14 08:50:36 EST, end at Fri 2025-11-14 08:53:42 EST. --
Nov 14 08:53:15 managed-node2 sshd[7079]: Accepted publickey for root from 10.31.15.127 port 42998 ssh2: ECDSA SHA256:itAIZKIXXTK/s8CTjRn7/bIer9rkGMksD/iH4P86j3I
Nov 14 08:53:15 managed-node2 systemd[1]: Started Session 8 of user root.
-- Subject: Unit session-8.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-8.scope has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:15 managed-node2 systemd-logind[592]: New session 8 of user root.
-- Subject: A new session 8 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 8 has been created for the user root.
--
-- The leading process of the session is 7079.
Nov 14 08:53:15 managed-node2 sshd[7079]: pam_unix(sshd:session): session opened for user root by (uid=0)
Nov 14 08:53:16 managed-node2 platform-python[7224]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 14 08:53:17 managed-node2 platform-python[7376]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 14 08:53:17 managed-node2 platform-python[7499]: ansible-tempfile Invoked with prefix=lsr_ suffix=_ad_int_realm.py state=file path=None
Nov 14 08:53:17 managed-node2 platform-python[7622]: ansible-ansible.legacy.stat Invoked with path=/tmp/lsr_0awstxsn_ad_int_realm.py follow=False get_checksum=True checksum_algorithm=sha1 get_mime=True get_attributes=True
Nov 14 08:53:18 managed-node2 platform-python[7723]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1763128397.5978415-8068-259907715813747/source dest=/tmp/lsr_0awstxsn_ad_int_realm.py mode=0755 follow=False _original_basename=fake_realm.py.j2 checksum=30318e4f54519605d60caa5bc62e429287b28973 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 14 08:53:18 managed-node2 platform-python[7848]: ansible-stat Invoked with path=/etc/sssd follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 14 08:53:19 managed-node2 platform-python[7973]: ansible-ansible.legacy.dnf Invoked with name=['realmd', 'PackageKit'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 14 08:53:34 managed-node2 dbus-daemon[590]: [system] Reloaded configuration
Nov 14 08:53:34 managed-node2 dbus-daemon[590]: [system] Reloaded configuration
Nov 14 08:53:34 managed-node2 dbus-daemon[590]: [system] Reloaded configuration
Nov 14 08:53:34 managed-node2 dbus-daemon[590]: [system] Reloaded configuration
Nov 14 08:53:34 managed-node2 systemd[1]: Reloading.
Nov 14 08:53:34 managed-node2 polkitd[936]: Reloading rules
Nov 14 08:53:34 managed-node2 polkitd[936]: Collecting garbage unconditionally...
Nov 14 08:53:34 managed-node2 polkitd[936]: Loading rules from directory /etc/polkit-1/rules.d
Nov 14 08:53:34 managed-node2 polkitd[936]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 14 08:53:34 managed-node2 polkitd[936]: Finished loading, compiling and executing 3 rules
Nov 14 08:53:34 managed-node2 polkitd[936]: Reloading rules
Nov 14 08:53:34 managed-node2 polkitd[936]: Collecting garbage unconditionally...
Nov 14 08:53:34 managed-node2 polkitd[936]: Loading rules from directory /etc/polkit-1/rules.d
Nov 14 08:53:34 managed-node2 polkitd[936]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 14 08:53:34 managed-node2 polkitd[936]: Finished loading, compiling and executing 3 rules
Nov 14 08:53:34 managed-node2 dbus-daemon[590]: [system] Reloaded configuration
Nov 14 08:53:34 managed-node2 dbus-daemon[590]: [system] Reloaded configuration
Nov 14 08:53:34 managed-node2 dbus-daemon[590]: [system] Reloaded configuration
Nov 14 08:53:34 managed-node2 dbus-daemon[590]: [system] Reloaded configuration
Nov 14 08:53:34 managed-node2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
-- Subject: Unit run-rc443eeef67094b7c85c359d614e4d331.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit run-rc443eeef67094b7c85c359d614e4d331.service has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:34 managed-node2 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Nov 14 08:53:34 managed-node2 systemd[1]: Starting man-db-cache-update.service...
-- Subject: Unit man-db-cache-update.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has begun starting up.
Nov 14 08:53:34 managed-node2 systemd[1]: Reloading.
Nov 14 08:53:35 managed-node2 systemd[1]: man-db-cache-update.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit man-db-cache-update.service has successfully entered the 'dead' state.
Nov 14 08:53:35 managed-node2 systemd[1]: Started man-db-cache-update.service.
-- Subject: Unit man-db-cache-update.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:35 managed-node2 systemd[1]: run-rc443eeef67094b7c85c359d614e4d331.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit run-rc443eeef67094b7c85c359d614e4d331.service has successfully entered the 'dead' state.
Nov 14 08:53:36 managed-node2 platform-python[8583]: ansible-ansible.legacy.systemd Invoked with name=realmd state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 14 08:53:36 managed-node2 systemd[1]: Starting Realm and Domain Configuration...
-- Subject: Unit realmd.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit realmd.service has begun starting up.
Nov 14 08:53:36 managed-node2 realmd[8591]: Loaded settings from: /usr/lib/realmd/realmd-defaults.conf /usr/lib/realmd/realmd-distro.conf
Nov 14 08:53:36 managed-node2 realmd[8591]: holding daemon: startup
Nov 14 08:53:36 managed-node2 realmd[8591]: starting service
Nov 14 08:53:36 managed-node2 realmd[8591]: connected to bus
Nov 14 08:53:36 managed-node2 realmd[8591]: released daemon: startup
Nov 14 08:53:36 managed-node2 realmd[8591]: claimed name on bus: org.freedesktop.realmd
Nov 14 08:53:36 managed-node2 systemd[1]: Started Realm and Domain Configuration.
-- Subject: Unit realmd.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit realmd.service has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:36 managed-node2 platform-python[8716]: ansible-ansible.legacy.stat Invoked with path=/etc/realmd.conf follow=False get_checksum=True checksum_algorithm=sha1 get_mime=True get_attributes=True
Nov 14 08:53:36 managed-node2 platform-python[8815]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1763128416.3815036-8345-2093933641912/source dest=/etc/realmd.conf backup=True mode=0400 follow=False _original_basename=realmd.conf.j2 checksum=7e0c9eddf5cee60f782f39e0f445b043ab4bcb61 force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 14 08:53:38 managed-node2 platform-python[9064]: ansible-stat Invoked with path=/etc/sssd/sssd.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 14 08:53:38 managed-node2 platform-python[9189]: ansible-slurp Invoked with path=/etc/sssd/sssd.conf src=/etc/sssd/sssd.conf
Nov 14 08:53:42 managed-node2 platform-python[10419]: ansible-file Invoked with path=/tmp/lsr_0awstxsn_ad_int_realm.py state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 14 08:53:42 managed-node2 sshd[10440]: Accepted publickey for root from 10.31.15.127 port 54716 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 14 08:53:42 managed-node2 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:42 managed-node2 systemd-logind[592]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 10440.
Nov 14 08:53:42 managed-node2 sshd[10440]: pam_unix(sshd:session): session opened for user root by (uid=0)
Nov 14 08:53:42 managed-node2 sshd[10443]: Received disconnect from 10.31.15.127 port 54716:11: disconnected by user
Nov 14 08:53:42 managed-node2 sshd[10443]: Disconnected from user root 10.31.15.127 port 54716
Nov 14 08:53:42 managed-node2 sshd[10440]: pam_unix(sshd:session): session closed for user root
Nov 14 08:53:42 managed-node2 systemd[1]: session-9.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-9.scope has successfully entered the 'dead' state.
Nov 14 08:53:42 managed-node2 systemd-logind[592]: Session 9 logged out. Waiting for processes to exit.
Nov 14 08:53:42 managed-node2 systemd-logind[592]: Removed session 9.
-- Subject: Session 9 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 9 has been terminated.
Nov 14 08:53:42 managed-node2 sshd[10464]: Accepted publickey for root from 10.31.15.127 port 54718 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 14 08:53:42 managed-node2 systemd[1]: Started Session 10 of user root.
-- Subject: Unit session-10.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-10.scope has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:42 managed-node2 systemd-logind[592]: New session 10 of user root.
-- Subject: A new session 10 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 10 has been created for the user root.
--
-- The leading process of the session is 10464.
Nov 14 08:53:42 managed-node2 sshd[10464]: pam_unix(sshd:session): session opened for user root by (uid=0)
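
Note on the failure recorded above: all nine SYSTEM ROLES ERRORS entries come from the same looped task, "Configure dynamic DNS updates" (roles/ad_integration/tasks/main.yml:298), which applies the test's dyndns_* key/value pairs to /etc/sssd/sssd.conf (the file the role stats and slurps in the journal excerpt). A minimal standalone sketch of an equivalent loop follows; it assumes community.general.ini_file is the underlying module and uses a hypothetical "domain/dyndns-sample-realm.com" section name derived from the ad_hostname value, so it illustrates the shape of the operation rather than the role's exact task.

    # Hedged sketch only: module choice, section name, and variable name
    # (dyndns_opts) are assumptions, not taken from the role source.
    - name: Configure dynamic DNS updates (standalone approximation)
      community.general.ini_file:
        path: /etc/sssd/sssd.conf          # file the role reads back via stat/slurp
        section: "domain/dyndns-sample-realm.com"  # hypothetical AD domain section
        option: "{{ item.key }}"            # e.g. dyndns_update, dyndns_ttl, ...
        value: "{{ item.value }}"           # test values seen in the errors block
        mode: "0600"
      loop: "{{ dyndns_opts | dict2items }}"
      vars:
        dyndns_opts:
          dyndns_update: "True"
          dyndns_ttl: "3600"
          dyndns_iface: "TESTING"
          dyndns_server: "127.0.0.1"

Each loop item in the errors block corresponds to one such option write; the per-item "MODULE FAILURE\nSee stdout/stderr for the exact error" message means the module crashed before producing a structured result, so the actual cause has to be read from the raw stdout/stderr rather than from this summary.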