ansible-playbook [core 2.17.8]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-WJe
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.9 (main, Feb  4 2025, 00:00:00) [GCC 14.2.1 20250110 (Red Hat 14.2.1-7)] (/usr/bin/python3.12)
  jinja version = 3.1.5
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_quadlet_demo.yml ***********************************************
2 plays in /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml
PLAY [all] *********************************************************************
TASK [Include vault variables] *************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:5
Saturday 15 February 2025  11:42:02 -0500 (0:00:00.010)       0:00:00.010 ***** 
[WARNING]: Found variable using reserved name: q
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_test_password": {
            "__ansible_vault": "$ANSIBLE_VAULT;1.1;AES256\n35383939616163653333633431363463313831383037386236646138333162396161356130303461\n3932623930643263313563336163316337643562333936360a363538636631313039343233383732\n38666530383538656639363465313230343533386130303833336434303438333161656262346562\n3362626538613031640a663330613638366132356534363534353239616666653466353961323533\n6565\n"
        },
        "mysql_container_root_password": {
            "__ansible_vault": "$ANSIBLE_VAULT;1.1;AES256\n61333932373230333539663035366431326163363166363036323963623131363530326231303634\n6635326161643165363366323062333334363730376631660a393566366139353861656364656661\n38653463363837336639363032646433666361646535366137303464623261313663643336306465\n6264663730656337310a343962353137386238383064646533366433333437303566656433386233\n34343235326665646661623131643335313236313131353661386338343366316261643634653633\n3832313034366536616531323963333234326461353130303532\n"
        }
    },
    "ansible_included_var_files": [
        "/tmp/podman-qfn/tests/vars/vault-variables.yml"
    ],
    "changed": false
}
PLAY [Deploy the quadlet demo app] *********************************************
TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:9
Saturday 15 February 2025  11:42:02 -0500 (0:00:00.032)       0:00:00.043 ***** 
[WARNING]: Platform linux on host managed-node1 is using the discovered Python
interpreter at /usr/bin/python3.12, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-
core/2.17/reference_appendices/interpreter_discovery.html for more information.
ok: [managed-node1]
TASK [Test is only supported on x86_64] ****************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:38
Saturday 15 February 2025  11:42:03 -0500 (0:00:01.522)       0:00:01.565 ***** 
skipping: [managed-node1] => {
    "false_condition": "ansible_facts[\"architecture\"] != \"x86_64\""
}
TASK [End test] ****************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:45
Saturday 15 February 2025  11:42:03 -0500 (0:00:00.027)       0:00:01.593 ***** 
META: end_play conditional evaluated to False, continuing play
skipping: [managed-node1] => {
    "skip_reason": "end_play conditional evaluated to False, continuing play"
}
MSG:
end_play
TASK [Generate certificates] ***************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:51
Saturday 15 February 2025  11:42:03 -0500 (0:00:00.014)       0:00:01.608 ***** 
included: fedora.linux_system_roles.certificate for managed-node1
TASK [fedora.linux_system_roles.certificate : Set version specific variables] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:2
Saturday 15 February 2025  11:42:03 -0500 (0:00:00.066)       0:00:01.675 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml for managed-node1
TASK [fedora.linux_system_roles.certificate : Ensure ansible_facts used by role] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:2
Saturday 15 February 2025  11:42:03 -0500 (0:00:00.046)       0:00:01.722 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__certificate_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.certificate : Check if system is ostree] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:10
Saturday 15 February 2025  11:42:03 -0500 (0:00:00.078)       0:00:01.800 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
TASK [fedora.linux_system_roles.certificate : Set flag to indicate system is ostree] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:15
Saturday 15 February 2025  11:42:04 -0500 (0:00:00.557)       0:00:02.357 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__certificate_is_ostree": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.certificate : Set platform/version specific variables] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:19
Saturday 15 February 2025  11:42:04 -0500 (0:00:00.050)       0:00:02.408 ***** 
skipping: [managed-node1] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node1] => (item=CentOS_10.yml) => {
    "ansible_facts": {
        "__certificate_certmonger_packages": [
            "certmonger",
            "python3-packaging"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/vars/CentOS_10.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_10.yml"
}
ok: [managed-node1] => (item=CentOS_10.yml) => {
    "ansible_facts": {
        "__certificate_certmonger_packages": [
            "certmonger",
            "python3-packaging"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/vars/CentOS_10.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_10.yml"
}
TASK [fedora.linux_system_roles.certificate : Ensure certificate role dependencies are installed] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:5
Saturday 15 February 2025  11:42:04 -0500 (0:00:00.083)       0:00:02.491 ***** 
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG:
Nothing to do
lsrpackages: python3-cryptography python3-dbus python3-pyasn1
TASK [fedora.linux_system_roles.certificate : Ensure provider packages are installed] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:23
Saturday 15 February 2025  11:42:05 -0500 (0:00:01.161)       0:00:03.653 ***** 
ok: [managed-node1] => (item=certmonger) => {
    "__certificate_provider": "certmonger",
    "ansible_loop_var": "__certificate_provider",
    "changed": false,
    "rc": 0,
    "results": []
}
MSG:
Nothing to do
lsrpackages: certmonger python3-packaging
TASK [fedora.linux_system_roles.certificate : Ensure pre-scripts hooks directory exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:35
Saturday 15 February 2025  11:42:06 -0500 (0:00:00.942)       0:00:04.595 ***** 
ok: [managed-node1] => (item=certmonger) => {
    "__certificate_provider": "certmonger",
    "ansible_loop_var": "__certificate_provider",
    "changed": false,
    "gid": 0,
    "group": "root",
    "mode": "0700",
    "owner": "root",
    "path": "/etc/certmonger//pre-scripts",
    "secontext": "unconfined_u:object_r:etc_t:s0",
    "size": 6,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.certificate : Ensure post-scripts hooks directory exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:61
Saturday 15 February 2025  11:42:07 -0500 (0:00:00.539)       0:00:05.135 ***** 
ok: [managed-node1] => (item=certmonger) => {
    "__certificate_provider": "certmonger",
    "ansible_loop_var": "__certificate_provider",
    "changed": false,
    "gid": 0,
    "group": "root",
    "mode": "0700",
    "owner": "root",
    "path": "/etc/certmonger//post-scripts",
    "secontext": "unconfined_u:object_r:etc_t:s0",
    "size": 6,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.certificate : Ensure provider service is running] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:90
Saturday 15 February 2025  11:42:07 -0500 (0:00:00.446)       0:00:05.581 ***** 
ok: [managed-node1] => (item=certmonger) => {
    "__certificate_provider": "certmonger",
    "ansible_loop_var": "__certificate_provider",
    "changed": false,
    "enabled": true,
    "name": "certmonger",
    "state": "started",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:certmonger_unit_file_t:s0",
        "ActiveEnterTimestamp": "Sat 2025-02-15 11:37:44 EST",
        "ActiveEnterTimestampMonotonic": "660194362",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "active",
        "After": "dbus-broker.service network.target syslog.target dbus.socket basic.target systemd-journald.socket sysinit.target system.slice",
        "AllowIsolate": "no",
        "AssertResult": "yes",
        "AssertTimestamp": "Sat 2025-02-15 11:37:44 EST",
        "AssertTimestampMonotonic": "660171447",
        "Before": "multi-user.target shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "BusName": "org.fedorahosted.certmonger",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "430923000",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Sat 2025-02-15 11:37:44 EST",
        "ConditionTimestampMonotonic": "660171443",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "shutdown.target",
        "ControlGroup": "/system.slice/certmonger.service",
        "ControlGroupId": "5218",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "Certificate monitoring and PKI enrollment",
        "DevicePolicy": "auto",
        "DynamicUser": "no",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveTasksMax": "22347",
        "EnvironmentFiles": "/etc/sysconfig/certmonger (ignore_errors=yes)",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestamp": "Sat 2025-02-15 11:37:44 EST",
        "ExecMainHandoffTimestampMonotonic": "660181510",
        "ExecMainPID": "10295",
        "ExecMainStartTimestamp": "Sat 2025-02-15 11:37:44 EST",
        "ExecMainStartTimestampMonotonic": "660172515",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/sbin/certmonger ; argv[]=/usr/sbin/certmonger -S -p /run/certmonger.pid -n $OPTS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/sbin/certmonger ; argv[]=/usr/sbin/certmonger -S -p /run/certmonger.pid -n $OPTS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/usr/lib/systemd/system/certmonger.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "certmonger.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestamp": "Sat 2025-02-15 11:37:44 EST",
        "InactiveExitTimestampMonotonic": "660174493",
        "InvocationID": "90916c5404ef41ae9174d9ecd3708c6d",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "control-group",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "10295",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "3175354368",
        "MemoryCurrent": "2011136",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "13582336",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "0",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "0",
        "MemoryZSwapCurrent": "0",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "certmonger.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "PIDFile": "/run/certmonger.pid",
        "PartOf": "dbus-broker.service",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "no",
        "ProtectControlGroups": "no",
        "ProtectControlGroupsEx": "no",
        "ProtectHome": "no",
        "ProtectHostname": "no",
        "ProtectKernelLogs": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "sysinit.target dbus.socket system.slice",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestamp": "Sat 2025-02-15 11:39:59 EST",
        "StateChangeTimestampMonotonic": "794473980",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "running",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "1",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "dbus",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "enabled",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "0"
    }
}
TASK [fedora.linux_system_roles.certificate : Ensure certificate requests] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:101
Saturday 15 February 2025  11:42:08 -0500 (0:00:00.935)       0:00:06.517 ***** 
changed: [managed-node1] => (item={'name': 'quadlet_demo', 'dns': ['localhost'], 'ca': 'self-sign'}) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": {
        "ca": "self-sign",
        "dns": [
            "localhost"
        ],
        "name": "quadlet_demo"
    }
}
MSG:
Certificate requested (new).
TASK [fedora.linux_system_roles.certificate : Slurp the contents of the files] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:152
Saturday 15 February 2025  11:42:09 -0500 (0:00:01.087)       0:00:07.604 ***** 
ok: [managed-node1] => (item=['cert', {'name': 'quadlet_demo', 'dns': ['localhost'], 'ca': 'self-sign'}]) => {
    "ansible_loop_var": "item",
    "changed": false,
    "content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURnekNDQW11Z0F3SUJBZ0lSQUlYMUkxUll4VU85bHBhK3gzd3JXSlV3RFFZSktvWklodmNOQVFFTEJRQXcKVURFZ01CNEdBMVVFQXd3WFRHOWpZV3dnVTJsbmJtbHVaeUJCZFhSb2IzSnBkSGt4TERBcUJnTlZCQU1NSXpnMQpaalV5TXpVMExUVTRZelUwTTJKa0xUazJPVFppWldNM0xUZGpNbUkxT0Rrek1CNFhEVEkxTURJeE5URTJOREl3Ck9Wb1hEVEkyTURJeE5URTJNemMwTlZvd0ZERVNNQkFHQTFVRUF4TUpiRzlqWVd4b2IzTjBNSUlCSWpBTkJna3EKaGtpRzl3MEJBUUVGQUFPQ0FROEFNSUlCQ2dLQ0FRRUF3c0g4eGlCUEdtYjlJclpLbWtRNTd5N1NIcW5XeWFEdApldXY5QjFUOVpuNU1wSGFXQU0vNEpkSUN1b095YUQ5MGRXMDFYTnRyYlRNZjFZdVFnUk9rbGFPalR5RDM3Q1ozCnhuSGZ3eHNkMUptMmVaUXRqQ0hDdG5ZSVBoQm8rc0xLUEVVaklzUU9Ba01FWUlJSllwYXIvT3lzcUh3UmFNZGkKbHZKZEJqL2dMVjFMVzFQa1ZwSTdPekdkcnJPaTY2ejZRNklPYlo5YUNwRE1zY2lBbkVsMkRCS1VWa1ozajFNMApVczdGTjBZUVRHL2tubFB0a253MUFjSzFqMWpvMTVJNC9Lb0JLQXljZUIxb0FPRXRIeHZldHc0T0FWcmRmSWNsClhudU5CR3h2WlVqREZRSjFrWThMSkdVV3J4YUh3dHByQTFWa0puTXFweFc1VDUrQmgxQUZ2d0lEQVFBQm80R1QKTUlHUU1Bc0dBMVVkRHdRRUF3SUZvREFVQmdOVkhSRUVEVEFMZ2dsc2IyTmhiR2h2YzNRd0hRWURWUjBsQkJZdwpGQVlJS3dZQkJRVUhBd0VHQ0NzR0FRVUZCd01DTUF3R0ExVWRFd0VCL3dRQ01BQXdIUVlEVlIwT0JCWUVGSGpIClM1Qzk3c3dDbnplbHptMUkydlI4cDcrMU1COEdBMVVkSXdRWU1CYUFGUDNGQWc2Ynl3eVJTK0lKb1RTYVl2TzQKTjZxNE1BMEdDU3FHU0liM0RRRUJDd1VBQTRJQkFRQ01KV2ZROHlqREdlc0JQQTA4MmRMNldZMGdaMVBFSDVtVgpRbjNFK3FyOVRFL3l2KzNRMVRsVU1kOVQ1RFBvbUJqUUxqNkc3RmhIRFNkSitNTTJPMkNteit6TUZVb0RFcHl5CjgrbFpPYVZsSmd6TGxrOTFZYkk3WE5lcDNaTXRjY25SNDJuOEl4MFErZkwvZ1c3N09sMDcrK3Z1VjR5cmQvdE8KQVlIOG1SOEZYYXZJYndLcFZRM0ZDWE5PQzJEK1RFUnRCQk90QzlhZDdzRWl2ZERwMVdxWVZaakZ6RWhGYUNvVwpCdjJUOUJuZUlSWDNvYW5VUEZjWCtoSjlDc0dsc3VpMHR0ZHRsdWFiVFRrNmhieTJLNTZSRjAxdHZiMVkxY0NmClFhVkwxc2FnTVVxRGFzNEMzeDFtRVhoMjlBS09IWXZybEZjZjdlYlFNelFqT0JqVGFTMmkKLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
    "encoding": "base64",
    "item": [
        "cert",
        {
            "ca": "self-sign",
            "dns": [
                "localhost"
            ],
            "name": "quadlet_demo"
        }
    ],
    "source": "/etc/pki/tls/certs/quadlet_demo.crt"
}
ok: [managed-node1] => (item=['key', {'name': 'quadlet_demo', 'dns': ['localhost'], 'ca': 'self-sign'}]) => {
    "ansible_loop_var": "item",
    "changed": false,
    "content": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV2Z0lCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktnd2dnU2tBZ0VBQW9JQkFRREN3ZnpHSUU4YVp2MGkKdGtxYVJEbnZMdEllcWRiSm9PMTY2LzBIVlAxbWZreWtkcFlBei9nbDBnSzZnN0pvUDNSMWJUVmMyMnR0TXgvVgppNUNCRTZTVm82TlBJUGZzSm5mR2NkL0RHeDNVbWJaNWxDMk1JY0syZGdnK0VHajZ3c284UlNNaXhBNENRd1JnCmdnbGlscXY4N0t5b2ZCRm94MktXOGwwR1ArQXRYVXRiVStSV2tqczdNWjJ1czZMcnJQcERvZzV0bjFvS2tNeXgKeUlDY1NYWU1FcFJXUm5lUFV6UlN6c1UzUmhCTWIrU2VVKzJTZkRVQndyV1BXT2pYa2pqOHFnRW9ESng0SFdnQQo0UzBmRzk2M0RnNEJXdDE4aHlWZWU0MEViRzlsU01NVkFuV1Jqd3NrWlJhdkZvZkMybXNEVldRbWN5cW5GYmxQCm40R0hVQVcvQWdNQkFBRUNnZ0VBWHI2VVE2WXdGVDRJNHp3Zkt0RUtCZ3VXK0lmVEQ1K1VKL3BweTRsVEJPdG8KU1pIUEEzSW8yKzFBbW8rNjJQQVZyWktGSlRreEY1eUpYZzlaS3hIdFBVYXdDYXlXTDNHL1R2RFMxRTE2dzVSTApxdldrK3NqcHVtM05NTFEyL2RhSkg2emFuTzBYaTQ0aXNQaFJySDUwQ01UeWNieDVrNmw3a3NjdzdmdGhDVy9DCkR3ZjFVcStpRHZieXI5bXpOU3dNVERPQ2lrZ3lLdEdmOEYvL1lqSUFnd0VyZlFFRDM3T3Z5cjM4bkh6UjNkbHcKcFlPY2w5d1I3alJYR1RKT0ZLMGxRTUZvNlUxTmg5c2tPazluQysvZG4xb0tiT01tREdWSlA0clF6anRQY0RmZwpSeEFhUUVYbVlLVmQyWlpTMDJPMlBTRENuV3kyV201OUMvb2x1RXVhRVFLQmdRRGo2SGZhSDZXM09aK3FVK0JzCnh3WVJvVGF3Y2lDa0M1bTBzL0kxNFdkZTlaSXp5UVdxektiL05yUXlqK1ZScGd6ZFJFcDhvcFZ2dE1zTitnWksKamtrNVBVZ0RSWTMrejhiUGpLazNDZlpvajE1aE9oK05kQ1NsOWk5NEVzYUgwRjYzQXNUcDkwWUp4VzFjODVYbwp2TE9pdkhHNmZaSjRQc25maXVNUkZQWkpZd0tCZ1FEYXczb0hFK3hWUU1GTHhUcjJMQndFVFcrQkkwQnQvb3loCmpnaUc3em1ma1M0dW4weTFiOHRxSWtXMmp2VFhDZHo2ZXdsWURMQzNrVnlITmkwNDRXZCsvRTVldTIvVnJjc0MKRkhIRGVDREdtaGhGT0pDazBDSjQ4ZWg2MExHUTlrbHJPd3M0bzBMVldTZ0MrSXJJaWhiM2pHcnVmTzNWeXZENQp5VzZOOGFRdTlRS0JnUURWeWdzcnhRWXBzMWRRYXZBYWVqUDZNN1ZlblNCN1locGtoV3lQR0l0a2E0NGpIODQrCkkwNEFHQ3UvUm5hQ09mYlZWU3RlY1JFblFYdzg5eTRSb0JtV0pTTTVWblRWODhoMmgyZHdwdHJMeUxsckVSL3EKREdvWWVMQ2N2VmdZck9FOFljK2YvZ0pvUmhmSjJ3SGoyYWZVMzFyNzZyR01OTElnRFdQbGNUbDdLd0tCZ1FEYQpkMnhkdFIvRzQ2d2xwd21DM2ZuOHNmZ3cxcXQ1TFVQejhvdkJkUlh1ckdXYVdhdVpaZE9XWWpPTVpLZE9PS05kCjRaRXJSVUVWa1N6K09yUmVvWHRoN1VXMmhTamlFM1V3MW9ubDB3bW9UQ3NBdGNSOVV2MmFHcnM0SzBHb3Fzd1gKTWhlNnNCQUdUOHNOYkhmc1hUc2liREE5aXJWWlZCbmkyYTUyUW1MY0lRS0JnR3RrdW9iVUJDTFNpTlVNamNyWAorekRDVjN3YTRCaE1wT0xSQ1Yzd3d3N2c5Sms2U2xpd2J5R295TU52S1l0M3F0S0wxdCtaWk0ybGF5YTZpZjBQCnNNRktuTU95YTFzZVFCMkFmSVhlTU0rL3k5ZGl1cFl2WjFVRE1SMFQ0QVhLY2I3UVNGNW1wMTdaSkNZM3JxM08KUU53QUE1dDVDSnEyTkZ0NytYT3VLVVZoCi0tLS0tRU5EIFBSSVZBVEUgS0VZLS0tLS0K",
    "encoding": "base64",
    "item": [
        "key",
        {
            "ca": "self-sign",
            "dns": [
                "localhost"
            ],
            "name": "quadlet_demo"
        }
    ],
    "source": "/etc/pki/tls/private/quadlet_demo.key"
}
ok: [managed-node1] => (item=['ca', {'name': 'quadlet_demo', 'dns': ['localhost'], 'ca': 'self-sign'}]) => {
    "ansible_loop_var": "item",
    "changed": false,
    "content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURnekNDQW11Z0F3SUJBZ0lSQUlYMUkxUll4VU85bHBhK3gzd3JXSlV3RFFZSktvWklodmNOQVFFTEJRQXcKVURFZ01CNEdBMVVFQXd3WFRHOWpZV3dnVTJsbmJtbHVaeUJCZFhSb2IzSnBkSGt4TERBcUJnTlZCQU1NSXpnMQpaalV5TXpVMExUVTRZelUwTTJKa0xUazJPVFppWldNM0xUZGpNbUkxT0Rrek1CNFhEVEkxTURJeE5URTJOREl3Ck9Wb1hEVEkyTURJeE5URTJNemMwTlZvd0ZERVNNQkFHQTFVRUF4TUpiRzlqWVd4b2IzTjBNSUlCSWpBTkJna3EKaGtpRzl3MEJBUUVGQUFPQ0FROEFNSUlCQ2dLQ0FRRUF3c0g4eGlCUEdtYjlJclpLbWtRNTd5N1NIcW5XeWFEdApldXY5QjFUOVpuNU1wSGFXQU0vNEpkSUN1b095YUQ5MGRXMDFYTnRyYlRNZjFZdVFnUk9rbGFPalR5RDM3Q1ozCnhuSGZ3eHNkMUptMmVaUXRqQ0hDdG5ZSVBoQm8rc0xLUEVVaklzUU9Ba01FWUlJSllwYXIvT3lzcUh3UmFNZGkKbHZKZEJqL2dMVjFMVzFQa1ZwSTdPekdkcnJPaTY2ejZRNklPYlo5YUNwRE1zY2lBbkVsMkRCS1VWa1ozajFNMApVczdGTjBZUVRHL2tubFB0a253MUFjSzFqMWpvMTVJNC9Lb0JLQXljZUIxb0FPRXRIeHZldHc0T0FWcmRmSWNsClhudU5CR3h2WlVqREZRSjFrWThMSkdVV3J4YUh3dHByQTFWa0puTXFweFc1VDUrQmgxQUZ2d0lEQVFBQm80R1QKTUlHUU1Bc0dBMVVkRHdRRUF3SUZvREFVQmdOVkhSRUVEVEFMZ2dsc2IyTmhiR2h2YzNRd0hRWURWUjBsQkJZdwpGQVlJS3dZQkJRVUhBd0VHQ0NzR0FRVUZCd01DTUF3R0ExVWRFd0VCL3dRQ01BQXdIUVlEVlIwT0JCWUVGSGpIClM1Qzk3c3dDbnplbHptMUkydlI4cDcrMU1COEdBMVVkSXdRWU1CYUFGUDNGQWc2Ynl3eVJTK0lKb1RTYVl2TzQKTjZxNE1BMEdDU3FHU0liM0RRRUJDd1VBQTRJQkFRQ01KV2ZROHlqREdlc0JQQTA4MmRMNldZMGdaMVBFSDVtVgpRbjNFK3FyOVRFL3l2KzNRMVRsVU1kOVQ1RFBvbUJqUUxqNkc3RmhIRFNkSitNTTJPMkNteit6TUZVb0RFcHl5CjgrbFpPYVZsSmd6TGxrOTFZYkk3WE5lcDNaTXRjY25SNDJuOEl4MFErZkwvZ1c3N09sMDcrK3Z1VjR5cmQvdE8KQVlIOG1SOEZYYXZJYndLcFZRM0ZDWE5PQzJEK1RFUnRCQk90QzlhZDdzRWl2ZERwMVdxWVZaakZ6RWhGYUNvVwpCdjJUOUJuZUlSWDNvYW5VUEZjWCtoSjlDc0dsc3VpMHR0ZHRsdWFiVFRrNmhieTJLNTZSRjAxdHZiMVkxY0NmClFhVkwxc2FnTVVxRGFzNEMzeDFtRVhoMjlBS09IWXZybEZjZjdlYlFNelFqT0JqVGFTMmkKLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
    "encoding": "base64",
    "item": [
        "ca",
        {
            "ca": "self-sign",
            "dns": [
                "localhost"
            ],
            "name": "quadlet_demo"
        }
    ],
    "source": "/etc/pki/tls/certs/quadlet_demo.crt"
}
TASK [fedora.linux_system_roles.certificate : Create return data] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:160
Saturday 15 February 2025  11:42:11 -0500 (0:00:01.349)       0:00:08.953 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "certificate_test_certs": {
            "quadlet_demo": {
                "ca": "/etc/pki/tls/certs/quadlet_demo.crt",
                "ca_content": "-----BEGIN CERTIFICATE-----\nMIIDgzCCAmugAwIBAgIRAIX1I1RYxUO9lpa+x3wrWJUwDQYJKoZIhvcNAQELBQAw\nUDEgMB4GA1UEAwwXTG9jYWwgU2lnbmluZyBBdXRob3JpdHkxLDAqBgNVBAMMIzg1\nZjUyMzU0LTU4YzU0M2JkLTk2OTZiZWM3LTdjMmI1ODkzMB4XDTI1MDIxNTE2NDIw\nOVoXDTI2MDIxNTE2Mzc0NVowFDESMBAGA1UEAxMJbG9jYWxob3N0MIIBIjANBgkq\nhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwsH8xiBPGmb9IrZKmkQ57y7SHqnWyaDt\neuv9B1T9Zn5MpHaWAM/4JdICuoOyaD90dW01XNtrbTMf1YuQgROklaOjTyD37CZ3\nxnHfwxsd1Jm2eZQtjCHCtnYIPhBo+sLKPEUjIsQOAkMEYIIJYpar/OysqHwRaMdi\nlvJdBj/gLV1LW1PkVpI7OzGdrrOi66z6Q6IObZ9aCpDMsciAnEl2DBKUVkZ3j1M0\nUs7FN0YQTG/knlPtknw1AcK1j1jo15I4/KoBKAyceB1oAOEtHxvetw4OAVrdfIcl\nXnuNBGxvZUjDFQJ1kY8LJGUWrxaHwtprA1VkJnMqpxW5T5+Bh1AFvwIDAQABo4GT\nMIGQMAsGA1UdDwQEAwIFoDAUBgNVHREEDTALgglsb2NhbGhvc3QwHQYDVR0lBBYw\nFAYIKwYBBQUHAwEGCCsGAQUFBwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFHjH\nS5C97swCnzelzm1I2vR8p7+1MB8GA1UdIwQYMBaAFP3FAg6bywyRS+IJoTSaYvO4\nN6q4MA0GCSqGSIb3DQEBCwUAA4IBAQCMJWfQ8yjDGesBPA082dL6WY0gZ1PEH5mV\nQn3E+qr9TE/yv+3Q1TlUMd9T5DPomBjQLj6G7FhHDSdJ+MM2O2Cmz+zMFUoDEpyy\n8+lZOaVlJgzLlk91YbI7XNep3ZMtccnR42n8Ix0Q+fL/gW77Ol07++vuV4yrd/tO\nAYH8mR8FXavIbwKpVQ3FCXNOC2D+TERtBBOtC9ad7sEivdDp1WqYVZjFzEhFaCoW\nBv2T9BneIRX3oanUPFcX+hJ9CsGlsui0ttdtluabTTk6hby2K56RF01tvb1Y1cCf\nQaVL1sagMUqDas4C3x1mEXh29AKOHYvrlFcf7ebQMzQjOBjTaS2i\n-----END CERTIFICATE-----\n",
                "cert": "/etc/pki/tls/certs/quadlet_demo.crt",
                "cert_content": "-----BEGIN CERTIFICATE-----\nMIIDgzCCAmugAwIBAgIRAIX1I1RYxUO9lpa+x3wrWJUwDQYJKoZIhvcNAQELBQAw\nUDEgMB4GA1UEAwwXTG9jYWwgU2lnbmluZyBBdXRob3JpdHkxLDAqBgNVBAMMIzg1\nZjUyMzU0LTU4YzU0M2JkLTk2OTZiZWM3LTdjMmI1ODkzMB4XDTI1MDIxNTE2NDIw\nOVoXDTI2MDIxNTE2Mzc0NVowFDESMBAGA1UEAxMJbG9jYWxob3N0MIIBIjANBgkq\nhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwsH8xiBPGmb9IrZKmkQ57y7SHqnWyaDt\neuv9B1T9Zn5MpHaWAM/4JdICuoOyaD90dW01XNtrbTMf1YuQgROklaOjTyD37CZ3\nxnHfwxsd1Jm2eZQtjCHCtnYIPhBo+sLKPEUjIsQOAkMEYIIJYpar/OysqHwRaMdi\nlvJdBj/gLV1LW1PkVpI7OzGdrrOi66z6Q6IObZ9aCpDMsciAnEl2DBKUVkZ3j1M0\nUs7FN0YQTG/knlPtknw1AcK1j1jo15I4/KoBKAyceB1oAOEtHxvetw4OAVrdfIcl\nXnuNBGxvZUjDFQJ1kY8LJGUWrxaHwtprA1VkJnMqpxW5T5+Bh1AFvwIDAQABo4GT\nMIGQMAsGA1UdDwQEAwIFoDAUBgNVHREEDTALgglsb2NhbGhvc3QwHQYDVR0lBBYw\nFAYIKwYBBQUHAwEGCCsGAQUFBwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFHjH\nS5C97swCnzelzm1I2vR8p7+1MB8GA1UdIwQYMBaAFP3FAg6bywyRS+IJoTSaYvO4\nN6q4MA0GCSqGSIb3DQEBCwUAA4IBAQCMJWfQ8yjDGesBPA082dL6WY0gZ1PEH5mV\nQn3E+qr9TE/yv+3Q1TlUMd9T5DPomBjQLj6G7FhHDSdJ+MM2O2Cmz+zMFUoDEpyy\n8+lZOaVlJgzLlk91YbI7XNep3ZMtccnR42n8Ix0Q+fL/gW77Ol07++vuV4yrd/tO\nAYH8mR8FXavIbwKpVQ3FCXNOC2D+TERtBBOtC9ad7sEivdDp1WqYVZjFzEhFaCoW\nBv2T9BneIRX3oanUPFcX+hJ9CsGlsui0ttdtluabTTk6hby2K56RF01tvb1Y1cCf\nQaVL1sagMUqDas4C3x1mEXh29AKOHYvrlFcf7ebQMzQjOBjTaS2i\n-----END CERTIFICATE-----\n",
                "key": "/etc/pki/tls/private/quadlet_demo.key",
                "key_content": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQDCwfzGIE8aZv0i\ntkqaRDnvLtIeqdbJoO166/0HVP1mfkykdpYAz/gl0gK6g7JoP3R1bTVc22ttMx/V\ni5CBE6SVo6NPIPfsJnfGcd/DGx3UmbZ5lC2MIcK2dgg+EGj6wso8RSMixA4CQwRg\ngglilqv87KyofBFox2KW8l0GP+AtXUtbU+RWkjs7MZ2us6LrrPpDog5tn1oKkMyx\nyICcSXYMEpRWRnePUzRSzsU3RhBMb+SeU+2SfDUBwrWPWOjXkjj8qgEoDJx4HWgA\n4S0fG963Dg4BWt18hyVee40EbG9lSMMVAnWRjwskZRavFofC2msDVWQmcyqnFblP\nn4GHUAW/AgMBAAECggEAXr6UQ6YwFT4I4zwfKtEKBguW+IfTD5+UJ/ppy4lTBOto\nSZHPA3Io2+1Amo+62PAVrZKFJTkxF5yJXg9ZKxHtPUawCayWL3G/TvDS1E16w5RL\nqvWk+sjpum3NMLQ2/daJH6zanO0Xi44isPhRrH50CMTycbx5k6l7kscw7fthCW/C\nDwf1Uq+iDvbyr9mzNSwMTDOCikgyKtGf8F//YjIAgwErfQED37Ovyr38nHzR3dlw\npYOcl9wR7jRXGTJOFK0lQMFo6U1Nh9skOk9nC+/dn1oKbOMmDGVJP4rQzjtPcDfg\nRxAaQEXmYKVd2ZZS02O2PSDCnWy2Wm59C/oluEuaEQKBgQDj6HfaH6W3OZ+qU+Bs\nxwYRoTawciCkC5m0s/I14Wde9ZIzyQWqzKb/NrQyj+VRpgzdREp8opVvtMsN+gZK\njkk5PUgDRY3+z8bPjKk3CfZoj15hOh+NdCSl9i94EsaH0F63AsTp90YJxW1c85Xo\nvLOivHG6fZJ4PsnfiuMRFPZJYwKBgQDaw3oHE+xVQMFLxTr2LBwETW+BI0Bt/oyh\njgiG7zmfkS4un0y1b8tqIkW2jvTXCdz6ewlYDLC3kVyHNi044Wd+/E5eu2/VrcsC\nFHHDeCDGmhhFOJCk0CJ48eh60LGQ9klrOws4o0LVWSgC+IrIihb3jGrufO3VyvD5\nyW6N8aQu9QKBgQDVygsrxQYps1dQavAaejP6M7VenSB7YhpkhWyPGItka44jH84+\nI04AGCu/RnaCOfbVVStecREnQXw89y4RoBmWJSM5VnTV88h2h2dwptrLyLlrER/q\nDGoYeLCcvVgYrOE8Yc+f/gJoRhfJ2wHj2afU31r76rGMNLIgDWPlcTl7KwKBgQDa\nd2xdtR/G46wlpwmC3fn8sfgw1qt5LUPz8ovBdRXurGWaWauZZdOWYjOMZKdOOKNd\n4ZErRUEVkSz+OrReoXth7UW2hSjiE3Uw1onl0wmoTCsAtcR9Uv2aGrs4K0GoqswX\nMhe6sBAGT8sNbHfsXTsibDA9irVZVBni2a52QmLcIQKBgGtkuobUBCLSiNUMjcrX\n+zDCV3wa4BhMpOLRCV3www7g9Jk6SliwbyGoyMNvKYt3qtKL1t+ZZM2laya6if0P\nsMFKnMOya1seQB2AfIXeMM+/y9diupYvZ1UDMR0T4AXKcb7QSF5mp17ZJCY3rq3O\nQNwAA5t5CJq2NFt7+XOuKUVh\n-----END PRIVATE KEY-----\n"
            }
        }
    },
    "changed": false
}
TASK [fedora.linux_system_roles.certificate : Stop tracking certificates] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:176
Saturday 15 February 2025  11:42:11 -0500 (0:00:00.066)       0:00:09.020 ***** 
ok: [managed-node1] => (item={'cert': '/etc/pki/tls/certs/quadlet_demo.crt', 'cert_content': '-----BEGIN CERTIFICATE-----\nMIIDgzCCAmugAwIBAgIRAIX1I1RYxUO9lpa+x3wrWJUwDQYJKoZIhvcNAQELBQAw\nUDEgMB4GA1UEAwwXTG9jYWwgU2lnbmluZyBBdXRob3JpdHkxLDAqBgNVBAMMIzg1\nZjUyMzU0LTU4YzU0M2JkLTk2OTZiZWM3LTdjMmI1ODkzMB4XDTI1MDIxNTE2NDIw\nOVoXDTI2MDIxNTE2Mzc0NVowFDESMBAGA1UEAxMJbG9jYWxob3N0MIIBIjANBgkq\nhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwsH8xiBPGmb9IrZKmkQ57y7SHqnWyaDt\neuv9B1T9Zn5MpHaWAM/4JdICuoOyaD90dW01XNtrbTMf1YuQgROklaOjTyD37CZ3\nxnHfwxsd1Jm2eZQtjCHCtnYIPhBo+sLKPEUjIsQOAkMEYIIJYpar/OysqHwRaMdi\nlvJdBj/gLV1LW1PkVpI7OzGdrrOi66z6Q6IObZ9aCpDMsciAnEl2DBKUVkZ3j1M0\nUs7FN0YQTG/knlPtknw1AcK1j1jo15I4/KoBKAyceB1oAOEtHxvetw4OAVrdfIcl\nXnuNBGxvZUjDFQJ1kY8LJGUWrxaHwtprA1VkJnMqpxW5T5+Bh1AFvwIDAQABo4GT\nMIGQMAsGA1UdDwQEAwIFoDAUBgNVHREEDTALgglsb2NhbGhvc3QwHQYDVR0lBBYw\nFAYIKwYBBQUHAwEGCCsGAQUFBwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFHjH\nS5C97swCnzelzm1I2vR8p7+1MB8GA1UdIwQYMBaAFP3FAg6bywyRS+IJoTSaYvO4\nN6q4MA0GCSqGSIb3DQEBCwUAA4IBAQCMJWfQ8yjDGesBPA082dL6WY0gZ1PEH5mV\nQn3E+qr9TE/yv+3Q1TlUMd9T5DPomBjQLj6G7FhHDSdJ+MM2O2Cmz+zMFUoDEpyy\n8+lZOaVlJgzLlk91YbI7XNep3ZMtccnR42n8Ix0Q+fL/gW77Ol07++vuV4yrd/tO\nAYH8mR8FXavIbwKpVQ3FCXNOC2D+TERtBBOtC9ad7sEivdDp1WqYVZjFzEhFaCoW\nBv2T9BneIRX3oanUPFcX+hJ9CsGlsui0ttdtluabTTk6hby2K56RF01tvb1Y1cCf\nQaVL1sagMUqDas4C3x1mEXh29AKOHYvrlFcf7ebQMzQjOBjTaS2i\n-----END CERTIFICATE-----\n', 'key': '/etc/pki/tls/private/quadlet_demo.key', 'key_content': '-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQDCwfzGIE8aZv0i\ntkqaRDnvLtIeqdbJoO166/0HVP1mfkykdpYAz/gl0gK6g7JoP3R1bTVc22ttMx/V\ni5CBE6SVo6NPIPfsJnfGcd/DGx3UmbZ5lC2MIcK2dgg+EGj6wso8RSMixA4CQwRg\ngglilqv87KyofBFox2KW8l0GP+AtXUtbU+RWkjs7MZ2us6LrrPpDog5tn1oKkMyx\nyICcSXYMEpRWRnePUzRSzsU3RhBMb+SeU+2SfDUBwrWPWOjXkjj8qgEoDJx4HWgA\n4S0fG963Dg4BWt18hyVee40EbG9lSMMVAnWRjwskZRavFofC2msDVWQmcyqnFblP\nn4GHUAW/AgMBAAECggEAXr6UQ6YwFT4I4zwfKtEKBguW+IfTD5+UJ/ppy4lTBOto\nSZHPA3Io2+1Amo+62PAVrZKFJTkxF5yJXg9ZKxHtPUawCayWL3G/TvDS1E16w5RL\nqvWk+sjpum3NMLQ2/daJH6zanO0Xi44isPhRrH50CMTycbx5k6l7kscw7fthCW/C\nDwf1Uq+iDvbyr9mzNSwMTDOCikgyKtGf8F//YjIAgwErfQED37Ovyr38nHzR3dlw\npYOcl9wR7jRXGTJOFK0lQMFo6U1Nh9skOk9nC+/dn1oKbOMmDGVJP4rQzjtPcDfg\nRxAaQEXmYKVd2ZZS02O2PSDCnWy2Wm59C/oluEuaEQKBgQDj6HfaH6W3OZ+qU+Bs\nxwYRoTawciCkC5m0s/I14Wde9ZIzyQWqzKb/NrQyj+VRpgzdREp8opVvtMsN+gZK\njkk5PUgDRY3+z8bPjKk3CfZoj15hOh+NdCSl9i94EsaH0F63AsTp90YJxW1c85Xo\nvLOivHG6fZJ4PsnfiuMRFPZJYwKBgQDaw3oHE+xVQMFLxTr2LBwETW+BI0Bt/oyh\njgiG7zmfkS4un0y1b8tqIkW2jvTXCdz6ewlYDLC3kVyHNi044Wd+/E5eu2/VrcsC\nFHHDeCDGmhhFOJCk0CJ48eh60LGQ9klrOws4o0LVWSgC+IrIihb3jGrufO3VyvD5\nyW6N8aQu9QKBgQDVygsrxQYps1dQavAaejP6M7VenSB7YhpkhWyPGItka44jH84+\nI04AGCu/RnaCOfbVVStecREnQXw89y4RoBmWJSM5VnTV88h2h2dwptrLyLlrER/q\nDGoYeLCcvVgYrOE8Yc+f/gJoRhfJ2wHj2afU31r76rGMNLIgDWPlcTl7KwKBgQDa\nd2xdtR/G46wlpwmC3fn8sfgw1qt5LUPz8ovBdRXurGWaWauZZdOWYjOMZKdOOKNd\n4ZErRUEVkSz+OrReoXth7UW2hSjiE3Uw1onl0wmoTCsAtcR9Uv2aGrs4K0GoqswX\nMhe6sBAGT8sNbHfsXTsibDA9irVZVBni2a52QmLcIQKBgGtkuobUBCLSiNUMjcrX\n+zDCV3wa4BhMpOLRCV3www7g9Jk6SliwbyGoyMNvKYt3qtKL1t+ZZM2laya6if0P\nsMFKnMOya1seQB2AfIXeMM+/y9diupYvZ1UDMR0T4AXKcb7QSF5mp17ZJCY3rq3O\nQNwAA5t5CJq2NFt7+XOuKUVh\n-----END PRIVATE KEY-----\n', 'ca': '/etc/pki/tls/certs/quadlet_demo.crt', 'ca_content': '-----BEGIN CERTIFICATE-----\nMIIDgzCCAmugAwIBAgIRAIX1I1RYxUO9lpa+x3wrWJUwDQYJKoZIhvcNAQELBQAw\nUDEgMB4GA1UEAwwXTG9jYWwgU2lnbmluZyBBdXRob3JpdHkxLDAqBgNVBAMMIzg1\nZjUyMzU0LTU4YzU0M2JkLTk2OTZiZWM3LTdjMmI1ODkzMB4XDTI1MDIxNTE2NDIw\nOVoXDTI2MDIxNTE2Mzc0NVowFDESMBAGA1UEAxMJbG9jYWxob3N0MIIBIjANBgkq\nhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwsH8xiBPGmb9IrZKmkQ57y7SHqnWyaDt\neuv9B1T9Zn5MpHaWAM/4JdICuoOyaD90dW01XNtrbTMf1YuQgROklaOjTyD37CZ3\nxnHfwxsd1Jm2eZQtjCHCtnYIPhBo+sLKPEUjIsQOAkMEYIIJYpar/OysqHwRaMdi\nlvJdBj/gLV1LW1PkVpI7OzGdrrOi66z6Q6IObZ9aCpDMsciAnEl2DBKUVkZ3j1M0\nUs7FN0YQTG/knlPtknw1AcK1j1jo15I4/KoBKAyceB1oAOEtHxvetw4OAVrdfIcl\nXnuNBGxvZUjDFQJ1kY8LJGUWrxaHwtprA1VkJnMqpxW5T5+Bh1AFvwIDAQABo4GT\nMIGQMAsGA1UdDwQEAwIFoDAUBgNVHREEDTALgglsb2NhbGhvc3QwHQYDVR0lBBYw\nFAYIKwYBBQUHAwEGCCsGAQUFBwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFHjH\nS5C97swCnzelzm1I2vR8p7+1MB8GA1UdIwQYMBaAFP3FAg6bywyRS+IJoTSaYvO4\nN6q4MA0GCSqGSIb3DQEBCwUAA4IBAQCMJWfQ8yjDGesBPA082dL6WY0gZ1PEH5mV\nQn3E+qr9TE/yv+3Q1TlUMd9T5DPomBjQLj6G7FhHDSdJ+MM2O2Cmz+zMFUoDEpyy\n8+lZOaVlJgzLlk91YbI7XNep3ZMtccnR42n8Ix0Q+fL/gW77Ol07++vuV4yrd/tO\nAYH8mR8FXavIbwKpVQ3FCXNOC2D+TERtBBOtC9ad7sEivdDp1WqYVZjFzEhFaCoW\nBv2T9BneIRX3oanUPFcX+hJ9CsGlsui0ttdtluabTTk6hby2K56RF01tvb1Y1cCf\nQaVL1sagMUqDas4C3x1mEXh29AKOHYvrlFcf7ebQMzQjOBjTaS2i\n-----END CERTIFICATE-----\n'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "cmd": [
        "getcert",
        "stop-tracking",
        "-f",
        "/etc/pki/tls/certs/quadlet_demo.crt"
    ],
    "delta": "0:00:00.025856",
    "end": "2025-02-15 11:42:11.620680",
    "item": {
        "ca": "/etc/pki/tls/certs/quadlet_demo.crt",
        "ca_content": "-----BEGIN CERTIFICATE-----\nMIIDgzCCAmugAwIBAgIRAIX1I1RYxUO9lpa+x3wrWJUwDQYJKoZIhvcNAQELBQAw\nUDEgMB4GA1UEAwwXTG9jYWwgU2lnbmluZyBBdXRob3JpdHkxLDAqBgNVBAMMIzg1\nZjUyMzU0LTU4YzU0M2JkLTk2OTZiZWM3LTdjMmI1ODkzMB4XDTI1MDIxNTE2NDIw\nOVoXDTI2MDIxNTE2Mzc0NVowFDESMBAGA1UEAxMJbG9jYWxob3N0MIIBIjANBgkq\nhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwsH8xiBPGmb9IrZKmkQ57y7SHqnWyaDt\neuv9B1T9Zn5MpHaWAM/4JdICuoOyaD90dW01XNtrbTMf1YuQgROklaOjTyD37CZ3\nxnHfwxsd1Jm2eZQtjCHCtnYIPhBo+sLKPEUjIsQOAkMEYIIJYpar/OysqHwRaMdi\nlvJdBj/gLV1LW1PkVpI7OzGdrrOi66z6Q6IObZ9aCpDMsciAnEl2DBKUVkZ3j1M0\nUs7FN0YQTG/knlPtknw1AcK1j1jo15I4/KoBKAyceB1oAOEtHxvetw4OAVrdfIcl\nXnuNBGxvZUjDFQJ1kY8LJGUWrxaHwtprA1VkJnMqpxW5T5+Bh1AFvwIDAQABo4GT\nMIGQMAsGA1UdDwQEAwIFoDAUBgNVHREEDTALgglsb2NhbGhvc3QwHQYDVR0lBBYw\nFAYIKwYBBQUHAwEGCCsGAQUFBwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFHjH\nS5C97swCnzelzm1I2vR8p7+1MB8GA1UdIwQYMBaAFP3FAg6bywyRS+IJoTSaYvO4\nN6q4MA0GCSqGSIb3DQEBCwUAA4IBAQCMJWfQ8yjDGesBPA082dL6WY0gZ1PEH5mV\nQn3E+qr9TE/yv+3Q1TlUMd9T5DPomBjQLj6G7FhHDSdJ+MM2O2Cmz+zMFUoDEpyy\n8+lZOaVlJgzLlk91YbI7XNep3ZMtccnR42n8Ix0Q+fL/gW77Ol07++vuV4yrd/tO\nAYH8mR8FXavIbwKpVQ3FCXNOC2D+TERtBBOtC9ad7sEivdDp1WqYVZjFzEhFaCoW\nBv2T9BneIRX3oanUPFcX+hJ9CsGlsui0ttdtluabTTk6hby2K56RF01tvb1Y1cCf\nQaVL1sagMUqDas4C3x1mEXh29AKOHYvrlFcf7ebQMzQjOBjTaS2i\n-----END CERTIFICATE-----\n",
        "cert": "/etc/pki/tls/certs/quadlet_demo.crt",
        "cert_content": "-----BEGIN CERTIFICATE-----\nMIIDgzCCAmugAwIBAgIRAIX1I1RYxUO9lpa+x3wrWJUwDQYJKoZIhvcNAQELBQAw\nUDEgMB4GA1UEAwwXTG9jYWwgU2lnbmluZyBBdXRob3JpdHkxLDAqBgNVBAMMIzg1\nZjUyMzU0LTU4YzU0M2JkLTk2OTZiZWM3LTdjMmI1ODkzMB4XDTI1MDIxNTE2NDIw\nOVoXDTI2MDIxNTE2Mzc0NVowFDESMBAGA1UEAxMJbG9jYWxob3N0MIIBIjANBgkq\nhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwsH8xiBPGmb9IrZKmkQ57y7SHqnWyaDt\neuv9B1T9Zn5MpHaWAM/4JdICuoOyaD90dW01XNtrbTMf1YuQgROklaOjTyD37CZ3\nxnHfwxsd1Jm2eZQtjCHCtnYIPhBo+sLKPEUjIsQOAkMEYIIJYpar/OysqHwRaMdi\nlvJdBj/gLV1LW1PkVpI7OzGdrrOi66z6Q6IObZ9aCpDMsciAnEl2DBKUVkZ3j1M0\nUs7FN0YQTG/knlPtknw1AcK1j1jo15I4/KoBKAyceB1oAOEtHxvetw4OAVrdfIcl\nXnuNBGxvZUjDFQJ1kY8LJGUWrxaHwtprA1VkJnMqpxW5T5+Bh1AFvwIDAQABo4GT\nMIGQMAsGA1UdDwQEAwIFoDAUBgNVHREEDTALgglsb2NhbGhvc3QwHQYDVR0lBBYw\nFAYIKwYBBQUHAwEGCCsGAQUFBwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFHjH\nS5C97swCnzelzm1I2vR8p7+1MB8GA1UdIwQYMBaAFP3FAg6bywyRS+IJoTSaYvO4\nN6q4MA0GCSqGSIb3DQEBCwUAA4IBAQCMJWfQ8yjDGesBPA082dL6WY0gZ1PEH5mV\nQn3E+qr9TE/yv+3Q1TlUMd9T5DPomBjQLj6G7FhHDSdJ+MM2O2Cmz+zMFUoDEpyy\n8+lZOaVlJgzLlk91YbI7XNep3ZMtccnR42n8Ix0Q+fL/gW77Ol07++vuV4yrd/tO\nAYH8mR8FXavIbwKpVQ3FCXNOC2D+TERtBBOtC9ad7sEivdDp1WqYVZjFzEhFaCoW\nBv2T9BneIRX3oanUPFcX+hJ9CsGlsui0ttdtluabTTk6hby2K56RF01tvb1Y1cCf\nQaVL1sagMUqDas4C3x1mEXh29AKOHYvrlFcf7ebQMzQjOBjTaS2i\n-----END CERTIFICATE-----\n",
        "key": "/etc/pki/tls/private/quadlet_demo.key",
        "key_content": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQDCwfzGIE8aZv0i\ntkqaRDnvLtIeqdbJoO166/0HVP1mfkykdpYAz/gl0gK6g7JoP3R1bTVc22ttMx/V\ni5CBE6SVo6NPIPfsJnfGcd/DGx3UmbZ5lC2MIcK2dgg+EGj6wso8RSMixA4CQwRg\ngglilqv87KyofBFox2KW8l0GP+AtXUtbU+RWkjs7MZ2us6LrrPpDog5tn1oKkMyx\nyICcSXYMEpRWRnePUzRSzsU3RhBMb+SeU+2SfDUBwrWPWOjXkjj8qgEoDJx4HWgA\n4S0fG963Dg4BWt18hyVee40EbG9lSMMVAnWRjwskZRavFofC2msDVWQmcyqnFblP\nn4GHUAW/AgMBAAECggEAXr6UQ6YwFT4I4zwfKtEKBguW+IfTD5+UJ/ppy4lTBOto\nSZHPA3Io2+1Amo+62PAVrZKFJTkxF5yJXg9ZKxHtPUawCayWL3G/TvDS1E16w5RL\nqvWk+sjpum3NMLQ2/daJH6zanO0Xi44isPhRrH50CMTycbx5k6l7kscw7fthCW/C\nDwf1Uq+iDvbyr9mzNSwMTDOCikgyKtGf8F//YjIAgwErfQED37Ovyr38nHzR3dlw\npYOcl9wR7jRXGTJOFK0lQMFo6U1Nh9skOk9nC+/dn1oKbOMmDGVJP4rQzjtPcDfg\nRxAaQEXmYKVd2ZZS02O2PSDCnWy2Wm59C/oluEuaEQKBgQDj6HfaH6W3OZ+qU+Bs\nxwYRoTawciCkC5m0s/I14Wde9ZIzyQWqzKb/NrQyj+VRpgzdREp8opVvtMsN+gZK\njkk5PUgDRY3+z8bPjKk3CfZoj15hOh+NdCSl9i94EsaH0F63AsTp90YJxW1c85Xo\nvLOivHG6fZJ4PsnfiuMRFPZJYwKBgQDaw3oHE+xVQMFLxTr2LBwETW+BI0Bt/oyh\njgiG7zmfkS4un0y1b8tqIkW2jvTXCdz6ewlYDLC3kVyHNi044Wd+/E5eu2/VrcsC\nFHHDeCDGmhhFOJCk0CJ48eh60LGQ9klrOws4o0LVWSgC+IrIihb3jGrufO3VyvD5\nyW6N8aQu9QKBgQDVygsrxQYps1dQavAaejP6M7VenSB7YhpkhWyPGItka44jH84+\nI04AGCu/RnaCOfbVVStecREnQXw89y4RoBmWJSM5VnTV88h2h2dwptrLyLlrER/q\nDGoYeLCcvVgYrOE8Yc+f/gJoRhfJ2wHj2afU31r76rGMNLIgDWPlcTl7KwKBgQDa\nd2xdtR/G46wlpwmC3fn8sfgw1qt5LUPz8ovBdRXurGWaWauZZdOWYjOMZKdOOKNd\n4ZErRUEVkSz+OrReoXth7UW2hSjiE3Uw1onl0wmoTCsAtcR9Uv2aGrs4K0GoqswX\nMhe6sBAGT8sNbHfsXTsibDA9irVZVBni2a52QmLcIQKBgGtkuobUBCLSiNUMjcrX\n+zDCV3wa4BhMpOLRCV3www7g9Jk6SliwbyGoyMNvKYt3qtKL1t+ZZM2laya6if0P\nsMFKnMOya1seQB2AfIXeMM+/y9diupYvZ1UDMR0T4AXKcb7QSF5mp17ZJCY3rq3O\nQNwAA5t5CJq2NFt7+XOuKUVh\n-----END PRIVATE KEY-----\n"
    },
    "rc": 0,
    "start": "2025-02-15 11:42:11.594824"
}
STDOUT:
Request "20250215164209" removed.
TASK [fedora.linux_system_roles.certificate : Remove files] ********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:181
Saturday 15 February 2025  11:42:11 -0500 (0:00:00.571)       0:00:09.592 ***** 
changed: [managed-node1] => (item=/etc/pki/tls/certs/quadlet_demo.crt) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": "/etc/pki/tls/certs/quadlet_demo.crt",
    "path": "/etc/pki/tls/certs/quadlet_demo.crt",
    "state": "absent"
}
changed: [managed-node1] => (item=/etc/pki/tls/private/quadlet_demo.key) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": "/etc/pki/tls/private/quadlet_demo.key",
    "path": "/etc/pki/tls/private/quadlet_demo.key",
    "state": "absent"
}
ok: [managed-node1] => (item=/etc/pki/tls/certs/quadlet_demo.crt) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "/etc/pki/tls/certs/quadlet_demo.crt",
    "path": "/etc/pki/tls/certs/quadlet_demo.crt",
    "state": "absent"
}
TASK [Run the role] ************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:62
Saturday 15 February 2025  11:42:12 -0500 (0:00:01.098)       0:00:10.690 ***** 
included: fedora.linux_system_roles.podman for managed-node1
TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Saturday 15 February 2025  11:42:12 -0500 (0:00:00.083)       0:00:10.773 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Saturday 15 February 2025  11:42:12 -0500 (0:00:00.035)       0:00:10.809 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11
Saturday 15 February 2025  11:42:12 -0500 (0:00:00.038)       0:00:10.847 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16
Saturday 15 February 2025  11:42:13 -0500 (0:00:00.373)       0:00:11.221 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_is_ostree": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check if transactional-update exists in /sbin] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:23
Saturday 15 February 2025  11:42:13 -0500 (0:00:00.026)       0:00:11.247 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
TASK [fedora.linux_system_roles.podman : Set flag if transactional-update exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:28
Saturday 15 February 2025  11:42:13 -0500 (0:00:00.365)       0:00:11.613 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_is_transactional": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:32
Saturday 15 February 2025  11:42:13 -0500 (0:00:00.024)       0:00:11.637 ***** 
ok: [managed-node1] => (item=RedHat.yml) => {
    "ansible_facts": {
        "__podman_packages": [
            "podman",
            "shadow-utils-subid"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml"
}
skipping: [managed-node1] => (item=CentOS.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node1] => (item=CentOS_10.yml) => {
    "ansible_facts": {
        "__podman_packages": [
            "iptables-nft",
            "podman",
            "shadow-utils-subid"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_10.yml"
}
ok: [managed-node1] => (item=CentOS_10.yml) => {
    "ansible_facts": {
        "__podman_packages": [
            "iptables-nft",
            "podman",
            "shadow-utils-subid"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_10.yml"
}
TASK [fedora.linux_system_roles.podman : Gather the package facts] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Saturday 15 February 2025  11:42:13 -0500 (0:00:00.044)       0:00:11.682 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Enable copr if requested] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10
Saturday 15 February 2025  11:42:14 -0500 (0:00:01.118)       0:00:12.800 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_use_copr | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14
Saturday 15 February 2025  11:42:14 -0500 (0:00:00.046)       0:00:12.847 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "(__podman_packages | difference(ansible_facts.packages))",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Notify user that reboot is needed to apply changes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28
Saturday 15 February 2025  11:42:14 -0500 (0:00:00.048)       0:00:12.896 ***** 
skipping: [managed-node1] => {
    "false_condition": "__podman_is_transactional | d(false)"
}
TASK [fedora.linux_system_roles.podman : Reboot transactional update systems] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:33
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.047)       0:00:12.943 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if reboot is needed and not set] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:38
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.062)       0:00:13.006 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get podman version] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:46
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.049)       0:00:13.056 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "--version"
    ],
    "delta": "0:00:00.028851",
    "end": "2025-02-15 11:42:15.472098",
    "rc": 0,
    "start": "2025-02-15 11:42:15.443247"
}
STDOUT:
podman version 5.3.1
TASK [fedora.linux_system_roles.podman : Set podman version] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:52
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.396)       0:00:13.452 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "podman_version": "5.3.1"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.032)       0:00:13.485 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_version is version(\"4.2\", \"<\")",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:63
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.031)       0:00:13.516 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_version is version(\"4.4\", \"<\")",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:73
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.099)       0:00:13.616 ***** 
META: end_host conditional evaluated to False, continuing execution for managed-node1
skipping: [managed-node1] => {
    "skip_reason": "end_host conditional evaluated to False, continuing execution for managed-node1"
}
MSG:
end_host conditional evaluated to false, continuing execution for managed-node1
TASK [fedora.linux_system_roles.podman : Podman package version must be 5.0 or later for Pod quadlets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.119)       0:00:13.735 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__has_type_pod or __has_pod_file_ext or __has_pod_file_src_ext or __has_pod_template_src_ext or __has_pod_template_src_ext_j2",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Podman package version must be 5.0 or later for Pod quadlets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:96
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.044)       0:00:13.780 ***** 
META: end_host conditional evaluated to False, continuing execution for managed-node1
skipping: [managed-node1] => {
    "skip_reason": "end_host conditional evaluated to False, continuing execution for managed-node1"
}
MSG:
end_host conditional evaluated to false, continuing execution for managed-node1
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:109
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.046)       0:00:13.827 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:42:15 -0500 (0:00:00.058)       0:00:13.885 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "getent_passwd": {
            "root": [
                "x",
                "0",
                "0",
                "Super User",
                "/root",
                "/bin/bash"
            ]
        }
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:42:16 -0500 (0:00:00.500)       0:00:14.386 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:42:16 -0500 (0:00:00.055)       0:00:14.442 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:42:16 -0500 (0:00:00.065)       0:00:14.507 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.416)       0:00:14.924 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.037)       0:00:14.961 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.040)       0:00:15.001 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.036)       0:00:15.038 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.035)       0:00:15.073 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.047)       0:00:15.120 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.032)       0:00:15.153 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.031)       0:00:15.184 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set config file paths] ****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:115
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.033)       0:00:15.218 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf",
        "__podman_policy_json_file": "/etc/containers/policy.json",
        "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf",
        "__podman_storage_conf_file": "/etc/containers/storage.conf"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:124
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.076)       0:00:15.294 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.076)       0:00:15.371 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_containers_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Update container config file] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.072)       0:00:15.443 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_containers_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:127
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.034)       0:00:15.477 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.057)       0:00:15.535 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_registries_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Update registries config file] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.030)       0:00:15.565 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_registries_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Handle storage.conf] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:130
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.031)       0:00:15.597 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.059)       0:00:15.657 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_storage_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Update storage config file] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.036)       0:00:15.694 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_storage_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Handle policy.json] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:133
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.050)       0:00:15.744 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.071)       0:00:15.816 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.034)       0:00:15.851 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get the existing policy.json] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19
Saturday 15 February 2025  11:42:17 -0500 (0:00:00.035)       0:00:15.887 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Write new policy.json file] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25
Saturday 15 February 2025  11:42:18 -0500 (0:00:00.033)       0:00:15.920 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [Manage firewall for specified ports] *************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:139
Saturday 15 February 2025  11:42:18 -0500 (0:00:00.032)       0:00:15.952 ***** 
included: fedora.linux_system_roles.firewall for managed-node1
TASK [fedora.linux_system_roles.firewall : Setup firewalld] ********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:2
Saturday 15 February 2025  11:42:18 -0500 (0:00:00.099)       0:00:16.051 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml for managed-node1
TASK [fedora.linux_system_roles.firewall : Ensure ansible_facts used by role] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:2
Saturday 15 February 2025  11:42:18 -0500 (0:00:00.135)       0:00:16.187 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Check if system is ostree] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:10
Saturday 15 February 2025  11:42:18 -0500 (0:00:00.064)       0:00:16.251 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
TASK [fedora.linux_system_roles.firewall : Set flag to indicate system is ostree] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:15
Saturday 15 February 2025  11:42:18 -0500 (0:00:00.418)       0:00:16.670 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__firewall_is_ostree": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.firewall : Check if transactional-update exists in /sbin] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:22
Saturday 15 February 2025  11:42:18 -0500 (0:00:00.056)       0:00:16.726 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
TASK [fedora.linux_system_roles.firewall : Set flag if transactional-update exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:27
Saturday 15 February 2025  11:42:19 -0500 (0:00:00.396)       0:00:17.123 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__firewall_is_transactional": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.firewall : Install firewalld] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:31
Saturday 15 February 2025  11:42:19 -0500 (0:00:00.034)       0:00:17.158 ***** 
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG:
Nothing to do
lsrpackages: firewalld
TASK [fedora.linux_system_roles.firewall : Notify user that reboot is needed to apply changes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:43
Saturday 15 February 2025  11:42:20 -0500 (0:00:00.826)       0:00:17.984 ***** 
skipping: [managed-node1] => {
    "false_condition": "__firewall_is_transactional | d(false)"
}
TASK [fedora.linux_system_roles.firewall : Reboot transactional update systems] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:48
Saturday 15 February 2025  11:42:20 -0500 (0:00:00.056)       0:00:18.040 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Fail if reboot is needed and not set] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:53
Saturday 15 February 2025  11:42:20 -0500 (0:00:00.051)       0:00:18.092 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Collect service facts] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:5
Saturday 15 February 2025  11:42:20 -0500 (0:00:00.052)       0:00:18.144 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "firewall_disable_conflicting_services | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Attempt to stop and disable conflicting services] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:9
Saturday 15 February 2025  11:42:20 -0500 (0:00:00.048)       0:00:18.193 ***** 
skipping: [managed-node1] => (item=nftables)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall_disable_conflicting_services | bool",
    "item": "nftables",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=iptables)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall_disable_conflicting_services | bool",
    "item": "iptables",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=ufw)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall_disable_conflicting_services | bool",
    "item": "ufw",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => {
    "changed": false
}
MSG:
All items skipped
TASK [fedora.linux_system_roles.firewall : Unmask firewalld service] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:22
Saturday 15 February 2025  11:42:20 -0500 (0:00:00.063)       0:00:18.256 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": "firewalld",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0",
        "ActiveEnterTimestampMonotonic": "0",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "inactive",
        "After": "system.slice dbus-broker.service polkit.service basic.target sysinit.target dbus.socket",
        "AllowIsolate": "no",
        "AssertResult": "no",
        "AssertTimestampMonotonic": "0",
        "Before": "network-pre.target shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "BusName": "org.fedoraproject.FirewallD1",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "yes",
        "CanReload": "yes",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_tty_config cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "no",
        "ConditionTimestampMonotonic": "0",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "ipset.service iptables.service shutdown.target ebtables.service ip6tables.service",
        "ControlGroupId": "0",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "firewalld - dynamic firewall daemon",
        "DeviceAllow": "char-rtc r",
        "DevicePolicy": "closed",
        "Documentation": "\"man:firewalld(1)\"",
        "DynamicUser": "no",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveTasksMax": "22347",
        "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestampMonotonic": "0",
        "ExecMainPID": "0",
        "ExecMainStartTimestampMonotonic": "0",
        "ExecMainStatus": "0",
        "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/usr/lib/systemd/system/firewalld.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "firewalld.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestampMonotonic": "0",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "mixed",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "yes",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "0",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "3165634560",
        "MemoryCurrent": "[not set]",
        "MemoryDenyWriteExecute": "yes",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "[not set]",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "[not set]",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "[not set]",
        "MemoryZSwapCurrent": "[not set]",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "firewalld.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "yes",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "yes",
        "ProtectControlGroups": "yes",
        "ProtectControlGroupsEx": "yes",
        "ProtectHome": "yes",
        "ProtectHostname": "yes",
        "ProtectKernelLogs": "yes",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "yes",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "sysinit.target system.slice dbus.socket",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "yes",
        "RestrictSUIDSGID": "yes",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "StandardError": "null",
        "StandardInput": "null",
        "StandardOutput": "null",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestampMonotonic": "0",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "dead",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallArchitectures": "native",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "[not set]",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "dbus",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "enabled",
        "UnitFileState": "disabled",
        "UtmpMode": "init",
        "Wants": "network-pre.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "infinity"
    }
}
TASK [fedora.linux_system_roles.firewall : Enable and start firewalld service] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:28
Saturday 15 February 2025  11:42:20 -0500 (0:00:00.541)       0:00:18.798 ***** 
changed: [managed-node1] => {
    "changed": true,
    "enabled": true,
    "name": "firewalld",
    "state": "started",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0",
        "ActiveEnterTimestampMonotonic": "0",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "inactive",
        "After": "polkit.service dbus.socket sysinit.target basic.target dbus-broker.service system.slice",
        "AllowIsolate": "no",
        "AssertResult": "no",
        "AssertTimestampMonotonic": "0",
        "Before": "network-pre.target shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "BusName": "org.fedoraproject.FirewallD1",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "yes",
        "CanReload": "yes",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_tty_config cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "no",
        "ConditionTimestampMonotonic": "0",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "iptables.service shutdown.target ip6tables.service ipset.service ebtables.service",
        "ControlGroupId": "0",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "firewalld - dynamic firewall daemon",
        "DeviceAllow": "char-rtc r",
        "DevicePolicy": "closed",
        "Documentation": "\"man:firewalld(1)\"",
        "DynamicUser": "no",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveTasksMax": "22347",
        "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestampMonotonic": "0",
        "ExecMainPID": "0",
        "ExecMainStartTimestampMonotonic": "0",
        "ExecMainStatus": "0",
        "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/usr/lib/systemd/system/firewalld.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "firewalld.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestampMonotonic": "0",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "mixed",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "yes",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "0",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "3165114368",
        "MemoryCurrent": "[not set]",
        "MemoryDenyWriteExecute": "yes",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "[not set]",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "[not set]",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "[not set]",
        "MemoryZSwapCurrent": "[not set]",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "firewalld.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "yes",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "yes",
        "ProtectControlGroups": "yes",
        "ProtectControlGroupsEx": "yes",
        "ProtectHome": "yes",
        "ProtectHostname": "yes",
        "ProtectKernelLogs": "yes",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "yes",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "system.slice dbus.socket sysinit.target",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "yes",
        "RestrictSUIDSGID": "yes",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "StandardError": "null",
        "StandardInput": "null",
        "StandardOutput": "null",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestampMonotonic": "0",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "dead",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallArchitectures": "native",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "[not set]",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "dbus",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "enabled",
        "UnitFileState": "disabled",
        "UtmpMode": "init",
        "Wants": "network-pre.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "infinity"
    }
}
TASK [fedora.linux_system_roles.firewall : Check if previous replaced is defined] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:34
Saturday 15 February 2025  11:42:21 -0500 (0:00:01.059)       0:00:19.857 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__firewall_previous_replaced": false,
        "__firewall_python_cmd": "/usr/bin/python3.12",
        "__firewall_report_changed": true
    },
    "changed": false
}
TASK [fedora.linux_system_roles.firewall : Get config files, checksums before and remove] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:43
Saturday 15 February 2025  11:42:21 -0500 (0:00:00.048)       0:00:19.906 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_previous_replaced | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Tell firewall module it is able to report changed] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:55
Saturday 15 February 2025  11:42:22 -0500 (0:00:00.032)       0:00:19.939 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_previous_replaced | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Configure firewall] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:71
Saturday 15 February 2025  11:42:22 -0500 (0:00:00.036)       0:00:19.976 ***** 
changed: [managed-node1] => (item={'port': '8000/tcp', 'state': 'enabled'}) => {
    "__firewall_changed": true,
    "ansible_loop_var": "item",
    "changed": true,
    "item": {
        "port": "8000/tcp",
        "state": "enabled"
    }
}
changed: [managed-node1] => (item={'port': '9000/tcp', 'state': 'enabled'}) => {
    "__firewall_changed": true,
    "ansible_loop_var": "item",
    "changed": true,
    "item": {
        "port": "9000/tcp",
        "state": "enabled"
    }
}
TASK [fedora.linux_system_roles.firewall : Gather firewall config information] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:120
Saturday 15 February 2025  11:42:23 -0500 (0:00:01.215)       0:00:21.191 ***** 
skipping: [managed-node1] => (item={'port': '8000/tcp', 'state': 'enabled'})  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall | length == 1",
    "item": {
        "port": "8000/tcp",
        "state": "enabled"
    },
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item={'port': '9000/tcp', 'state': 'enabled'})  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall | length == 1",
    "item": {
        "port": "9000/tcp",
        "state": "enabled"
    },
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => {
    "changed": false
}
MSG:
All items skipped
TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:130
Saturday 15 February 2025  11:42:23 -0500 (0:00:00.077)       0:00:21.268 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "firewall | length == 1",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Gather firewall config if no arguments] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:139
Saturday 15 February 2025  11:42:23 -0500 (0:00:00.063)       0:00:21.331 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "firewall == None or firewall | length == 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:144
Saturday 15 February 2025  11:42:23 -0500 (0:00:00.097)       0:00:21.428 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "firewall == None or firewall | length == 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Get config files, checksums after] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:153
Saturday 15 February 2025  11:42:23 -0500 (0:00:00.101)       0:00:21.530 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_previous_replaced | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Calculate what has changed] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:163
Saturday 15 February 2025  11:42:23 -0500 (0:00:00.207)       0:00:21.737 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_previous_replaced | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Show diffs] *************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:169
Saturday 15 February 2025  11:42:23 -0500 (0:00:00.096)       0:00:21.833 ***** 
skipping: [managed-node1] => {
    "false_condition": "__firewall_previous_replaced | bool"
}
TASK [Manage selinux for specified ports] **************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:146
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.142)       0:00:21.975 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_selinux_ports | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:153
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.053)       0:00:22.028 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_cancel_user_linger": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:157
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.044)       0:00:22.073 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle credential files - present] ****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:166
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.040)       0:00:22.113 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle secrets] ***********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:175
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.042)       0:00:22.156 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Set variables part 1] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:3
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.145)       0:00:22.302 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:7
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.035)       0:00:22.337 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.074)       0:00:22.411 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.107)       0:00:22.519 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.088)       0:00:22.608 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.083)       0:00:22.691 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.061)       0:00:22.753 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.040)       0:00:22.793 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.045)       0:00:22.839 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:42:24 -0500 (0:00:00.039)       0:00:22.878 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.076)       0:00:22.954 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.031)       0:00:22.986 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.034)       0:00:23.020 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.032)       0:00:23.052 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set variables part 2] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:14
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.042)       0:00:23.095 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_rootless": false,
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:20
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.091)       0:00:23.186 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.109)       0:00:23.296 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.054)       0:00:23.351 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.059)       0:00:23.410 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:25
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.055)       0:00:23.465 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:41
Saturday 15 February 2025  11:42:25 -0500 (0:00:00.074)       0:00:23.540 ***** 
[WARNING]: Using a variable for a task's 'args' is unsafe in some situations
(see
https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat-
unsafe)
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Set variables part 1] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:3
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.738)       0:00:24.280 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:7
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.040)       0:00:24.320 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.074)       0:00:24.394 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.061)       0:00:24.456 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.050)       0:00:24.506 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.049)       0:00:24.555 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.039)       0:00:24.595 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.079)       0:00:24.674 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.062)       0:00:24.736 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.110)       0:00:24.847 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:42:26 -0500 (0:00:00.052)       0:00:24.900 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.056)       0:00:24.956 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.043)       0:00:24.999 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.053)       0:00:25.053 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set variables part 2] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:14
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.053)       0:00:25.106 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_rootless": false,
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:20
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.061)       0:00:25.168 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.070)       0:00:25.238 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.033)       0:00:25.272 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.035)       0:00:25.308 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:25
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.034)       0:00:25.343 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:41
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.031)       0:00:25.374 ***** 
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Set variables part 1] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:3
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.495)       0:00:25.870 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:7
Saturday 15 February 2025  11:42:27 -0500 (0:00:00.037)       0:00:25.907 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.057)       0:00:25.965 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.034)       0:00:25.999 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.036)       0:00:26.036 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.042)       0:00:26.079 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.033)       0:00:26.112 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.030)       0:00:26.143 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.075)       0:00:26.219 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.032)       0:00:26.252 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.033)       0:00:26.285 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.032)       0:00:26.318 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.033)       0:00:26.352 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.033)       0:00:26.385 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set variables part 2] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:14
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.033)       0:00:26.418 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_rootless": false,
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:20
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.039)       0:00:26.458 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.058)       0:00:26.517 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.031)       0:00:26.549 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.030)       0:00:26.579 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:25
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.032)       0:00:26.612 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:41
Saturday 15 February 2025  11:42:28 -0500 (0:00:00.029)       0:00:26.642 ***** 
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:182
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.504)       0:00:27.146 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:189
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.028)       0:00:27.174 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.145)       0:00:27.319 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "quadlet-demo.network",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "[Network]\nSubnet=192.168.30.0/24\nGateway=192.168.30.1\nLabel=app=wordpress",
        "__podman_quadlet_template_src": ""
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.045)       0:00:27.364 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "created",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.040)       0:00:27.405 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.074)       0:00:27.479 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo",
        "__podman_quadlet_type": "network",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.047)       0:00:27.527 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.059)       0:00:27.587 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.037)       0:00:27.624 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.035)       0:00:27.659 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:42:29 -0500 (0:00:00.046)       0:00:27.705 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.390)       0:00:28.095 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.033)       0:00:28.129 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.035)       0:00:28.164 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.033)       0:00:28.198 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.035)       0:00:28.233 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.034)       0:00:28.267 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.034)       0:00:28.302 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.033)       0:00:28.335 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.034)       0:00:28.370 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "quadlet-demo-network.service",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.052)       0:00:28.422 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.035)       0:00:28.457 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_kube_yamls_raw | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.033)       0:00:28.490 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo.network",
        "__podman_volumes": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.076)       0:00:28.567 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.038)       0:00:28.605 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state == \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.031)       0:00:28.637 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.111)       0:00:28.748 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.074)       0:00:28.822 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.032)       0:00:28.854 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:42:30 -0500 (0:00:00.032)       0:00:28.887 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create host directories] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7
Saturday 15 February 2025  11:42:31 -0500 (0:00:00.030)       0:00:28.917 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}
TASK [fedora.linux_system_roles.podman : Ensure container images are present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18
Saturday 15 February 2025  11:42:31 -0500 (0:00:00.029)       0:00:28.947 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
Saturday 15 February 2025  11:42:31 -0500 (0:00:00.031)       0:00:28.978 ***** 
ok: [managed-node1] => {
    "changed": false,
    "gid": 0,
    "group": "root",
    "mode": "0755",
    "owner": "root",
    "path": "/etc/containers/systemd",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 6,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:31 -0500 (0:00:00.389)       0:00:29.368 ***** 
changed: [managed-node1] => {
    "changed": true,
    "checksum": "e57c08d49aff4bae8daab138d913aeddaa8682a0",
    "dest": "/etc/containers/systemd/quadlet-demo.network",
    "gid": 0,
    "group": "root",
    "md5sum": "061f3cf318cbd8ab5794bb1173831fb8",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 74,
    "src": "/root/.ansible/tmp/ansible-tmp-1739637751.501507-21049-169973974175633/.source.network",
    "state": "file",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58
Saturday 15 February 2025  11:42:32 -0500 (0:00:00.919)       0:00:30.287 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
Saturday 15 February 2025  11:42:32 -0500 (0:00:00.043)       0:00:30.330 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_copy_file is skipped",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Reload systemctl] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82
Saturday 15 February 2025  11:42:32 -0500 (0:00:00.037)       0:00:30.368 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Start service] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110
Saturday 15 February 2025  11:42:33 -0500 (0:00:00.763)       0:00:31.132 ***** 
changed: [managed-node1] => {
    "changed": true,
    "name": "quadlet-demo-network.service",
    "state": "started",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0",
        "ActiveEnterTimestampMonotonic": "0",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "inactive",
        "After": "systemd-journald.socket basic.target network-online.target -.mount system.slice sysinit.target",
        "AllowIsolate": "no",
        "AssertResult": "no",
        "AssertTimestampMonotonic": "0",
        "Before": "shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "no",
        "ConditionTimestampMonotonic": "0",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "shutdown.target",
        "ControlGroupId": "0",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "quadlet-demo-network.service",
        "DevicePolicy": "auto",
        "DynamicUser": "no",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveTasksMax": "22347",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestampMonotonic": "0",
        "ExecMainPID": "0",
        "ExecMainStartTimestampMonotonic": "0",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman network create --ignore --subnet 192.168.30.0/24 --gateway 192.168.30.1 --label app=wordpress systemd-quadlet-demo ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman network create --ignore --subnet 192.168.30.0/24 --gateway 192.168.30.1 --label app=wordpress systemd-quadlet-demo ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/run/systemd/generator/quadlet-demo-network.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "quadlet-demo-network.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestampMonotonic": "0",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "control-group",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "0",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "3149807616",
        "MemoryCurrent": "[not set]",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "[not set]",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "[not set]",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "[not set]",
        "MemoryZSwapCurrent": "[not set]",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "quadlet-demo-network.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "no",
        "ProtectControlGroups": "no",
        "ProtectControlGroupsEx": "no",
        "ProtectHome": "no",
        "ProtectHostname": "no",
        "ProtectKernelLogs": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "yes",
        "RemoveIPC": "no",
        "Requires": "system.slice -.mount sysinit.target",
        "RequiresMountsFor": "/run/containers",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "SourcePath": "/etc/containers/systemd/quadlet-demo.network",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestampMonotonic": "0",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "dead",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogIdentifier": "quadlet-demo-network",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "[not set]",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "infinity",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "oneshot",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "generated",
        "UtmpMode": "init",
        "Wants": "network-online.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "infinity"
    }
}
TASK [fedora.linux_system_roles.podman : Restart service] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125
Saturday 15 February 2025  11:42:33 -0500 (0:00:00.638)       0:00:31.770 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_service_started is changed",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:42:33 -0500 (0:00:00.054)       0:00:31.824 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "quadlet-demo-mysql.volume",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "[Volume]",
        "__podman_quadlet_template_src": ""
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:42:33 -0500 (0:00:00.074)       0:00:31.899 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "created",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.073)       0:00:31.973 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.071)       0:00:32.044 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo-mysql",
        "__podman_quadlet_type": "volume",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.078)       0:00:32.123 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.158)       0:00:32.281 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.050)       0:00:32.332 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.040)       0:00:32.372 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.056)       0:00:32.429 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.391)       0:00:32.820 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.042)       0:00:32.863 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:42:34 -0500 (0:00:00.037)       0:00:32.900 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.039)       0:00:32.940 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.033)       0:00:32.974 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.036)       0:00:33.011 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.051)       0:00:33.062 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.034)       0:00:33.097 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.033)       0:00:33.131 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "quadlet-demo-mysql-volume.service",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.055)       0:00:33.186 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.034)       0:00:33.220 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_kube_yamls_raw | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.034)       0:00:33.254 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo-mysql.volume",
        "__podman_volumes": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.074)       0:00:33.329 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.048)       0:00:33.378 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state == \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.049)       0:00:33.428 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.117)       0:00:33.545 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.134)       0:00:33.679 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.034)       0:00:33.714 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.037)       0:00:33.751 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create host directories] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.038)       0:00:33.789 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}
TASK [fedora.linux_system_roles.podman : Ensure container images are present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.031)       0:00:33.821 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
Saturday 15 February 2025  11:42:35 -0500 (0:00:00.031)       0:00:33.853 ***** 
ok: [managed-node1] => {
    "changed": false,
    "gid": 0,
    "group": "root",
    "mode": "0755",
    "owner": "root",
    "path": "/etc/containers/systemd",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 34,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:36 -0500 (0:00:00.410)       0:00:34.263 ***** 
changed: [managed-node1] => {
    "changed": true,
    "checksum": "585f8cbdf0ec73000f9227dcffbef71e9552ea4a",
    "dest": "/etc/containers/systemd/quadlet-demo-mysql.volume",
    "gid": 0,
    "group": "root",
    "md5sum": "5ddd03a022aeb4502d9bc8ce436b4233",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 9,
    "src": "/root/.ansible/tmp/ansible-tmp-1739637756.4084275-21235-278404765771819/.source.volume",
    "state": "file",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58
Saturday 15 February 2025  11:42:37 -0500 (0:00:00.736)       0:00:35.000 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
Saturday 15 February 2025  11:42:37 -0500 (0:00:00.035)       0:00:35.036 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_copy_file is skipped",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Reload systemctl] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82
Saturday 15 February 2025  11:42:37 -0500 (0:00:00.031)       0:00:35.067 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Start service] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110
Saturday 15 February 2025  11:42:37 -0500 (0:00:00.745)       0:00:35.813 ***** 
changed: [managed-node1] => {
    "changed": true,
    "name": "quadlet-demo-mysql-volume.service",
    "state": "started",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0",
        "ActiveEnterTimestampMonotonic": "0",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "inactive",
        "After": "system.slice -.mount sysinit.target systemd-journald.socket network-online.target basic.target",
        "AllowIsolate": "no",
        "AssertResult": "no",
        "AssertTimestampMonotonic": "0",
        "Before": "shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "no",
        "ConditionTimestampMonotonic": "0",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "shutdown.target",
        "ControlGroupId": "0",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "quadlet-demo-mysql-volume.service",
        "DevicePolicy": "auto",
        "DynamicUser": "no",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveTasksMax": "22347",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestampMonotonic": "0",
        "ExecMainPID": "0",
        "ExecMainStartTimestampMonotonic": "0",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman volume create --ignore systemd-quadlet-demo-mysql ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman volume create --ignore systemd-quadlet-demo-mysql ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/run/systemd/generator/quadlet-demo-mysql-volume.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "quadlet-demo-mysql-volume.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestampMonotonic": "0",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "control-group",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "0",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "3178471424",
        "MemoryCurrent": "[not set]",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "[not set]",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "[not set]",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "[not set]",
        "MemoryZSwapCurrent": "[not set]",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "quadlet-demo-mysql-volume.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "no",
        "ProtectControlGroups": "no",
        "ProtectControlGroupsEx": "no",
        "ProtectHome": "no",
        "ProtectHostname": "no",
        "ProtectKernelLogs": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "yes",
        "RemoveIPC": "no",
        "Requires": "system.slice sysinit.target -.mount",
        "RequiresMountsFor": "/run/containers",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "SourcePath": "/etc/containers/systemd/quadlet-demo-mysql.volume",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestampMonotonic": "0",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "dead",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogIdentifier": "quadlet-demo-mysql-volume",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "[not set]",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "infinity",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "oneshot",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "generated",
        "UtmpMode": "init",
        "Wants": "network-online.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "infinity"
    }
}
TASK [fedora.linux_system_roles.podman : Restart service] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125
Saturday 15 February 2025  11:42:38 -0500 (0:00:00.662)       0:00:36.476 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_service_started is changed",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:42:38 -0500 (0:00:00.067)       0:00:36.543 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "[Install]\nWantedBy=default.target\n\n[Container]\nImage=quay.io/linux-system-roles/mysql:5.6\nContainerName=quadlet-demo-mysql\nVolume=quadlet-demo-mysql.volume:/var/lib/mysql\nVolume=/tmp/quadlet_demo:/var/lib/quadlet_demo:Z\nNetwork=quadlet-demo.network\nSecret=mysql-root-password-container,type=env,target=MYSQL_ROOT_PASSWORD\nHealthCmd=/bin/true\nHealthOnFailure=kill\n",
        "__podman_quadlet_template_src": "quadlet-demo-mysql.container.j2"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:42:38 -0500 (0:00:00.123)       0:00:36.667 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "created",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:42:38 -0500 (0:00:00.048)       0:00:36.715 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_str",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:38 -0500 (0:00:00.046)       0:00:36.761 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo-mysql",
        "__podman_quadlet_type": "container",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:42:38 -0500 (0:00:00.100)       0:00:36.862 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.069)       0:00:36.932 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.043)       0:00:36.975 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.084)       0:00:37.059 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.048)       0:00:37.107 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.452)       0:00:37.560 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.039)       0:00:37.599 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.042)       0:00:37.642 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.040)       0:00:37.683 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.040)       0:00:37.724 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.044)       0:00:37.768 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.051)       0:00:37.819 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:42:39 -0500 (0:00:00.057)       0:00:37.877 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.056)       0:00:37.934 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [
            "quay.io/linux-system-roles/mysql:5.6"
        ],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "quadlet-demo-mysql.service",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.092)       0:00:38.026 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.075)       0:00:38.102 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_kube_yamls_raw | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.072)       0:00:38.174 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [
            "quay.io/linux-system-roles/mysql:5.6"
        ],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo-mysql.container",
        "__podman_volumes": [
            "/tmp/quadlet_demo"
        ]
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.189)       0:00:38.364 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.066)       0:00:38.430 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state == \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.052)       0:00:38.482 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.109)       0:00:38.591 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.100)       0:00:38.692 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.051)       0:00:38.744 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:42:40 -0500 (0:00:00.143)       0:00:38.888 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create host directories] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7
Saturday 15 February 2025  11:42:41 -0500 (0:00:00.063)       0:00:38.952 ***** 
changed: [managed-node1] => (item=/tmp/quadlet_demo) => {
    "ansible_loop_var": "item",
    "changed": true,
    "gid": 0,
    "group": "root",
    "item": "/tmp/quadlet_demo",
    "mode": "0777",
    "owner": "root",
    "path": "/tmp/quadlet_demo",
    "secontext": "unconfined_u:object_r:user_tmp_t:s0",
    "size": 6,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure container images are present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18
Saturday 15 February 2025  11:42:41 -0500 (0:00:00.477)       0:00:39.430 ***** 
changed: [managed-node1] => (item=None) => {
    "attempts": 1,
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
Saturday 15 February 2025  11:42:47 -0500 (0:00:06.400)       0:00:45.830 ***** 
ok: [managed-node1] => {
    "changed": false,
    "gid": 0,
    "group": "root",
    "mode": "0755",
    "owner": "root",
    "path": "/etc/containers/systemd",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 67,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:48 -0500 (0:00:00.394)       0:00:46.225 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_quadlet_file_src | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58
Saturday 15 February 2025  11:42:48 -0500 (0:00:00.041)       0:00:46.266 ***** 
changed: [managed-node1] => {
    "changed": true,
    "checksum": "ca62b2ad3cc9afb5b5371ebbf797b9bc4fd7edd4",
    "dest": "/etc/containers/systemd/quadlet-demo-mysql.container",
    "gid": 0,
    "group": "root",
    "md5sum": "341b473056d2a5dfa35970b0d2e23a5d",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 363,
    "src": "/root/.ansible/tmp/ansible-tmp-1739637768.4011524-21676-87794506354656/.source.container",
    "state": "file",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
Saturday 15 February 2025  11:42:49 -0500 (0:00:00.691)       0:00:46.958 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_copy_content is skipped",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Reload systemctl] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82
Saturday 15 February 2025  11:42:49 -0500 (0:00:00.031)       0:00:46.990 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Start service] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110
Saturday 15 February 2025  11:42:49 -0500 (0:00:00.758)       0:00:47.748 ***** 
changed: [managed-node1] => {
    "changed": true,
    "name": "quadlet-demo-mysql.service",
    "state": "started",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0",
        "ActiveEnterTimestampMonotonic": "0",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "inactive",
        "After": "-.mount sysinit.target network-online.target quadlet-demo-network.service quadlet-demo-mysql-volume.service system.slice tmp.mount systemd-journald.socket basic.target",
        "AllowIsolate": "no",
        "AssertResult": "no",
        "AssertTimestampMonotonic": "0",
        "Before": "multi-user.target shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "no",
        "ConditionTimestampMonotonic": "0",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "shutdown.target",
        "ControlGroupId": "0",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "yes",
        "DelegateControllers": "cpu cpuset io memory pids",
        "Description": "quadlet-demo-mysql.service",
        "DevicePolicy": "auto",
        "DynamicUser": "no",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveTasksMax": "22347",
        "Environment": "PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestampMonotonic": "0",
        "ExecMainPID": "0",
        "ExecMainStartTimestampMonotonic": "0",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman run --name quadlet-demo-mysql --cidfile=/run/quadlet-demo-mysql.cid --replace --rm --cgroups=split --network systemd-quadlet-demo --sdnotify=conmon -d -v systemd-quadlet-demo-mysql:/var/lib/mysql -v /tmp/quadlet_demo:/var/lib/quadlet_demo:Z --secret mysql-root-password-container,type=env,target=MYSQL_ROOT_PASSWORD --health-cmd /bin/true --health-on-failure kill quay.io/linux-system-roles/mysql:5.6 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman run --name quadlet-demo-mysql --cidfile=/run/quadlet-demo-mysql.cid --replace --rm --cgroups=split --network systemd-quadlet-demo --sdnotify=conmon -d -v systemd-quadlet-demo-mysql:/var/lib/mysql -v /tmp/quadlet_demo:/var/lib/quadlet_demo:Z --secret mysql-root-password-container,type=env,target=MYSQL_ROOT_PASSWORD --health-cmd /bin/true --health-on-failure kill quay.io/linux-system-roles/mysql:5.6 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStop": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/quadlet-demo-mysql.cid ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStopEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/quadlet-demo-mysql.cid ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStopPost": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/quadlet-demo-mysql.cid ; ignore_errors=yes ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStopPostEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/quadlet-demo-mysql.cid ; flags=ignore-failure ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/run/systemd/generator/quadlet-demo-mysql.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "quadlet-demo-mysql.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestampMonotonic": "0",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "mixed",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "0",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "3014520832",
        "MemoryCurrent": "[not set]",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "[not set]",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "[not set]",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "[not set]",
        "MemoryZSwapCurrent": "[not set]",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "quadlet-demo-mysql.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "all",
        "OOMPolicy": "continue",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "no",
        "ProtectControlGroups": "no",
        "ProtectControlGroupsEx": "no",
        "ProtectHome": "no",
        "ProtectHostname": "no",
        "ProtectKernelLogs": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "quadlet-demo-mysql-volume.service sysinit.target -.mount quadlet-demo-network.service system.slice",
        "RequiresMountsFor": "/run/containers /tmp/quadlet_demo",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "SourcePath": "/etc/containers/systemd/quadlet-demo-mysql.container",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestampMonotonic": "0",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "dead",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogIdentifier": "quadlet-demo-mysql",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "[not set]",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "notify",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "generated",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "Wants": "network-online.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "infinity"
    }
}
TASK [fedora.linux_system_roles.podman : Restart service] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125
Saturday 15 February 2025  11:42:50 -0500 (0:00:00.914)       0:00:48.663 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_service_started is changed",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:42:50 -0500 (0:00:00.032)       0:00:48.695 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "envoy-proxy-configmap.yml",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "---\napiVersion: v1\nkind: ConfigMap\nmetadata:\n  name: envoy-proxy-config\ndata:\n  envoy.yaml: |\n    admin:\n      address:\n        socket_address:\n          address: 0.0.0.0\n          port_value: 9901\n\n    static_resources:\n      listeners:\n      - name: listener_0\n        address:\n          socket_address:\n            address: 0.0.0.0\n            port_value: 8080\n        filter_chains:\n        - filters:\n          - name: envoy.filters.network.http_connection_manager\n            typed_config:\n              \"@type\": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager\n              stat_prefix: ingress_http\n              codec_type: AUTO\n              route_config:\n                name: local_route\n                virtual_hosts:\n                - name: local_service\n                  domains: [\"*\"]\n                  routes:\n                  - match:\n                      prefix: \"/\"\n                    route:\n                      cluster: backend\n              http_filters:\n              - name: envoy.filters.http.router\n                typed_config:\n                  \"@type\": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router\n          transport_socket:\n            name: envoy.transport_sockets.tls\n            typed_config:\n              \"@type\": type.googleapis.com/envoy.extensions.transport_sockets.tls.v3.DownstreamTlsContext\n              common_tls_context:\n                tls_certificates:\n                - certificate_chain:\n                    filename: /etc/envoy-certificates/certificate.pem\n                  private_key:\n                    filename: /etc/envoy-certificates/certificate.key\n      clusters:\n      - name: backend\n        connect_timeout: 5s\n        type: STATIC\n        dns_refresh_rate: 1800s\n        lb_policy: ROUND_ROBIN\n        load_assignment:\n          cluster_name: backend\n          endpoints:\n          - lb_endpoints:\n            - endpoint:\n                address:\n                  socket_address:\n                    address: 127.0.0.1\n                    port_value: 80",
        "__podman_quadlet_template_src": ""
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:42:50 -0500 (0:00:00.044)       0:00:48.739 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "created",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:42:50 -0500 (0:00:00.040)       0:00:48.780 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:50 -0500 (0:00:00.032)       0:00:48.812 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "envoy-proxy-configmap",
        "__podman_quadlet_type": "yml",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:42:50 -0500 (0:00:00.046)       0:00:48.859 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.059)       0:00:48.918 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.035)       0:00:48.954 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.033)       0:00:48.988 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.044)       0:00:49.033 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.467)       0:00:49.501 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.071)       0:00:49.573 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.032)       0:00:49.605 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.037)       0:00:49.642 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.048)       0:00:49.691 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.037)       0:00:49.729 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.051)       0:00:49.780 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.057)       0:00:49.837 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:42:51 -0500 (0:00:00.055)       0:00:49.893 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.091)       0:00:49.984 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.056)       0:00:50.041 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_kube_yamls_raw | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.053)       0:00:50.095 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [],
        "__podman_quadlet_file": "/etc/containers/systemd/envoy-proxy-configmap.yml",
        "__podman_volumes": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.119)       0:00:50.214 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.059)       0:00:50.274 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state == \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.049)       0:00:50.324 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.112)       0:00:50.436 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.087)       0:00:50.523 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.047)       0:00:50.570 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.035)       0:00:50.606 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create host directories] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.035)       0:00:50.641 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}
TASK [fedora.linux_system_roles.podman : Ensure container images are present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.093)       0:00:50.734 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
Saturday 15 February 2025  11:42:52 -0500 (0:00:00.031)       0:00:50.766 ***** 
ok: [managed-node1] => {
    "changed": false,
    "gid": 0,
    "group": "root",
    "mode": "0755",
    "owner": "root",
    "path": "/etc/containers/systemd",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 103,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:53 -0500 (0:00:00.426)       0:00:51.192 ***** 
changed: [managed-node1] => {
    "changed": true,
    "checksum": "d681c7d56f912150d041873e880818b22a90c188",
    "dest": "/etc/containers/systemd/envoy-proxy-configmap.yml",
    "gid": 0,
    "group": "root",
    "md5sum": "aec75d972c231aac004e1338934544cf",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 2102,
    "src": "/root/.ansible/tmp/ansible-tmp-1739637773.3253858-21862-27786019115190/.source.yml",
    "state": "file",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58
Saturday 15 February 2025  11:42:54 -0500 (0:00:00.821)       0:00:52.014 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
Saturday 15 February 2025  11:42:54 -0500 (0:00:00.055)       0:00:52.069 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_copy_file is skipped",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Reload systemctl] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82
Saturday 15 February 2025  11:42:54 -0500 (0:00:00.069)       0:00:52.138 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Start service] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.850)       0:00:52.989 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_service_name | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Restart service] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.059)       0:00:53.048 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_service_name | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.062)       0:00:53.111 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "---\napiVersion: v1\nkind: PersistentVolumeClaim\nmetadata:\n  name: wp-pv-claim\n  labels:\n    app: wordpress\nspec:\n  accessModes:\n  - ReadWriteOnce\n  resources:\n    requests:\n      storage: 20Gi\n---\napiVersion: v1\nkind: Pod\nmetadata:\n  name: quadlet-demo\nspec:\n  containers:\n  - name: wordpress\n    image: quay.io/linux-system-roles/wordpress:4.8-apache\n    env:\n    - name: WORDPRESS_DB_HOST\n      value: quadlet-demo-mysql\n    - name: WORDPRESS_DB_PASSWORD\n      valueFrom:\n        secretKeyRef:\n          name: mysql-root-password-kube\n          key: password\n    volumeMounts:\n    - name: wordpress-persistent-storage\n      mountPath: /var/www/html\n    resources:\n      requests:\n        memory: \"64Mi\"\n        cpu: \"250m\"\n      limits:\n        memory: \"128Mi\"\n        cpu: \"500m\"\n  - name: envoy\n    image: quay.io/linux-system-roles/envoyproxy:v1.25.0\n    volumeMounts:\n    - name: config-volume\n      mountPath: /etc/envoy\n    - name: certificates\n      mountPath: /etc/envoy-certificates\n    env:\n    - name: ENVOY_UID\n      value: \"0\"\n    resources:\n      requests:\n        memory: \"64Mi\"\n        cpu: \"250m\"\n      limits:\n        memory: \"128Mi\"\n        cpu: \"500m\"\n  volumes:\n  - name: config-volume\n    configMap:\n      name: envoy-proxy-config\n  - name: certificates\n    secret:\n      secretName: envoy-certificates\n  - name: wordpress-persistent-storage\n    persistentVolumeClaim:\n      claimName: wp-pv-claim\n  - name: www  # not used - for testing hostpath\n    hostPath:\n      path: /tmp/httpd3\n  - name: create  # not used - for testing hostpath\n    hostPath:\n      path: /tmp/httpd3-create\n",
        "__podman_quadlet_template_src": "quadlet-demo.yml.j2"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.196)       0:00:53.308 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "created",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.084)       0:00:53.392 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_str",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.067)       0:00:53.459 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo",
        "__podman_quadlet_type": "yml",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.090)       0:00:53.550 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.094)       0:00:53.645 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.058)       0:00:53.704 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.070)       0:00:53.775 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:42:55 -0500 (0:00:00.094)       0:00:53.869 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:42:56 -0500 (0:00:00.487)       0:00:54.357 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:42:56 -0500 (0:00:00.104)       0:00:54.461 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:42:56 -0500 (0:00:00.089)       0:00:54.551 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:42:56 -0500 (0:00:00.052)       0:00:54.603 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:42:56 -0500 (0:00:00.136)       0:00:54.740 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:42:56 -0500 (0:00:00.068)       0:00:54.808 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:42:56 -0500 (0:00:00.056)       0:00:54.864 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.074)       0:00:54.939 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.101)       0:00:55.040 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.124)       0:00:55.164 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.046)       0:00:55.211 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_kube_yamls_raw | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.047)       0:00:55.258 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo.yml",
        "__podman_volumes": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.082)       0:00:55.341 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.038)       0:00:55.380 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state == \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.031)       0:00:55.411 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.088)       0:00:55.500 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.088)       0:00:55.588 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.050)       0:00:55.638 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.049)       0:00:55.688 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create host directories] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.052)       0:00:55.740 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}
TASK [fedora.linux_system_roles.podman : Ensure container images are present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.052)       0:00:55.793 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
Saturday 15 February 2025  11:42:57 -0500 (0:00:00.059)       0:00:55.852 ***** 
ok: [managed-node1] => {
    "changed": false,
    "gid": 0,
    "group": "root",
    "mode": "0755",
    "owner": "root",
    "path": "/etc/containers/systemd",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 136,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48
Saturday 15 February 2025  11:42:58 -0500 (0:00:00.489)       0:00:56.342 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_quadlet_file_src | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58
Saturday 15 February 2025  11:42:58 -0500 (0:00:00.073)       0:00:56.415 ***** 
changed: [managed-node1] => {
    "changed": true,
    "checksum": "998dccde0483b1654327a46ddd89cbaa47650370",
    "dest": "/etc/containers/systemd/quadlet-demo.yml",
    "gid": 0,
    "group": "root",
    "md5sum": "fd890594adfc24339cb9cdc5e7b19a66",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 1605,
    "src": "/root/.ansible/tmp/ansible-tmp-1739637778.5718408-22093-261865611902226/.source.yml",
    "state": "file",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
Saturday 15 February 2025  11:42:59 -0500 (0:00:00.824)       0:00:57.240 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_copy_content is skipped",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Reload systemctl] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82
Saturday 15 February 2025  11:42:59 -0500 (0:00:00.033)       0:00:57.274 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Start service] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.912)       0:00:58.186 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_service_name | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Restart service] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.035)       0:00:58.221 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_service_name | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.041)       0:00:58.263 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "quadlet-demo.kube",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "[Install]\nWantedBy=default.target\n\n[Unit]\nRequires=quadlet-demo-mysql.service\nAfter=quadlet-demo-mysql.service\n\n[Kube]\n# Point to the yaml file in the same directory\nYaml=quadlet-demo.yml\n# Use the quadlet-demo network\nNetwork=quadlet-demo.network\n# Publish the envoy proxy data port\nPublishPort=8000:8080\n# Publish the envoy proxy admin port\nPublishPort=9000:9901\n# Use the envoy proxy config map in the same directory\nConfigMap=envoy-proxy-configmap.yml",
        "__podman_quadlet_template_src": ""
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.061)       0:00:58.324 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "created",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.059)       0:00:58.384 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.037)       0:00:58.421 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo",
        "__podman_quadlet_type": "kube",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.055)       0:00:58.477 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.100)       0:00:58.578 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.059)       0:00:58.638 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.055)       0:00:58.693 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:43:00 -0500 (0:00:00.089)       0:00:58.783 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:43:01 -0500 (0:00:00.444)       0:00:59.228 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:43:01 -0500 (0:00:00.055)       0:00:59.283 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:43:01 -0500 (0:00:00.053)       0:00:59.336 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:43:01 -0500 (0:00:00.057)       0:00:59.393 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:43:01 -0500 (0:00:00.088)       0:00:59.482 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:43:01 -0500 (0:00:00.104)       0:00:59.586 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:43:01 -0500 (0:00:00.112)       0:00:59.698 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:43:01 -0500 (0:00:00.168)       0:00:59.867 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:43:02 -0500 (0:00:00.084)       0:00:59.952 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": [
            "quadlet-demo.yml"
        ],
        "__podman_service_name": "quadlet-demo.service",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:43:02 -0500 (0:00:00.120)       0:01:00.072 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:43:02 -0500 (0:00:00.047)       0:01:00.120 ***** 
ok: [managed-node1] => {
    "changed": false,
    "content": "LS0tCmFwaVZlcnNpb246IHYxCmtpbmQ6IFBlcnNpc3RlbnRWb2x1bWVDbGFpbQptZXRhZGF0YToKICBuYW1lOiB3cC1wdi1jbGFpbQogIGxhYmVsczoKICAgIGFwcDogd29yZHByZXNzCnNwZWM6CiAgYWNjZXNzTW9kZXM6CiAgLSBSZWFkV3JpdGVPbmNlCiAgcmVzb3VyY2VzOgogICAgcmVxdWVzdHM6CiAgICAgIHN0b3JhZ2U6IDIwR2kKLS0tCmFwaVZlcnNpb246IHYxCmtpbmQ6IFBvZAptZXRhZGF0YToKICBuYW1lOiBxdWFkbGV0LWRlbW8Kc3BlYzoKICBjb250YWluZXJzOgogIC0gbmFtZTogd29yZHByZXNzCiAgICBpbWFnZTogcXVheS5pby9saW51eC1zeXN0ZW0tcm9sZXMvd29yZHByZXNzOjQuOC1hcGFjaGUKICAgIGVudjoKICAgIC0gbmFtZTogV09SRFBSRVNTX0RCX0hPU1QKICAgICAgdmFsdWU6IHF1YWRsZXQtZGVtby1teXNxbAogICAgLSBuYW1lOiBXT1JEUFJFU1NfREJfUEFTU1dPUkQKICAgICAgdmFsdWVGcm9tOgogICAgICAgIHNlY3JldEtleVJlZjoKICAgICAgICAgIG5hbWU6IG15c3FsLXJvb3QtcGFzc3dvcmQta3ViZQogICAgICAgICAga2V5OiBwYXNzd29yZAogICAgdm9sdW1lTW91bnRzOgogICAgLSBuYW1lOiB3b3JkcHJlc3MtcGVyc2lzdGVudC1zdG9yYWdlCiAgICAgIG1vdW50UGF0aDogL3Zhci93d3cvaHRtbAogICAgcmVzb3VyY2VzOgogICAgICByZXF1ZXN0czoKICAgICAgICBtZW1vcnk6ICI2NE1pIgogICAgICAgIGNwdTogIjI1MG0iCiAgICAgIGxpbWl0czoKICAgICAgICBtZW1vcnk6ICIxMjhNaSIKICAgICAgICBjcHU6ICI1MDBtIgogIC0gbmFtZTogZW52b3kKICAgIGltYWdlOiBxdWF5LmlvL2xpbnV4LXN5c3RlbS1yb2xlcy9lbnZveXByb3h5OnYxLjI1LjAKICAgIHZvbHVtZU1vdW50czoKICAgIC0gbmFtZTogY29uZmlnLXZvbHVtZQogICAgICBtb3VudFBhdGg6IC9ldGMvZW52b3kKICAgIC0gbmFtZTogY2VydGlmaWNhdGVzCiAgICAgIG1vdW50UGF0aDogL2V0Yy9lbnZveS1jZXJ0aWZpY2F0ZXMKICAgIGVudjoKICAgIC0gbmFtZTogRU5WT1lfVUlECiAgICAgIHZhbHVlOiAiMCIKICAgIHJlc291cmNlczoKICAgICAgcmVxdWVzdHM6CiAgICAgICAgbWVtb3J5OiAiNjRNaSIKICAgICAgICBjcHU6ICIyNTBtIgogICAgICBsaW1pdHM6CiAgICAgICAgbWVtb3J5OiAiMTI4TWkiCiAgICAgICAgY3B1OiAiNTAwbSIKICB2b2x1bWVzOgogIC0gbmFtZTogY29uZmlnLXZvbHVtZQogICAgY29uZmlnTWFwOgogICAgICBuYW1lOiBlbnZveS1wcm94eS1jb25maWcKICAtIG5hbWU6IGNlcnRpZmljYXRlcwogICAgc2VjcmV0OgogICAgICBzZWNyZXROYW1lOiBlbnZveS1jZXJ0aWZpY2F0ZXMKICAtIG5hbWU6IHdvcmRwcmVzcy1wZXJzaXN0ZW50LXN0b3JhZ2UKICAgIHBlcnNpc3RlbnRWb2x1bWVDbGFpbToKICAgICAgY2xhaW1OYW1lOiB3cC1wdi1jbGFpbQogIC0gbmFtZTogd3d3ICAjIG5vdCB1c2VkIC0gZm9yIHRlc3RpbmcgaG9zdHBhdGgKICAgIGhvc3RQYXRoOgogICAgICBwYXRoOiAvdG1wL2h0dHBkMwogIC0gbmFtZTogY3JlYXRlICAjIG5vdCB1c2VkIC0gZm9yIHRlc3RpbmcgaG9zdHBhdGgKICAgIGhvc3RQYXRoOgogICAgICBwYXRoOiAvdG1wL2h0dHBkMy1jcmVhdGUK",
    "encoding": "base64",
    "source": "/etc/containers/systemd/quadlet-demo.yml"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:43:02 -0500 (0:00:00.460)       0:01:00.580 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [
            "quay.io/linux-system-roles/wordpress:4.8-apache",
            "quay.io/linux-system-roles/envoyproxy:v1.25.0"
        ],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo.kube",
        "__podman_volumes": [
            "/tmp/httpd3",
            "/tmp/httpd3-create"
        ]
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:43:02 -0500 (0:00:00.107)       0:01:00.688 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:43:02 -0500 (0:00:00.043)       0:01:00.732 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state == \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:43:02 -0500 (0:00:00.033)       0:01:00.765 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2
Saturday 15 February 2025  11:43:02 -0500 (0:00:00.070)       0:01:00.835 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:43:02 -0500 (0:00:00.053)       0:01:00.888 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:43:03 -0500 (0:00:00.031)       0:01:00.919 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:43:03 -0500 (0:00:00.029)       0:01:00.949 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Create host directories] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7
Saturday 15 February 2025  11:43:03 -0500 (0:00:00.031)       0:01:00.981 ***** 
changed: [managed-node1] => (item=/tmp/httpd3) => {
    "ansible_loop_var": "item",
    "changed": true,
    "gid": 0,
    "group": "root",
    "item": "/tmp/httpd3",
    "mode": "0755",
    "owner": "root",
    "path": "/tmp/httpd3",
    "secontext": "unconfined_u:object_r:user_tmp_t:s0",
    "size": 6,
    "state": "directory",
    "uid": 0
}
changed: [managed-node1] => (item=/tmp/httpd3-create) => {
    "ansible_loop_var": "item",
    "changed": true,
    "gid": 0,
    "group": "root",
    "item": "/tmp/httpd3-create",
    "mode": "0755",
    "owner": "root",
    "path": "/tmp/httpd3-create",
    "secontext": "unconfined_u:object_r:user_tmp_t:s0",
    "size": 6,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure container images are present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18
Saturday 15 February 2025  11:43:03 -0500 (0:00:00.796)       0:01:01.777 ***** 
changed: [managed-node1] => (item=None) => {
    "attempts": 1,
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
changed: [managed-node1] => (item=None) => {
    "attempts": 1,
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
Saturday 15 February 2025  11:43:21 -0500 (0:00:17.876)       0:01:19.653 ***** 
ok: [managed-node1] => {
    "changed": false,
    "gid": 0,
    "group": "root",
    "mode": "0755",
    "owner": "root",
    "path": "/etc/containers/systemd",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 160,
    "state": "directory",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48
Saturday 15 February 2025  11:43:22 -0500 (0:00:00.423)       0:01:20.077 ***** 
changed: [managed-node1] => {
    "changed": true,
    "checksum": "7a5c73a5d935a42431c87bcdbeb8a04ed0909dc7",
    "dest": "/etc/containers/systemd/quadlet-demo.kube",
    "gid": 0,
    "group": "root",
    "md5sum": "da53c88f92b68b0487aa209f795b6bb3",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 456,
    "src": "/root/.ansible/tmp/ansible-tmp-1739637802.223046-23059-18128663460698/.source.kube",
    "state": "file",
    "uid": 0
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58
Saturday 15 February 2025  11:43:22 -0500 (0:00:00.760)       0:01:20.837 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
Saturday 15 February 2025  11:43:22 -0500 (0:00:00.034)       0:01:20.872 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_copy_file is skipped",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Reload systemctl] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82
Saturday 15 February 2025  11:43:22 -0500 (0:00:00.032)       0:01:20.904 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Start service] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110
Saturday 15 February 2025  11:43:23 -0500 (0:00:00.888)       0:01:21.792 ***** 
changed: [managed-node1] => {
    "changed": true,
    "name": "quadlet-demo.service",
    "state": "started",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0",
        "ActiveEnterTimestampMonotonic": "0",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "inactive",
        "After": "sysinit.target network-online.target -.mount quadlet-demo-mysql.service basic.target system.slice systemd-journald.socket quadlet-demo-network.service",
        "AllowIsolate": "no",
        "AssertResult": "no",
        "AssertTimestampMonotonic": "0",
        "Before": "multi-user.target shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "no",
        "ConditionTimestampMonotonic": "0",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "shutdown.target",
        "ControlGroupId": "0",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "quadlet-demo.service",
        "DevicePolicy": "auto",
        "DynamicUser": "no",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveTasksMax": "22347",
        "Environment": "PODMAN_SYSTEMD_UNIT=quadlet-demo.service",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestampMonotonic": "0",
        "ExecMainPID": "0",
        "ExecMainStartTimestampMonotonic": "0",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman kube play --replace --service-container=true --network systemd-quadlet-demo --configmap /etc/containers/systemd/envoy-proxy-configmap.yml --publish 8000:8080 --publish 9000:9901 /etc/containers/systemd/quadlet-demo.yml ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman kube play --replace --service-container=true --network systemd-quadlet-demo --configmap /etc/containers/systemd/envoy-proxy-configmap.yml --publish 8000:8080 --publish 9000:9901 /etc/containers/systemd/quadlet-demo.yml ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStopPost": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman kube down /etc/containers/systemd/quadlet-demo.yml ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStopPostEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman kube down /etc/containers/systemd/quadlet-demo.yml ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/run/systemd/generator/quadlet-demo.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "quadlet-demo.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestampMonotonic": "0",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "mixed",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "0",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "2562686976",
        "MemoryCurrent": "[not set]",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "[not set]",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "[not set]",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "[not set]",
        "MemoryZSwapCurrent": "[not set]",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "quadlet-demo.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "all",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "no",
        "ProtectControlGroups": "no",
        "ProtectControlGroupsEx": "no",
        "ProtectHome": "no",
        "ProtectHostname": "no",
        "ProtectKernelLogs": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "quadlet-demo-mysql.service system.slice sysinit.target -.mount quadlet-demo-network.service",
        "RequiresMountsFor": "/run/containers",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "SourcePath": "/etc/containers/systemd/quadlet-demo.kube",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestampMonotonic": "0",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "dead",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogIdentifier": "quadlet-demo",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "[not set]",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "notify",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "generated",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "Wants": "network-online.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "infinity"
    }
}
TASK [fedora.linux_system_roles.podman : Restart service] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125
Saturday 15 February 2025  11:43:25 -0500 (0:00:01.162)       0:01:22.955 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_service_started is changed",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Cancel linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:196
Saturday 15 February 2025  11:43:25 -0500 (0:00:00.052)       0:01:23.008 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}
TASK [fedora.linux_system_roles.podman : Handle credential files - absent] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:202
Saturday 15 February 2025  11:43:25 -0500 (0:00:00.047)       0:01:23.056 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:211
Saturday 15 February 2025  11:43:25 -0500 (0:00:00.050)       0:01:23.106 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [Check quadlet files] *****************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:96
Saturday 15 February 2025  11:43:25 -0500 (0:00:00.077)       0:01:23.183 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "ls",
        "-alrtF",
        "/etc/containers/systemd"
    ],
    "delta": "0:00:00.004394",
    "end": "2025-02-15 11:43:25.712676",
    "rc": 0,
    "start": "2025-02-15 11:43:25.708282"
}
STDOUT:
total 24
drwxr-xr-x. 9 root root  178 Feb 15 11:38 ../
-rw-r--r--. 1 root root   74 Feb 15 11:42 quadlet-demo.network
-rw-r--r--. 1 root root    9 Feb 15 11:42 quadlet-demo-mysql.volume
-rw-r--r--. 1 root root  363 Feb 15 11:42 quadlet-demo-mysql.container
-rw-r--r--. 1 root root 2102 Feb 15 11:42 envoy-proxy-configmap.yml
-rw-r--r--. 1 root root 1605 Feb 15 11:42 quadlet-demo.yml
-rw-r--r--. 1 root root  456 Feb 15 11:43 quadlet-demo.kube
drwxr-xr-x. 2 root root  185 Feb 15 11:43 ./
TASK [Check containers] ********************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:100
Saturday 15 February 2025  11:43:25 -0500 (0:00:00.573)       0:01:23.757 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "ps",
        "-a"
    ],
    "delta": "0:00:00.079242",
    "end": "2025-02-15 11:43:26.356683",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-02-15 11:43:26.277441"
}
STDOUT:
CONTAINER ID  IMAGE                                            COMMAND               CREATED         STATUS                   PORTS                                                      NAMES
a5dec710fc02  quay.io/libpod/registry:2.8.2                    /etc/docker/regis...  5 minutes ago   Up 5 minutes             127.0.0.1:5000->5000/tcp                                   podman_registry
3aa715549e27  quay.io/linux-system-roles/mysql:5.6             mysqld                35 seconds ago  Up 36 seconds (healthy)  3306/tcp                                                   quadlet-demo-mysql
381dcd9d839f  localhost/podman-pause:5.3.1-1733097600                                1 second ago    Up 2 seconds                                                                        a96f3a51b8d1-service
a5c265751b66  localhost/podman-pause:5.3.1-1733097600                                1 second ago    Up 2 seconds             0.0.0.0:8000->8080/tcp, 0.0.0.0:9000->9901/tcp             5bc2e99d5832-infra
8454ef009c74  quay.io/linux-system-roles/wordpress:4.8-apache  apache2-foregroun...  1 second ago    Up 2 seconds             0.0.0.0:8000->8080/tcp, 0.0.0.0:9000->9901/tcp, 80/tcp     quadlet-demo-wordpress
0cdb8c8a7208  quay.io/linux-system-roles/envoyproxy:v1.25.0    envoy -c /etc/env...  1 second ago    Up 2 seconds             0.0.0.0:8000->8080/tcp, 0.0.0.0:9000->9901/tcp, 10000/tcp  quadlet-demo-envoy
TASK [Check volumes] ***********************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:105
Saturday 15 February 2025  11:43:26 -0500 (0:00:00.609)       0:01:24.366 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "volume",
        "ls"
    ],
    "delta": "0:00:00.032530",
    "end": "2025-02-15 11:43:26.829858",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-02-15 11:43:26.797328"
}
STDOUT:
DRIVER      VOLUME NAME
local       2fefbc9190adbe5ffeef28f6e938304e8cedd541e0d51e013b7101e905a15702
local       systemd-quadlet-demo-mysql
local       wp-pv-claim
local       envoy-proxy-config
local       envoy-certificates
TASK [Check pods] **************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:110
Saturday 15 February 2025  11:43:26 -0500 (0:00:00.449)       0:01:24.816 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "pod",
        "ps",
        "--ctr-ids",
        "--ctr-names",
        "--ctr-status"
    ],
    "delta": "0:00:00.037042",
    "end": "2025-02-15 11:43:27.305249",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-02-15 11:43:27.268207"
}
STDOUT:
POD ID        NAME          STATUS      CREATED        INFRA ID      IDS                                     NAMES                                                         STATUS
5bc2e99d5832  quadlet-demo  Running     2 seconds ago  a5c265751b66  a5c265751b66,8454ef009c74,0cdb8c8a7208  5bc2e99d5832-infra,quadlet-demo-wordpress,quadlet-demo-envoy  running,running,running
TASK [Check systemd] ***********************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:115
Saturday 15 February 2025  11:43:27 -0500 (0:00:00.486)       0:01:25.302 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": "set -euo pipefail; systemctl list-units | grep quadlet",
    "delta": "0:00:00.014737",
    "end": "2025-02-15 11:43:27.722611",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-02-15 11:43:27.707874"
}
STDOUT:
  quadlet-demo-mysql-volume.service                                                                                                    loaded active exited    quadlet-demo-mysql-volume.service
  quadlet-demo-mysql.service                                                                                                           loaded active running   quadlet-demo-mysql.service
  quadlet-demo-network.service                                                                                                         loaded active exited    quadlet-demo-network.service
  quadlet-demo.service                                                                                                                 loaded active running   quadlet-demo.service
TASK [Check web] ***************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:121
Saturday 15 February 2025  11:43:27 -0500 (0:00:00.471)       0:01:25.773 ***** 
changed: [managed-node1] => {
    "attempts": 1,
    "changed": true,
    "checksum_dest": null,
    "checksum_src": "d1ac587ee4653b36ed40791b2bca2a83cf8cb157",
    "dest": "/run/out",
    "elapsed": 0,
    "gid": 0,
    "group": "root",
    "md5sum": "95e8238992037c7b6b6decebba46e982",
    "mode": "0600",
    "owner": "root",
    "secontext": "system_u:object_r:var_run_t:s0",
    "size": 11666,
    "src": "/root/.ansible/tmp/ansible-tmp-1739637807.9343338-23305-149762986720323/tmp_p1n1wr9",
    "state": "file",
    "status_code": 200,
    "uid": 0,
    "url": "https://localhost:8000"
}
MSG:
OK (unknown bytes)
TASK [Show web] ****************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:132
Saturday 15 February 2025  11:43:29 -0500 (0:00:01.436)       0:01:27.210 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "cat",
        "/run/out"
    ],
    "delta": "0:00:00.003054",
    "end": "2025-02-15 11:43:29.626228",
    "rc": 0,
    "start": "2025-02-15 11:43:29.623174"
}
STDOUT:
	
	
	
	WordPress › Installation
	
WordPress
	
TASK [Error] *******************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:137
Saturday 15 February 2025  11:43:29 -0500 (0:00:00.421)       0:01:27.631 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__web_status is failed",
    "skip_reason": "Conditional result was False"
}
TASK [Check] *******************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:148
Saturday 15 February 2025  11:43:29 -0500 (0:00:00.059)       0:01:27.691 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "ps",
        "-a"
    ],
    "delta": "0:00:00.036774",
    "end": "2025-02-15 11:43:30.154136",
    "rc": 0,
    "start": "2025-02-15 11:43:30.117362"
}
STDOUT:
CONTAINER ID  IMAGE                                            COMMAND               CREATED         STATUS                   PORTS                                                      NAMES
a5dec710fc02  quay.io/libpod/registry:2.8.2                    /etc/docker/regis...  5 minutes ago   Up 5 minutes             127.0.0.1:5000->5000/tcp                                   podman_registry
3aa715549e27  quay.io/linux-system-roles/mysql:5.6             mysqld                39 seconds ago  Up 40 seconds (healthy)  3306/tcp                                                   quadlet-demo-mysql
381dcd9d839f  localhost/podman-pause:5.3.1-1733097600                                5 seconds ago   Up 6 seconds                                                                        a96f3a51b8d1-service
a5c265751b66  localhost/podman-pause:5.3.1-1733097600                                5 seconds ago   Up 6 seconds             0.0.0.0:8000->8080/tcp, 0.0.0.0:9000->9901/tcp             5bc2e99d5832-infra
8454ef009c74  quay.io/linux-system-roles/wordpress:4.8-apache  apache2-foregroun...  5 seconds ago   Up 6 seconds             0.0.0.0:8000->8080/tcp, 0.0.0.0:9000->9901/tcp, 80/tcp     quadlet-demo-wordpress
0cdb8c8a7208  quay.io/linux-system-roles/envoyproxy:v1.25.0    envoy -c /etc/env...  5 seconds ago   Up 6 seconds             0.0.0.0:8000->8080/tcp, 0.0.0.0:9000->9901/tcp, 10000/tcp  quadlet-demo-envoy
TASK [Check pods] **************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:152
Saturday 15 February 2025  11:43:30 -0500 (0:00:00.497)       0:01:28.188 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "pod",
        "ps",
        "--ctr-ids",
        "--ctr-names",
        "--ctr-status"
    ],
    "delta": "0:00:00.037030",
    "end": "2025-02-15 11:43:30.698369",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-02-15 11:43:30.661339"
}
STDOUT:
POD ID        NAME          STATUS      CREATED        INFRA ID      IDS                                     NAMES                                                         STATUS
5bc2e99d5832  quadlet-demo  Running     6 seconds ago  a5c265751b66  a5c265751b66,8454ef009c74,0cdb8c8a7208  5bc2e99d5832-infra,quadlet-demo-wordpress,quadlet-demo-envoy  running,running,running
TASK [Check systemd] ***********************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:157
Saturday 15 February 2025  11:43:30 -0500 (0:00:00.557)       0:01:28.746 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": "set -euo pipefail; systemctl list-units --all | grep quadlet",
    "delta": "0:00:00.014548",
    "end": "2025-02-15 11:43:31.239510",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-02-15 11:43:31.224962"
}
STDOUT:
  quadlet-demo-mysql-volume.service                                                                                                    loaded    active   exited    quadlet-demo-mysql-volume.service
  quadlet-demo-mysql.service                                                                                                           loaded    active   running   quadlet-demo-mysql.service
  quadlet-demo-network.service                                                                                                         loaded    active   exited    quadlet-demo-network.service
  quadlet-demo.service                                                                                                                 loaded    active   running   quadlet-demo.service
TASK [LS] **********************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:165
Saturday 15 February 2025  11:43:31 -0500 (0:00:00.514)       0:01:29.260 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "ls",
        "-alrtF",
        "/etc/systemd/system"
    ],
    "delta": "0:00:00.004401",
    "end": "2025-02-15 11:43:31.709660",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-02-15 11:43:31.705259"
}
STDOUT:
total 12
drwxr-xr-x.  5 root root   47 Feb 11 06:03 ../
lrwxrwxrwx.  1 root root   43 Feb 11 06:03 dbus.service -> /usr/lib/systemd/system/dbus-broker.service
drwxr-xr-x.  2 root root   32 Feb 11 06:03 getty.target.wants/
lrwxrwxrwx.  1 root root   37 Feb 11 06:03 ctrl-alt-del.target -> /usr/lib/systemd/system/reboot.target
drwxr-xr-x.  2 root root   48 Feb 11 06:04 network-online.target.wants/
lrwxrwxrwx.  1 root root   57 Feb 11 06:04 dbus-org.freedesktop.nm-dispatcher.service -> /usr/lib/systemd/system/NetworkManager-dispatcher.service
drwxr-xr-x.  2 root root   76 Feb 11 06:04 timers.target.wants/
drwxr-xr-x.  2 root root   38 Feb 11 06:04 dev-virtio\x2dports-org.qemu.guest_agent.0.device.wants/
lrwxrwxrwx.  1 root root   41 Feb 11 06:07 default.target -> /usr/lib/systemd/system/multi-user.target
drwxr-xr-x.  2 root root   31 Feb 11 06:56 remote-fs.target.wants/
drwxr-xr-x.  2 root root  119 Feb 11 06:57 cloud-init.target.wants/
drwxr-xr-x.  2 root root 4096 Feb 11 06:57 sysinit.target.wants/
drwxr-xr-x.  2 root root  143 Feb 15 11:37 sockets.target.wants/
drwxr-xr-x.  2 root root 4096 Feb 15 11:42 multi-user.target.wants/
lrwxrwxrwx.  1 root root   41 Feb 15 11:42 dbus-org.fedoraproject.FirewallD1.service -> /usr/lib/systemd/system/firewalld.service
drwxr-xr-x. 11 root root 4096 Feb 15 11:42 ./
TASK [Cleanup] *****************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:172
Saturday 15 February 2025  11:43:31 -0500 (0:00:00.472)       0:01:29.733 ***** 
included: fedora.linux_system_roles.podman for managed-node1
TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Saturday 15 February 2025  11:43:31 -0500 (0:00:00.157)       0:01:29.891 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Saturday 15 February 2025  11:43:32 -0500 (0:00:00.156)       0:01:30.048 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11
Saturday 15 February 2025  11:43:32 -0500 (0:00:00.062)       0:01:30.110 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16
Saturday 15 February 2025  11:43:32 -0500 (0:00:00.062)       0:01:30.173 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check if transactional-update exists in /sbin] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:23
Saturday 15 February 2025  11:43:32 -0500 (0:00:00.052)       0:01:30.225 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_is_transactional is defined",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set flag if transactional-update exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:28
Saturday 15 February 2025  11:43:32 -0500 (0:00:00.063)       0:01:30.289 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_is_transactional is defined",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:32
Saturday 15 February 2025  11:43:32 -0500 (0:00:00.072)       0:01:30.362 ***** 
ok: [managed-node1] => (item=RedHat.yml) => {
    "ansible_facts": {
        "__podman_packages": [
            "podman",
            "shadow-utils-subid"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml"
}
skipping: [managed-node1] => (item=CentOS.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node1] => (item=CentOS_10.yml) => {
    "ansible_facts": {
        "__podman_packages": [
            "iptables-nft",
            "podman",
            "shadow-utils-subid"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_10.yml"
}
ok: [managed-node1] => (item=CentOS_10.yml) => {
    "ansible_facts": {
        "__podman_packages": [
            "iptables-nft",
            "podman",
            "shadow-utils-subid"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_10.yml"
}
TASK [fedora.linux_system_roles.podman : Gather the package facts] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Saturday 15 February 2025  11:43:32 -0500 (0:00:00.131)       0:01:30.493 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Enable copr if requested] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10
Saturday 15 February 2025  11:43:33 -0500 (0:00:00.955)       0:01:31.449 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_use_copr | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14
Saturday 15 February 2025  11:43:33 -0500 (0:00:00.050)       0:01:31.499 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "(__podman_packages | difference(ansible_facts.packages))",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Notify user that reboot is needed to apply changes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28
Saturday 15 February 2025  11:43:33 -0500 (0:00:00.045)       0:01:31.545 ***** 
skipping: [managed-node1] => {
    "false_condition": "__podman_is_transactional | d(false)"
}
TASK [fedora.linux_system_roles.podman : Reboot transactional update systems] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:33
Saturday 15 February 2025  11:43:33 -0500 (0:00:00.037)       0:01:31.583 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if reboot is needed and not set] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:38
Saturday 15 February 2025  11:43:33 -0500 (0:00:00.040)       0:01:31.624 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get podman version] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:46
Saturday 15 February 2025  11:43:33 -0500 (0:00:00.036)       0:01:31.660 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "--version"
    ],
    "delta": "0:00:00.023863",
    "end": "2025-02-15 11:43:34.079994",
    "rc": 0,
    "start": "2025-02-15 11:43:34.056131"
}
STDOUT:
podman version 5.3.1
TASK [fedora.linux_system_roles.podman : Set podman version] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:52
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.441)       0:01:32.101 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "podman_version": "5.3.1"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.066)       0:01:32.168 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_version is version(\"4.2\", \"<\")",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:63
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.079)       0:01:32.247 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_version is version(\"4.4\", \"<\")",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:73
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.074)       0:01:32.322 ***** 
META: end_host conditional evaluated to False, continuing execution for managed-node1
skipping: [managed-node1] => {
    "skip_reason": "end_host conditional evaluated to False, continuing execution for managed-node1"
}
MSG:
end_host conditional evaluated to false, continuing execution for managed-node1
TASK [fedora.linux_system_roles.podman : Podman package version must be 5.0 or later for Pod quadlets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.127)       0:01:32.450 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__has_type_pod or __has_pod_file_ext or __has_pod_file_src_ext or __has_pod_template_src_ext or __has_pod_template_src_ext_j2",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Podman package version must be 5.0 or later for Pod quadlets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:96
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.084)       0:01:32.534 ***** 
META: end_host conditional evaluated to False, continuing execution for managed-node1
skipping: [managed-node1] => {
    "skip_reason": "end_host conditional evaluated to False, continuing execution for managed-node1"
}
MSG:
end_host conditional evaluated to false, continuing execution for managed-node1
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:109
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.085)       0:01:32.620 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.103)       0:01:32.724 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.055)       0:01:32.779 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.061)       0:01:32.841 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:43:34 -0500 (0:00:00.071)       0:01:32.912 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.395)       0:01:33.307 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.054)       0:01:33.361 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.055)       0:01:33.417 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.053)       0:01:33.470 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.055)       0:01:33.526 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.053)       0:01:33.580 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.038)       0:01:33.618 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.039)       0:01:33.657 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set config file paths] ****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:115
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.040)       0:01:33.698 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf",
        "__podman_policy_json_file": "/etc/containers/policy.json",
        "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf",
        "__podman_storage_conf_file": "/etc/containers/storage.conf"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:124
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.044)       0:01:33.743 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.058)       0:01:33.801 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_containers_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Update container config file] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13
Saturday 15 February 2025  11:43:35 -0500 (0:00:00.080)       0:01:33.882 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_containers_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:127
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.035)       0:01:33.918 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.088)       0:01:34.006 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_registries_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Update registries config file] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.051)       0:01:34.058 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_registries_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Handle storage.conf] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:130
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.048)       0:01:34.106 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.072)       0:01:34.178 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_storage_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Update storage config file] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.046)       0:01:34.225 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_storage_conf | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Handle policy.json] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:133
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.038)       0:01:34.263 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.068)       0:01:34.332 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.032)       0:01:34.365 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get the existing policy.json] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.033)       0:01:34.398 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Write new policy.json file] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.031)       0:01:34.430 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [Manage firewall for specified ports] *************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:139
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.034)       0:01:34.465 ***** 
included: fedora.linux_system_roles.firewall for managed-node1
TASK [fedora.linux_system_roles.firewall : Setup firewalld] ********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:2
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.123)       0:01:34.588 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml for managed-node1
TASK [fedora.linux_system_roles.firewall : Ensure ansible_facts used by role] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:2
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.062)       0:01:34.651 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Check if system is ostree] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:10
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.088)       0:01:34.739 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __firewall_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Set flag to indicate system is ostree] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:15
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.032)       0:01:34.772 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __firewall_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Check if transactional-update exists in /sbin] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:22
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.036)       0:01:34.808 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __firewall_is_transactional is defined",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Set flag if transactional-update exists] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:27
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.041)       0:01:34.849 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __firewall_is_transactional is defined",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Install firewalld] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:31
Saturday 15 February 2025  11:43:36 -0500 (0:00:00.055)       0:01:34.905 ***** 
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG:
Nothing to do
lsrpackages: firewalld
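(For context: the "Nothing to do" result above comes from the role's package-install step, which on non-transactional systems is effectively an idempotent dnf install. A minimal sketch of an equivalent standalone task, assuming the stock ansible.builtin.dnf module; the role itself wraps this with ostree/transactional handling:)

```yaml
# Hypothetical standalone equivalent of the role's "Install firewalld" task.
# "ok" with rc=0 and empty results, as logged above, means the package
# was already present and dnf made no changes.
- name: Install firewalld
  ansible.builtin.dnf:
    name: firewalld
    state: present
```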
TASK [fedora.linux_system_roles.firewall : Notify user that reboot is needed to apply changes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:43
Saturday 15 February 2025  11:43:37 -0500 (0:00:00.817)       0:01:35.722 ***** 
skipping: [managed-node1] => {
    "false_condition": "__firewall_is_transactional | d(false)"
}
TASK [fedora.linux_system_roles.firewall : Reboot transactional update systems] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:48
Saturday 15 February 2025  11:43:37 -0500 (0:00:00.040)       0:01:35.763 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Fail if reboot is needed and not set] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:53
Saturday 15 February 2025  11:43:37 -0500 (0:00:00.038)       0:01:35.802 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Collect service facts] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:5
Saturday 15 February 2025  11:43:37 -0500 (0:00:00.037)       0:01:35.840 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "firewall_disable_conflicting_services | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Attempt to stop and disable conflicting services] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:9
Saturday 15 February 2025  11:43:37 -0500 (0:00:00.030)       0:01:35.870 ***** 
skipping: [managed-node1] => (item=nftables)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall_disable_conflicting_services | bool",
    "item": "nftables",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=iptables)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall_disable_conflicting_services | bool",
    "item": "iptables",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=ufw)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall_disable_conflicting_services | bool",
    "item": "ufw",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => {
    "changed": false
}
MSG:
All items skipped
TASK [fedora.linux_system_roles.firewall : Unmask firewalld service] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:22
Saturday 15 February 2025  11:43:38 -0500 (0:00:00.047)       0:01:35.917 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": "firewalld",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0",
        "ActiveEnterTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "ActiveEnterTimestampMonotonic": "937076210",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "active",
        "After": "dbus-broker.service dbus.socket system.slice polkit.service sysinit.target basic.target",
        "AllowIsolate": "no",
        "AssertResult": "yes",
        "AssertTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "AssertTimestampMonotonic": "936818345",
        "Before": "network-pre.target shutdown.target multi-user.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "BusName": "org.fedoraproject.FirewallD1",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "446122000",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "yes",
        "CanReload": "yes",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_tty_config cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "ConditionTimestampMonotonic": "936818341",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "iptables.service ip6tables.service shutdown.target ipset.service ebtables.service",
        "ControlGroup": "/system.slice/firewalld.service",
        "ControlGroupId": "154653",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "firewalld - dynamic firewall daemon",
        "DeviceAllow": "char-rtc r",
        "DevicePolicy": "closed",
        "Documentation": "\"man:firewalld(1)\"",
        "DynamicUser": "no",
        "EffectiveCPUs": "0-1",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveMemoryNodes": "0",
        "EffectiveTasksMax": "22347",
        "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "ExecMainHandoffTimestampMonotonic": "936850552",
        "ExecMainPID": "64827",
        "ExecMainStartTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "ExecMainStartTimestampMonotonic": "936821259",
        "ExecMainStatus": "0",
        "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/usr/lib/systemd/system/firewalld.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "firewalld.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "InactiveExitTimestampMonotonic": "936821958",
        "InvocationID": "48aac25ad79f402780638262db92b00c",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "mixed",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "yes",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "64827",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "2456469504",
        "MemoryCurrent": "33148928",
        "MemoryDenyWriteExecute": "yes",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "34549760",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "0",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "0",
        "MemoryZSwapCurrent": "0",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "firewalld.service dbus-org.fedoraproject.FirewallD1.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "yes",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "yes",
        "ProtectControlGroups": "yes",
        "ProtectControlGroupsEx": "yes",
        "ProtectHome": "yes",
        "ProtectHostname": "yes",
        "ProtectKernelLogs": "yes",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "yes",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "dbus.socket system.slice sysinit.target",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "yes",
        "RestrictSUIDSGID": "yes",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "StandardError": "null",
        "StandardInput": "null",
        "StandardOutput": "null",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestamp": "Sat 2025-02-15 11:43:23 EST",
        "StateChangeTimestampMonotonic": "998913488",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "running",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallArchitectures": "native",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "2",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "dbus",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "enabled",
        "UnitFileState": "enabled",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "Wants": "network-pre.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "0"
    }
}
TASK [fedora.linux_system_roles.firewall : Enable and start firewalld service] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:28
Saturday 15 February 2025  11:43:38 -0500 (0:00:00.539)       0:01:36.457 ***** 
ok: [managed-node1] => {
    "changed": false,
    "enabled": true,
    "name": "firewalld",
    "state": "started",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0",
        "ActiveEnterTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "ActiveEnterTimestampMonotonic": "937076210",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "active",
        "After": "dbus-broker.service dbus.socket system.slice polkit.service sysinit.target basic.target",
        "AllowIsolate": "no",
        "AssertResult": "yes",
        "AssertTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "AssertTimestampMonotonic": "936818345",
        "Before": "network-pre.target shutdown.target multi-user.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "BusName": "org.fedoraproject.FirewallD1",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "446122000",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "yes",
        "CanReload": "yes",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_tty_config cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "ConditionTimestampMonotonic": "936818341",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "iptables.service ip6tables.service shutdown.target ipset.service ebtables.service",
        "ControlGroup": "/system.slice/firewalld.service",
        "ControlGroupId": "154653",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "firewalld - dynamic firewall daemon",
        "DeviceAllow": "char-rtc r",
        "DevicePolicy": "closed",
        "Documentation": "\"man:firewalld(1)\"",
        "DynamicUser": "no",
        "EffectiveCPUs": "0-1",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveMemoryNodes": "0",
        "EffectiveTasksMax": "22347",
        "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "ExecMainHandoffTimestampMonotonic": "936850552",
        "ExecMainPID": "64827",
        "ExecMainStartTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "ExecMainStartTimestampMonotonic": "936821259",
        "ExecMainStatus": "0",
        "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/usr/lib/systemd/system/firewalld.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "firewalld.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestamp": "Sat 2025-02-15 11:42:21 EST",
        "InactiveExitTimestampMonotonic": "936821958",
        "InvocationID": "48aac25ad79f402780638262db92b00c",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "mixed",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "yes",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "64827",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "2459832320",
        "MemoryCurrent": "33148928",
        "MemoryDenyWriteExecute": "yes",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "34549760",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "0",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "0",
        "MemoryZSwapCurrent": "0",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "firewalld.service dbus-org.fedoraproject.FirewallD1.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "yes",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "yes",
        "ProtectControlGroups": "yes",
        "ProtectControlGroupsEx": "yes",
        "ProtectHome": "yes",
        "ProtectHostname": "yes",
        "ProtectKernelLogs": "yes",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "yes",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "dbus.socket system.slice sysinit.target",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "yes",
        "RestrictSUIDSGID": "yes",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "StandardError": "null",
        "StandardInput": "null",
        "StandardOutput": "null",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestamp": "Sat 2025-02-15 11:43:23 EST",
        "StateChangeTimestampMonotonic": "998913488",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "running",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallArchitectures": "native",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "2",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "dbus",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "enabled",
        "UnitFileState": "enabled",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "Wants": "network-pre.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "0"
    }
}
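(The unmask and enable/start results above both report `"changed": false` with `UnitFileState: enabled` and `ActiveState: active`, i.e. the service was already in the desired state. A minimal sketch of the equivalent systemd operations, assuming the stock ansible.builtin.systemd_service module:)

```yaml
# Hypothetical standalone equivalents of the two tasks logged above.
# Both come back "ok" (not "changed") when firewalld is already
# unmasked, enabled, and running.
- name: Unmask firewalld service
  ansible.builtin.systemd_service:
    name: firewalld
    masked: false

- name: Enable and start firewalld service
  ansible.builtin.systemd_service:
    name: firewalld
    enabled: true
    state: started
```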
TASK [fedora.linux_system_roles.firewall : Check if previous replaced is defined] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:34
Saturday 15 February 2025  11:43:39 -0500 (0:00:00.548)       0:01:37.006 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__firewall_previous_replaced": false,
        "__firewall_python_cmd": "/usr/bin/python3.12",
        "__firewall_report_changed": true
    },
    "changed": false
}
TASK [fedora.linux_system_roles.firewall : Get config files, checksums before and remove] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:43
Saturday 15 February 2025  11:43:39 -0500 (0:00:00.060)       0:01:37.066 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_previous_replaced | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Tell firewall module it is able to report changed] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:55
Saturday 15 February 2025  11:43:39 -0500 (0:00:00.049)       0:01:37.116 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_previous_replaced | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Configure firewall] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:71
Saturday 15 February 2025  11:43:39 -0500 (0:00:00.052)       0:01:37.168 ***** 
ok: [managed-node1] => (item={'port': '8000/tcp', 'state': 'enabled'}) => {
    "__firewall_changed": false,
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "port": "8000/tcp",
        "state": "enabled"
    }
}
ok: [managed-node1] => (item={'port': '9000/tcp', 'state': 'enabled'}) => {
    "__firewall_changed": false,
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "port": "9000/tcp",
        "state": "enabled"
    }
}
TASK [fedora.linux_system_roles.firewall : Gather firewall config information] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:120
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.996)       0:01:38.165 ***** 
skipping: [managed-node1] => (item={'port': '8000/tcp', 'state': 'enabled'})  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall | length == 1",
    "item": {
        "port": "8000/tcp",
        "state": "enabled"
    },
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item={'port': '9000/tcp', 'state': 'enabled'})  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "firewall | length == 1",
    "item": {
        "port": "9000/tcp",
        "state": "enabled"
    },
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => {
    "changed": false
}
MSG:
All items skipped
TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:130
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.050)       0:01:38.215 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "firewall | length == 1",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Gather firewall config if no arguments] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:139
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.038)       0:01:38.253 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "firewall == None or firewall | length == 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:144
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.051)       0:01:38.305 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "firewall == None or firewall | length == 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Get config files, checksums after] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:153
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.044)       0:01:38.350 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_previous_replaced | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Calculate what has changed] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:163
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.092)       0:01:38.443 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__firewall_previous_replaced | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.firewall : Show diffs] *************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:169
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.034)       0:01:38.478 ***** 
skipping: [managed-node1] => {
    "false_condition": "__firewall_previous_replaced | bool"
}
TASK [Manage selinux for specified ports] **************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:146
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.048)       0:01:38.526 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "podman_selinux_ports | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:153
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.034)       0:01:38.561 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_cancel_user_linger": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] *******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:157
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.032)       0:01:38.593 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle credential files - present] ****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:166
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.031)       0:01:38.624 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle secrets] ***********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:175
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.029)       0:01:38.654 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Set variables part 1] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:3
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.138)       0:01:38.792 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:7
Saturday 15 February 2025  11:43:40 -0500 (0:00:00.061)       0:01:38.854 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.099)       0:01:38.954 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.060)       0:01:39.014 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.055)       0:01:39.070 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.070)       0:01:39.140 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.056)       0:01:39.197 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.057)       0:01:39.254 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.054)       0:01:39.309 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.058)       0:01:39.367 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.055)       0:01:39.422 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.055)       0:01:39.477 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.127)       0:01:39.605 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.056)       0:01:39.662 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set variables part 2] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:14
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.048)       0:01:39.711 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_rootless": false,
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:20
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.066)       0:01:39.777 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.095)       0:01:39.873 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:43:41 -0500 (0:00:00.035)       0:01:39.908 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.038)       0:01:39.947 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:25
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.046)       0:01:39.993 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:41
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.036)       0:01:40.030 ***** 
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Set variables part 1] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:3
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.422)       0:01:40.453 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:7
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.045)       0:01:40.498 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.068)       0:01:40.566 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.035)       0:01:40.602 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.037)       0:01:40.639 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.044)       0:01:40.683 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.033)       0:01:40.717 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.032)       0:01:40.750 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.033)       0:01:40.783 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.033)       0:01:40.816 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:43:42 -0500 (0:00:00.086)       0:01:40.903 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.033)       0:01:40.937 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.035)       0:01:40.972 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.040)       0:01:41.013 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set variables part 2] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:14
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.034)       0:01:41.048 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_rootless": false,
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:20
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.040)       0:01:41.088 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.061)       0:01:41.150 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.032)       0:01:41.182 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.031)       0:01:41.214 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:25
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.032)       0:01:41.247 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:41
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.032)       0:01:41.279 ***** 
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Set variables part 1] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:3
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.419)       0:01:41.699 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:7
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.036)       0:01:41.735 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.058)       0:01:41.793 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.037)       0:01:41.831 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:43:43 -0500 (0:00:00.040)       0:01:41.871 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.068)       0:01:41.940 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.040)       0:01:41.980 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.048)       0:01:42.029 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.040)       0:01:42.070 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.091)       0:01:42.161 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.033)       0:01:42.194 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.034)       0:01:42.229 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.033)       0:01:42.262 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.036)       0:01:42.299 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_check_subids | d(true)",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set variables part 2] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:14
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.043)       0:01:42.342 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_rootless": false,
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:20
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.068)       0:01:42.411 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.096)       0:01:42.507 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.035)       0:01:42.543 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.045)       0:01:42.588 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:25
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.037)       0:01:42.625 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:41
Saturday 15 February 2025  11:43:44 -0500 (0:00:00.038)       0:01:42.664 ***** 
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:182
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.421)       0:01:43.086 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:189
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.030)       0:01:43.116 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed-node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.187)       0:01:43.304 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "quadlet-demo.kube",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "[Install]\nWantedBy=default.target\n\n[Unit]\nRequires=quadlet-demo-mysql.service\nAfter=quadlet-demo-mysql.service\n\n[Kube]\n# Point to the yaml file in the same directory\nYaml=quadlet-demo.yml\n# Use the quadlet-demo network\nNetwork=quadlet-demo.network\n# Publish the envoy proxy data port\nPublishPort=8000:8080\n# Publish the envoy proxy admin port\nPublishPort=9000:9901\n# Use the envoy proxy config map in the same directory\nConfigMap=envoy-proxy-configmap.yml",
        "__podman_quadlet_template_src": ""
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.053)       0:01:43.358 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "absent",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.049)       0:01:43.408 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.037)       0:01:43.445 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo",
        "__podman_quadlet_type": "kube",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.111)       0:01:43.556 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.061)       0:01:43.617 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.037)       0:01:43.654 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.039)       0:01:43.694 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:43:45 -0500 (0:00:00.045)       0:01:43.739 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.391)       0:01:44.130 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.039)       0:01:44.170 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.034)       0:01:44.204 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.036)       0:01:44.240 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.035)       0:01:44.275 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.036)       0:01:44.312 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.034)       0:01:44.346 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.035)       0:01:44.382 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.035)       0:01:44.417 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": [
            "quadlet-demo.yml"
        ],
        "__podman_service_name": "quadlet-demo.service",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.057)       0:01:44.475 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.034)       0:01:44.510 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.031)       0:01:44.542 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo.kube",
        "__podman_volumes": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.075)       0:01:44.617 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.046)       0:01:44.663 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 15 February 2025  11:43:46 -0500 (0:00:00.197)       0:01:44.861 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 15 February 2025  11:43:47 -0500 (0:00:00.054)       0:01:44.915 ***** 
changed: [managed-node1] => {
    "changed": true,
    "enabled": false,
    "failed_when_result": false,
    "name": "quadlet-demo.service",
    "state": "stopped",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0",
        "ActiveEnterTimestamp": "Sat 2025-02-15 11:43:24 EST",
        "ActiveEnterTimestampMonotonic": "1000142565",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "active",
        "After": "sysinit.target network-online.target -.mount quadlet-demo-mysql.service basic.target system.slice systemd-journald.socket quadlet-demo-network.service",
        "AllowIsolate": "no",
        "AssertResult": "yes",
        "AssertTimestamp": "Sat 2025-02-15 11:43:24 EST",
        "AssertTimestampMonotonic": "999622751",
        "Before": "multi-user.target shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "249338000",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Sat 2025-02-15 11:43:24 EST",
        "ConditionTimestampMonotonic": "999622747",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "shutdown.target",
        "ControlGroup": "/system.slice/quadlet-demo.service",
        "ControlGroupId": "156374",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "quadlet-demo.service",
        "DevicePolicy": "auto",
        "DynamicUser": "no",
        "EffectiveCPUs": "0-1",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveMemoryNodes": "0",
        "EffectiveTasksMax": "22347",
        "Environment": "PODMAN_SYSTEMD_UNIT=quadlet-demo.service",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestampMonotonic": "0",
        "ExecMainPID": "72088",
        "ExecMainStartTimestamp": "Sat 2025-02-15 11:43:24 EST",
        "ExecMainStartTimestampMonotonic": "999860001",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman kube play --replace --service-container=true --network systemd-quadlet-demo --configmap /etc/containers/systemd/envoy-proxy-configmap.yml --publish 8000:8080 --publish 9000:9901 /etc/containers/systemd/quadlet-demo.yml ; ignore_errors=no ; start_time=[Sat 2025-02-15 11:43:24 EST] ; stop_time=[n/a] ; pid=72079 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman kube play --replace --service-container=true --network systemd-quadlet-demo --configmap /etc/containers/systemd/envoy-proxy-configmap.yml --publish 8000:8080 --publish 9000:9901 /etc/containers/systemd/quadlet-demo.yml ; flags= ; start_time=[Sat 2025-02-15 11:43:24 EST] ; stop_time=[n/a] ; pid=72079 ; code=(null) ; status=0/0 }",
        "ExecStopPost": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman kube down /etc/containers/systemd/quadlet-demo.yml ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStopPostEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman kube down /etc/containers/systemd/quadlet-demo.yml ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/run/systemd/generator/quadlet-demo.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "quadlet-demo.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestamp": "Sat 2025-02-15 11:43:24 EST",
        "InactiveExitTimestampMonotonic": "999625098",
        "InvocationID": "1710244099c14c73859577e78f1a80e8",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "mixed",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "72088",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "2494771200",
        "MemoryCurrent": "2969600",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "25841664",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "0",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "0",
        "MemoryZSwapCurrent": "0",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "quadlet-demo.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "all",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "no",
        "ProtectControlGroups": "no",
        "ProtectControlGroupsEx": "no",
        "ProtectHome": "no",
        "ProtectHostname": "no",
        "ProtectKernelLogs": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "quadlet-demo-mysql.service system.slice sysinit.target -.mount quadlet-demo-network.service",
        "RequiresMountsFor": "/run/containers",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "SourcePath": "/etc/containers/systemd/quadlet-demo.kube",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestamp": "Sat 2025-02-15 11:43:24 EST",
        "StateChangeTimestampMonotonic": "1000142565",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "running",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogIdentifier": "quadlet-demo",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "4",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "notify",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "generated",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "Wants": "network-online.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "0"
    }
}
TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 15 February 2025  11:43:48 -0500 (0:00:01.419)       0:01:46.334 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637802.8521874,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 8,
        "charset": "us-ascii",
        "checksum": "7a5c73a5d935a42431c87bcdbeb8a04ed0909dc7",
        "ctime": 1739637802.8551874,
        "dev": 51714,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 96777509,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "text/plain",
        "mode": "0644",
        "mtime": 1739637802.5781863,
        "nlink": 1,
        "path": "/etc/containers/systemd/quadlet-demo.kube",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 456,
        "uid": 0,
        "version": "2897252076",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 15 February 2025  11:43:48 -0500 (0:00:00.422)       0:01:46.757 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Slurp quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
Saturday 15 February 2025  11:43:48 -0500 (0:00:00.100)       0:01:46.857 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12
Saturday 15 February 2025  11:43:49 -0500 (0:00:00.402)       0:01:47.260 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44
Saturday 15 February 2025  11:43:49 -0500 (0:00:00.052)       0:01:47.313 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Reset raw variable] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52
Saturday 15 February 2025  11:43:49 -0500 (0:00:00.032)       0:01:47.346 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_raw": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 15 February 2025  11:43:49 -0500 (0:00:00.035)       0:01:47.381 ***** 
changed: [managed-node1] => {
    "changed": true,
    "path": "/etc/containers/systemd/quadlet-demo.kube",
    "state": "absent"
}
TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 15 February 2025  11:43:49 -0500 (0:00:00.378)       0:01:47.759 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 15 February 2025  11:43:50 -0500 (0:00:00.752)       0:01:48.512 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:99
Saturday 15 February 2025  11:43:50 -0500 (0:00:00.034)       0:01:48.546 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 15 February 2025  11:43:50 -0500 (0:00:00.057)       0:01:48.603 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_parsed": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:120
Saturday 15 February 2025  11:43:50 -0500 (0:00:00.053)       0:01:48.657 ***** 
changed: [managed-node1] => {
    "changed": true,
    "cmd": [
        "podman",
        "image",
        "prune",
        "--all",
        "-f"
    ],
    "delta": "0:00:00.763321",
    "end": "2025-02-15 11:43:51.847236",
    "rc": 0,
    "start": "2025-02-15 11:43:51.083915"
}
STDOUT:
9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f
54c8b4fe9ef10b679d92adad3fdcaa4ca5ba12c8f6858ab62e334a2608edc257
fcf3e41b8864a14d75a6d0627d3d02154e28a153aa57e8baa392cd744ffa0d0b
5af2585e22ed1562885d9407efab74010090427be79048c2cd6a226517cc1e1d
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:131
Saturday 15 February 2025  11:43:51 -0500 (0:00:01.178)       0:01:49.836 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:43:52 -0500 (0:00:00.082)       0:01:49.919 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:43:52 -0500 (0:00:00.051)       0:01:49.971 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:43:52 -0500 (0:00:00.042)       0:01:50.013 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:141
Saturday 15 February 2025  11:43:52 -0500 (0:00:00.034)       0:01:50.048 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "images",
        "-n"
    ],
    "delta": "0:00:00.032752",
    "end": "2025-02-15 11:43:52.517510",
    "rc": 0,
    "start": "2025-02-15 11:43:52.484758"
}
STDOUT:
quay.io/libpod/registry           2.8.2       0030ba3d620c  18 months ago  24.6 MB
quay.io/linux-system-roles/mysql  5.6         dd3b2a5dcb48  3 years ago    308 MB
TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:150
Saturday 15 February 2025  11:43:52 -0500 (0:00:00.549)       0:01:50.598 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "volume",
        "ls",
        "-n"
    ],
    "delta": "0:00:00.029994",
    "end": "2025-02-15 11:43:53.055534",
    "rc": 0,
    "start": "2025-02-15 11:43:53.025540"
}
STDOUT:
local       2fefbc9190adbe5ffeef28f6e938304e8cedd541e0d51e013b7101e905a15702
local       systemd-quadlet-demo-mysql
local       wp-pv-claim
local       envoy-proxy-config
local       envoy-certificates
TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:159
Saturday 15 February 2025  11:43:53 -0500 (0:00:00.456)       0:01:51.054 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "ps",
        "--noheading"
    ],
    "delta": "0:00:00.032749",
    "end": "2025-02-15 11:43:53.518047",
    "rc": 0,
    "start": "2025-02-15 11:43:53.485298"
}
STDOUT:
a5dec710fc02  quay.io/libpod/registry:2.8.2         /etc/docker/regis...  5 minutes ago       Up 5 minutes                 127.0.0.1:5000->5000/tcp  podman_registry
3aa715549e27  quay.io/linux-system-roles/mysql:5.6  mysqld                About a minute ago  Up About a minute (healthy)  3306/tcp                  quadlet-demo-mysql
TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:168
Saturday 15 February 2025  11:43:53 -0500 (0:00:00.466)       0:01:51.521 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "network",
        "ls",
        "-n",
        "-q"
    ],
    "delta": "0:00:00.027837",
    "end": "2025-02-15 11:43:53.979116",
    "rc": 0,
    "start": "2025-02-15 11:43:53.951279"
}
STDOUT:
podman
podman-default-kube-network
systemd-quadlet-demo
TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:177
Saturday 15 February 2025  11:43:54 -0500 (0:00:00.462)       0:01:51.983 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - pods] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:187
Saturday 15 February 2025  11:43:54 -0500 (0:00:00.438)       0:01:52.421 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197
Saturday 15 February 2025  11:43:54 -0500 (0:00:00.408)       0:01:52.830 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "services": {
            "3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service": {
                "name": "3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service",
                "source": "systemd",
                "state": "stopped",
                "status": "failed"
            },
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "apt-daily.service": {
                "name": "apt-daily.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "audit-rules.service": {
                "name": "audit-rules.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "autofs.service": {
                "name": "autofs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "capsule@.service": {
                "name": "capsule@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "certmonger.service": {
                "name": "certmonger.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd-restricted.service": {
                "name": "chronyd-restricted.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-hotplugd.service": {
                "name": "cloud-init-hotplugd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "crond.service": {
                "name": "crond.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.fedoraproject.FirewallD1.service": {
                "name": "dbus-org.fedoraproject.FirewallD1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd.service": {
                "name": "dhcpcd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd@.service": {
                "name": "dhcpcd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-system-upgrade-cleanup.service": {
                "name": "dnf-system-upgrade-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "dnf-system-upgrade.service": {
                "name": "dnf-system-upgrade.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown-onfailure.service": {
                "name": "dracut-shutdown-onfailure.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ebtables.service": {
                "name": "ebtables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fips-crypto-policy-overlay.service": {
                "name": "fips-crypto-policy-overlay.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "firewalld.service": {
                "name": "firewalld.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "fsidd.service": {
                "name": "fsidd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "gssproxy.service": {
                "name": "gssproxy.service",
                "source": "systemd",
                "state": "running",
                "status": "disabled"
            },
            "hv_kvp_daemon.service": {
                "name": "hv_kvp_daemon.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ip6tables.service": {
                "name": "ip6tables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ipset.service": {
                "name": "ipset.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iptables.service": {
                "name": "iptables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "irqbalance.service": {
                "name": "irqbalance.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "kdump.service": {
                "name": "kdump.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "kvm_stat.service": {
                "name": "kvm_stat.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "logrotate.service": {
                "name": "logrotate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm-devices-import.service": {
                "name": "lvm-devices-import.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "microcode.service": {
                "name": "microcode.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@dm_mod.service": {
                "name": "modprobe@dm_mod.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@efi_pstore.service": {
                "name": "modprobe@efi_pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@loop.service": {
                "name": "modprobe@loop.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "netavark-dhcp-proxy.service": {
                "name": "netavark-dhcp-proxy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "netavark-firewalld-reload.service": {
                "name": "netavark-firewalld-reload.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-mountd.service": {
                "name": "nfs-mountd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-server.service": {
                "name": "nfs-server.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "nfs-utils.service": {
                "name": "nfs-utils.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfsdcld.service": {
                "name": "nfsdcld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nftables.service": {
                "name": "nftables.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nm-priv-helper.service": {
                "name": "nm-priv-helper.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pcscd.service": {
                "name": "pcscd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "podman-auto-update.service": {
                "name": "podman-auto-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-clean-transient.service": {
                "name": "podman-clean-transient.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-kube@.service": {
                "name": "podman-kube@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "podman-restart.service": {
                "name": "podman-restart.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman.service": {
                "name": "podman.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quadlet-demo-mysql-volume.service": {
                "name": "quadlet-demo-mysql-volume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "generated"
            },
            "quadlet-demo-mysql.service": {
                "name": "quadlet-demo-mysql.service",
                "source": "systemd",
                "state": "running",
                "status": "generated"
            },
            "quadlet-demo-network.service": {
                "name": "quadlet-demo-network.service",
                "source": "systemd",
                "state": "stopped",
                "status": "generated"
            },
            "quotaon-root.service": {
                "name": "quotaon-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "quotaon@.service": {
                "name": "quotaon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "restraintd.service": {
                "name": "restraintd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rngd.service": {
                "name": "rngd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpc-gssd.service": {
                "name": "rpc-gssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd.service": {
                "name": "rpc-statd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-svcgssd.service": {
                "name": "rpc-svcgssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rpcbind.service": {
                "name": "rpcbind.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpmdb-migrate.service": {
                "name": "rpmdb-migrate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rsyslog.service": {
                "name": "rsyslog.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "selinux-check-proper-disable.service": {
                "name": "selinux-check-proper-disable.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ssh-host-keys-migration.service": {
                "name": "ssh-host-keys-migration.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-unix-local@.service": {
                "name": "sshd-unix-local@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd-vsock@.service": {
                "name": "sshd-vsock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-battery-check.service": {
                "name": "systemd-battery-check.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-bless-boot.service": {
                "name": "systemd-bless-boot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-boot-random-seed.service": {
                "name": "systemd-boot-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-boot-update.service": {
                "name": "systemd-boot-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-bootctl@.service": {
                "name": "systemd-bootctl@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-confext.service": {
                "name": "systemd-confext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-creds@.service": {
                "name": "systemd-creds@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-growfs-root.service": {
                "name": "systemd-growfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-growfs@.service": {
                "name": "systemd-growfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-clear.service": {
                "name": "systemd-hibernate-clear.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate-resume.service": {
                "name": "systemd-hibernate-resume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald-sync@.service": {
                "name": "systemd-journald-sync@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-networkd-wait-online.service": {
                "name": "systemd-networkd-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-oomd.service": {
                "name": "systemd-oomd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-pcrextend@.service": {
                "name": "systemd-pcrextend@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrfs-root.service": {
                "name": "systemd-pcrfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pcrfs@.service": {
                "name": "systemd-pcrfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrlock-file-system.service": {
                "name": "systemd-pcrlock-file-system.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-code.service": {
                "name": "systemd-pcrlock-firmware-code.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-config.service": {
                "name": "systemd-pcrlock-firmware-config.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-machine-id.service": {
                "name": "systemd-pcrlock-machine-id.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-make-policy.service": {
                "name": "systemd-pcrlock-make-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-authority.service": {
                "name": "systemd-pcrlock-secureboot-authority.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-policy.service": {
                "name": "systemd-pcrlock-secureboot-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock@.service": {
                "name": "systemd-pcrlock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrmachine.service": {
                "name": "systemd-pcrmachine.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-initrd.service": {
                "name": "systemd-pcrphase-initrd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-sysinit.service": {
                "name": "systemd-pcrphase-sysinit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase.service": {
                "name": "systemd-pcrphase.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-quotacheck-root.service": {
                "name": "systemd-quotacheck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-quotacheck@.service": {
                "name": "systemd-quotacheck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-soft-reboot.service": {
                "name": "systemd-soft-reboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-sysext@.service": {
                "name": "systemd-sysext@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-sysupdate-reboot.service": {
                "name": "systemd-sysupdate-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysupdate.service": {
                "name": "systemd-sysupdate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev-early.service": {
                "name": "systemd-tmpfiles-setup-dev-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup-early.service": {
                "name": "systemd-tpm2-setup-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup.service": {
                "name": "systemd-tpm2-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-load-credentials.service": {
                "name": "systemd-udev-load-credentials.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-userdbd.service": {
                "name": "systemd-userdbd.service",
                "source": "systemd",
                "state": "running",
                "status": "indirect"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "ypbind.service": {
                "name": "ypbind.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            }
        }
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:43:57 -0500 (0:00:02.234)       0:01:55.065 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:43:57 -0500 (0:00:00.034)       0:01:55.099 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "---\napiVersion: v1\nkind: PersistentVolumeClaim\nmetadata:\n  name: wp-pv-claim\n  labels:\n    app: wordpress\nspec:\n  accessModes:\n  - ReadWriteOnce\n  resources:\n    requests:\n      storage: 20Gi\n---\napiVersion: v1\nkind: Pod\nmetadata:\n  name: quadlet-demo\nspec:\n  containers:\n  - name: wordpress\n    image: quay.io/linux-system-roles/wordpress:4.8-apache\n    env:\n    - name: WORDPRESS_DB_HOST\n      value: quadlet-demo-mysql\n    - name: WORDPRESS_DB_PASSWORD\n      valueFrom:\n        secretKeyRef:\n          name: mysql-root-password-kube\n          key: password\n    volumeMounts:\n    - name: wordpress-persistent-storage\n      mountPath: /var/www/html\n    resources:\n      requests:\n        memory: \"64Mi\"\n        cpu: \"250m\"\n      limits:\n        memory: \"128Mi\"\n        cpu: \"500m\"\n  - name: envoy\n    image: quay.io/linux-system-roles/envoyproxy:v1.25.0\n    volumeMounts:\n    - name: config-volume\n      mountPath: /etc/envoy\n    - name: certificates\n      mountPath: /etc/envoy-certificates\n    env:\n    - name: ENVOY_UID\n      value: \"0\"\n    resources:\n      requests:\n        memory: \"64Mi\"\n        cpu: \"250m\"\n      limits:\n        memory: \"128Mi\"\n        cpu: \"500m\"\n  volumes:\n  - name: config-volume\n    configMap:\n      name: envoy-proxy-config\n  - name: certificates\n    secret:\n      secretName: envoy-certificates\n  - name: wordpress-persistent-storage\n    persistentVolumeClaim:\n      claimName: wp-pv-claim\n  - name: www  # not used - for testing hostpath\n    hostPath:\n      path: /tmp/httpd3\n  - name: create  # not used - for testing hostpath\n    hostPath:\n      path: /tmp/httpd3-create\n",
        "__podman_quadlet_template_src": "quadlet-demo.yml.j2"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:43:57 -0500 (0:00:00.111)       0:01:55.211 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "absent",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:43:57 -0500 (0:00:00.044)       0:01:55.255 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_str",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:43:57 -0500 (0:00:00.038)       0:01:55.294 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo",
        "__podman_quadlet_type": "yml",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:43:57 -0500 (0:00:00.053)       0:01:55.347 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:43:57 -0500 (0:00:00.066)       0:01:55.413 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:43:57 -0500 (0:00:00.041)       0:01:55.455 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:43:57 -0500 (0:00:00.041)       0:01:55.496 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:43:57 -0500 (0:00:00.048)       0:01:55.544 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.445)       0:01:55.990 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.038)       0:01:56.028 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.039)       0:01:56.067 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.038)       0:01:56.106 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.037)       0:01:56.144 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.040)       0:01:56.184 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.045)       0:01:56.229 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.037)       0:01:56.267 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.038)       0:01:56.305 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.060)       0:01:56.366 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.037)       0:01:56.404 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.035)       0:01:56.439 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo.yml",
        "__podman_volumes": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.080)       0:01:56.520 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.043)       0:01:56.563 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.081)       0:01:56.645 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.092)       0:01:56.738 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_service_name | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 15 February 2025  11:43:58 -0500 (0:00:00.038)       0:01:56.777 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637782.5631068,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 8,
        "charset": "us-ascii",
        "checksum": "998dccde0483b1654327a46ddd89cbaa47650370",
        "ctime": 1739637779.2140934,
        "dev": 51714,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 25166044,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "text/plain",
        "mode": "0644",
        "mtime": 1739637778.905092,
        "nlink": 1,
        "path": "/etc/containers/systemd/quadlet-demo.yml",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 1605,
        "uid": 0,
        "version": "2785896575",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 15 February 2025  11:43:59 -0500 (0:00:00.390)       0:01:57.167 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Slurp quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
Saturday 15 February 2025  11:43:59 -0500 (0:00:00.071)       0:01:57.238 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12
Saturday 15 February 2025  11:43:59 -0500 (0:00:00.373)       0:01:57.612 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44
Saturday 15 February 2025  11:43:59 -0500 (0:00:00.037)       0:01:57.649 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Reset raw variable] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52
Saturday 15 February 2025  11:43:59 -0500 (0:00:00.046)       0:01:57.696 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_raw": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 15 February 2025  11:43:59 -0500 (0:00:00.037)       0:01:57.733 ***** 
changed: [managed-node1] => {
    "changed": true,
    "path": "/etc/containers/systemd/quadlet-demo.yml",
    "state": "absent"
}
TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 15 February 2025  11:44:00 -0500 (0:00:00.384)       0:01:58.118 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 15 February 2025  11:44:00 -0500 (0:00:00.762)       0:01:58.880 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:99
Saturday 15 February 2025  11:44:01 -0500 (0:00:00.039)       0:01:58.919 ***** 
changed: [managed-node1] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
changed: [managed-node1] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
changed: [managed-node1] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:02 -0500 (0:00:01.246)       0:02:00.165 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_parsed": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:120
Saturday 15 February 2025  11:44:02 -0500 (0:00:00.035)       0:02:00.201 ***** 
changed: [managed-node1] => {
    "changed": true,
    "cmd": [
        "podman",
        "image",
        "prune",
        "--all",
        "-f"
    ],
    "delta": "0:00:00.028296",
    "end": "2025-02-15 11:44:02.628208",
    "rc": 0,
    "start": "2025-02-15 11:44:02.599912"
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:131
Saturday 15 February 2025  11:44:02 -0500 (0:00:00.414)       0:02:00.615 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:44:02 -0500 (0:00:00.064)       0:02:00.680 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:44:02 -0500 (0:00:00.095)       0:02:00.775 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:44:02 -0500 (0:00:00.036)       0:02:00.811 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:141
Saturday 15 February 2025  11:44:02 -0500 (0:00:00.035)       0:02:00.847 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "images",
        "-n"
    ],
    "delta": "0:00:00.032180",
    "end": "2025-02-15 11:44:03.284074",
    "rc": 0,
    "start": "2025-02-15 11:44:03.251894"
}
STDOUT:
quay.io/libpod/registry           2.8.2       0030ba3d620c  18 months ago  24.6 MB
quay.io/linux-system-roles/mysql  5.6         dd3b2a5dcb48  3 years ago    308 MB
TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:150
Saturday 15 February 2025  11:44:03 -0500 (0:00:00.423)       0:02:01.271 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "volume",
        "ls",
        "-n"
    ],
    "delta": "0:00:00.028321",
    "end": "2025-02-15 11:44:03.706909",
    "rc": 0,
    "start": "2025-02-15 11:44:03.678588"
}
STDOUT:
local       2fefbc9190adbe5ffeef28f6e938304e8cedd541e0d51e013b7101e905a15702
local       systemd-quadlet-demo-mysql
TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:159
Saturday 15 February 2025  11:44:03 -0500 (0:00:00.423)       0:02:01.694 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "ps",
        "--noheading"
    ],
    "delta": "0:00:00.033238",
    "end": "2025-02-15 11:44:04.127180",
    "rc": 0,
    "start": "2025-02-15 11:44:04.093942"
}
STDOUT:
a5dec710fc02  quay.io/libpod/registry:2.8.2         /etc/docker/regis...  6 minutes ago       Up 6 minutes                 127.0.0.1:5000->5000/tcp  podman_registry
3aa715549e27  quay.io/linux-system-roles/mysql:5.6  mysqld                About a minute ago  Up About a minute (healthy)  3306/tcp                  quadlet-demo-mysql
TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:168
Saturday 15 February 2025  11:44:04 -0500 (0:00:00.418)       0:02:02.113 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "network",
        "ls",
        "-n",
        "-q"
    ],
    "delta": "0:00:00.027743",
    "end": "2025-02-15 11:44:04.541112",
    "rc": 0,
    "start": "2025-02-15 11:44:04.513369"
}
STDOUT:
podman
podman-default-kube-network
systemd-quadlet-demo
TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:177
Saturday 15 February 2025  11:44:04 -0500 (0:00:00.414)       0:02:02.528 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - pods] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:187
Saturday 15 February 2025  11:44:05 -0500 (0:00:00.409)       0:02:02.937 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197
Saturday 15 February 2025  11:44:05 -0500 (0:00:00.418)       0:02:03.356 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "services": {
            "3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service": {
                "name": "3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service",
                "source": "systemd",
                "state": "stopped",
                "status": "failed"
            },
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "apt-daily.service": {
                "name": "apt-daily.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "audit-rules.service": {
                "name": "audit-rules.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "autofs.service": {
                "name": "autofs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "capsule@.service": {
                "name": "capsule@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "certmonger.service": {
                "name": "certmonger.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd-restricted.service": {
                "name": "chronyd-restricted.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-hotplugd.service": {
                "name": "cloud-init-hotplugd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "crond.service": {
                "name": "crond.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.fedoraproject.FirewallD1.service": {
                "name": "dbus-org.fedoraproject.FirewallD1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd.service": {
                "name": "dhcpcd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd@.service": {
                "name": "dhcpcd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-system-upgrade-cleanup.service": {
                "name": "dnf-system-upgrade-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "dnf-system-upgrade.service": {
                "name": "dnf-system-upgrade.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown-onfailure.service": {
                "name": "dracut-shutdown-onfailure.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ebtables.service": {
                "name": "ebtables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fips-crypto-policy-overlay.service": {
                "name": "fips-crypto-policy-overlay.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "firewalld.service": {
                "name": "firewalld.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "fsidd.service": {
                "name": "fsidd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "gssproxy.service": {
                "name": "gssproxy.service",
                "source": "systemd",
                "state": "running",
                "status": "disabled"
            },
            "hv_kvp_daemon.service": {
                "name": "hv_kvp_daemon.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ip6tables.service": {
                "name": "ip6tables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ipset.service": {
                "name": "ipset.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iptables.service": {
                "name": "iptables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "irqbalance.service": {
                "name": "irqbalance.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "kdump.service": {
                "name": "kdump.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "kvm_stat.service": {
                "name": "kvm_stat.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "logrotate.service": {
                "name": "logrotate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm-devices-import.service": {
                "name": "lvm-devices-import.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "microcode.service": {
                "name": "microcode.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@dm_mod.service": {
                "name": "modprobe@dm_mod.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@efi_pstore.service": {
                "name": "modprobe@efi_pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@loop.service": {
                "name": "modprobe@loop.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "netavark-dhcp-proxy.service": {
                "name": "netavark-dhcp-proxy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "netavark-firewalld-reload.service": {
                "name": "netavark-firewalld-reload.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-mountd.service": {
                "name": "nfs-mountd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-server.service": {
                "name": "nfs-server.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "nfs-utils.service": {
                "name": "nfs-utils.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfsdcld.service": {
                "name": "nfsdcld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nftables.service": {
                "name": "nftables.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nm-priv-helper.service": {
                "name": "nm-priv-helper.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pcscd.service": {
                "name": "pcscd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "podman-auto-update.service": {
                "name": "podman-auto-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-clean-transient.service": {
                "name": "podman-clean-transient.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-kube@.service": {
                "name": "podman-kube@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "podman-restart.service": {
                "name": "podman-restart.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman.service": {
                "name": "podman.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quadlet-demo-mysql-volume.service": {
                "name": "quadlet-demo-mysql-volume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "generated"
            },
            "quadlet-demo-mysql.service": {
                "name": "quadlet-demo-mysql.service",
                "source": "systemd",
                "state": "running",
                "status": "generated"
            },
            "quadlet-demo-network.service": {
                "name": "quadlet-demo-network.service",
                "source": "systemd",
                "state": "stopped",
                "status": "generated"
            },
            "quotaon-root.service": {
                "name": "quotaon-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "quotaon@.service": {
                "name": "quotaon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "restraintd.service": {
                "name": "restraintd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rngd.service": {
                "name": "rngd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpc-gssd.service": {
                "name": "rpc-gssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd.service": {
                "name": "rpc-statd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-svcgssd.service": {
                "name": "rpc-svcgssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rpcbind.service": {
                "name": "rpcbind.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpmdb-migrate.service": {
                "name": "rpmdb-migrate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rsyslog.service": {
                "name": "rsyslog.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "selinux-check-proper-disable.service": {
                "name": "selinux-check-proper-disable.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ssh-host-keys-migration.service": {
                "name": "ssh-host-keys-migration.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-unix-local@.service": {
                "name": "sshd-unix-local@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd-vsock@.service": {
                "name": "sshd-vsock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-battery-check.service": {
                "name": "systemd-battery-check.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-bless-boot.service": {
                "name": "systemd-bless-boot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-boot-random-seed.service": {
                "name": "systemd-boot-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-boot-update.service": {
                "name": "systemd-boot-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-bootctl@.service": {
                "name": "systemd-bootctl@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-confext.service": {
                "name": "systemd-confext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-creds@.service": {
                "name": "systemd-creds@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-growfs-root.service": {
                "name": "systemd-growfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-growfs@.service": {
                "name": "systemd-growfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-clear.service": {
                "name": "systemd-hibernate-clear.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate-resume.service": {
                "name": "systemd-hibernate-resume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald-sync@.service": {
                "name": "systemd-journald-sync@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-networkd-wait-online.service": {
                "name": "systemd-networkd-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-oomd.service": {
                "name": "systemd-oomd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-pcrextend@.service": {
                "name": "systemd-pcrextend@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrfs-root.service": {
                "name": "systemd-pcrfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pcrfs@.service": {
                "name": "systemd-pcrfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrlock-file-system.service": {
                "name": "systemd-pcrlock-file-system.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-code.service": {
                "name": "systemd-pcrlock-firmware-code.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-config.service": {
                "name": "systemd-pcrlock-firmware-config.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-machine-id.service": {
                "name": "systemd-pcrlock-machine-id.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-make-policy.service": {
                "name": "systemd-pcrlock-make-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-authority.service": {
                "name": "systemd-pcrlock-secureboot-authority.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-policy.service": {
                "name": "systemd-pcrlock-secureboot-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock@.service": {
                "name": "systemd-pcrlock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrmachine.service": {
                "name": "systemd-pcrmachine.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-initrd.service": {
                "name": "systemd-pcrphase-initrd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-sysinit.service": {
                "name": "systemd-pcrphase-sysinit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase.service": {
                "name": "systemd-pcrphase.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-quotacheck-root.service": {
                "name": "systemd-quotacheck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-quotacheck@.service": {
                "name": "systemd-quotacheck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-soft-reboot.service": {
                "name": "systemd-soft-reboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-sysext@.service": {
                "name": "systemd-sysext@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-sysupdate-reboot.service": {
                "name": "systemd-sysupdate-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysupdate.service": {
                "name": "systemd-sysupdate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev-early.service": {
                "name": "systemd-tmpfiles-setup-dev-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup-early.service": {
                "name": "systemd-tpm2-setup-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup.service": {
                "name": "systemd-tpm2-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-load-credentials.service": {
                "name": "systemd-udev-load-credentials.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-userdbd.service": {
                "name": "systemd-userdbd.service",
                "source": "systemd",
                "state": "running",
                "status": "indirect"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "ypbind.service": {
                "name": "ypbind.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            }
        }
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:07 -0500 (0:00:02.071)       0:02:05.428 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:44:07 -0500 (0:00:00.035)       0:02:05.464 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "envoy-proxy-configmap.yml",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "---\napiVersion: v1\nkind: ConfigMap\nmetadata:\n  name: envoy-proxy-config\ndata:\n  envoy.yaml: |\n    admin:\n      address:\n        socket_address:\n          address: 0.0.0.0\n          port_value: 9901\n\n    static_resources:\n      listeners:\n      - name: listener_0\n        address:\n          socket_address:\n            address: 0.0.0.0\n            port_value: 8080\n        filter_chains:\n        - filters:\n          - name: envoy.filters.network.http_connection_manager\n            typed_config:\n              \"@type\": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager\n              stat_prefix: ingress_http\n              codec_type: AUTO\n              route_config:\n                name: local_route\n                virtual_hosts:\n                - name: local_service\n                  domains: [\"*\"]\n                  routes:\n                  - match:\n                      prefix: \"/\"\n                    route:\n                      cluster: backend\n              http_filters:\n              - name: envoy.filters.http.router\n                typed_config:\n                  \"@type\": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router\n          transport_socket:\n            name: envoy.transport_sockets.tls\n            typed_config:\n              \"@type\": type.googleapis.com/envoy.extensions.transport_sockets.tls.v3.DownstreamTlsContext\n              common_tls_context:\n                tls_certificates:\n                - certificate_chain:\n                    filename: /etc/envoy-certificates/certificate.pem\n                  private_key:\n                    filename: /etc/envoy-certificates/certificate.key\n      clusters:\n      - name: backend\n        connect_timeout: 5s\n        type: STATIC\n        dns_refresh_rate: 1800s\n        lb_policy: ROUND_ROBIN\n        load_assignment:\n          cluster_name: backend\n          endpoints:\n          - lb_endpoints:\n            - endpoint:\n                address:\n                  socket_address:\n                    address: 127.0.0.1\n                    port_value: 80",
        "__podman_quadlet_template_src": ""
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:44:07 -0500 (0:00:00.051)       0:02:05.515 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "absent",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:44:07 -0500 (0:00:00.050)       0:02:05.565 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:44:07 -0500 (0:00:00.056)       0:02:05.621 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "envoy-proxy-configmap",
        "__podman_quadlet_type": "yml",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:44:07 -0500 (0:00:00.089)       0:02:05.711 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:44:07 -0500 (0:00:00.190)       0:02:05.901 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.069)       0:02:05.971 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.055)       0:02:06.026 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.063)       0:02:06.090 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.396)       0:02:06.487 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.044)       0:02:06.531 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.045)       0:02:06.576 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.042)       0:02:06.619 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.038)       0:02:06.657 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.039)       0:02:06.696 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.039)       0:02:06.736 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.040)       0:02:06.776 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.052)       0:02:06.829 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:44:08 -0500 (0:00:00.083)       0:02:06.912 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:44:09 -0500 (0:00:00.045)       0:02:06.957 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:44:09 -0500 (0:00:00.044)       0:02:07.001 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [],
        "__podman_quadlet_file": "/etc/containers/systemd/envoy-proxy-configmap.yml",
        "__podman_volumes": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:44:09 -0500 (0:00:00.153)       0:02:07.155 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:44:09 -0500 (0:00:00.043)       0:02:07.199 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 15 February 2025  11:44:09 -0500 (0:00:00.085)       0:02:07.284 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 15 February 2025  11:44:09 -0500 (0:00:00.036)       0:02:07.320 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_service_name | length > 0",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 15 February 2025  11:44:09 -0500 (0:00:00.042)       0:02:07.362 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637804.4801939,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 8,
        "charset": "us-ascii",
        "checksum": "d681c7d56f912150d041873e880818b22a90c188",
        "ctime": 1739637774.0100727,
        "dev": 51714,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 683671773,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "text/plain",
        "mode": "0644",
        "mtime": 1739637773.6940713,
        "nlink": 1,
        "path": "/etc/containers/systemd/envoy-proxy-configmap.yml",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 2102,
        "uid": 0,
        "version": "338797091",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 15 February 2025  11:44:09 -0500 (0:00:00.402)       0:02:07.764 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Slurp quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
Saturday 15 February 2025  11:44:09 -0500 (0:00:00.064)       0:02:07.829 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12
Saturday 15 February 2025  11:44:10 -0500 (0:00:00.379)       0:02:08.208 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44
Saturday 15 February 2025  11:44:10 -0500 (0:00:00.045)       0:02:08.254 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Reset raw variable] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52
Saturday 15 February 2025  11:44:10 -0500 (0:00:00.052)       0:02:08.306 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_raw": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 15 February 2025  11:44:10 -0500 (0:00:00.043)       0:02:08.350 ***** 
changed: [managed-node1] => {
    "changed": true,
    "path": "/etc/containers/systemd/envoy-proxy-configmap.yml",
    "state": "absent"
}
TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 15 February 2025  11:44:10 -0500 (0:00:00.417)       0:02:08.767 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 15 February 2025  11:44:11 -0500 (0:00:00.782)       0:02:09.550 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:99
Saturday 15 February 2025  11:44:11 -0500 (0:00:00.042)       0:02:09.593 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:11 -0500 (0:00:00.050)       0:02:09.644 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_parsed": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:120
Saturday 15 February 2025  11:44:11 -0500 (0:00:00.104)       0:02:09.748 ***** 
changed: [managed-node1] => {
    "changed": true,
    "cmd": [
        "podman",
        "image",
        "prune",
        "--all",
        "-f"
    ],
    "delta": "0:00:00.030439",
    "end": "2025-02-15 11:44:12.187698",
    "rc": 0,
    "start": "2025-02-15 11:44:12.157259"
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:131
Saturday 15 February 2025  11:44:12 -0500 (0:00:00.425)       0:02:10.174 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:44:12 -0500 (0:00:00.066)       0:02:10.240 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:44:12 -0500 (0:00:00.036)       0:02:10.276 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:44:12 -0500 (0:00:00.041)       0:02:10.318 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:141
Saturday 15 February 2025  11:44:12 -0500 (0:00:00.052)       0:02:10.371 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "images",
        "-n"
    ],
    "delta": "0:00:00.031271",
    "end": "2025-02-15 11:44:12.826959",
    "rc": 0,
    "start": "2025-02-15 11:44:12.795688"
}
STDOUT:
quay.io/libpod/registry           2.8.2       0030ba3d620c  18 months ago  24.6 MB
quay.io/linux-system-roles/mysql  5.6         dd3b2a5dcb48  3 years ago    308 MB
TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:150
Saturday 15 February 2025  11:44:12 -0500 (0:00:00.443)       0:02:10.814 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "volume",
        "ls",
        "-n"
    ],
    "delta": "0:00:00.029135",
    "end": "2025-02-15 11:44:13.251629",
    "rc": 0,
    "start": "2025-02-15 11:44:13.222494"
}
STDOUT:
local       2fefbc9190adbe5ffeef28f6e938304e8cedd541e0d51e013b7101e905a15702
local       systemd-quadlet-demo-mysql
TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:159
Saturday 15 February 2025  11:44:13 -0500 (0:00:00.423)       0:02:11.238 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "ps",
        "--noheading"
    ],
    "delta": "0:00:00.034101",
    "end": "2025-02-15 11:44:13.675459",
    "rc": 0,
    "start": "2025-02-15 11:44:13.641358"
}
STDOUT:
a5dec710fc02  quay.io/libpod/registry:2.8.2         /etc/docker/regis...  6 minutes ago       Up 6 minutes                 127.0.0.1:5000->5000/tcp  podman_registry
3aa715549e27  quay.io/linux-system-roles/mysql:5.6  mysqld                About a minute ago  Up About a minute (healthy)  3306/tcp                  quadlet-demo-mysql
TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:168
Saturday 15 February 2025  11:44:13 -0500 (0:00:00.423)       0:02:11.661 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "network",
        "ls",
        "-n",
        "-q"
    ],
    "delta": "0:00:00.028538",
    "end": "2025-02-15 11:44:14.086759",
    "rc": 0,
    "start": "2025-02-15 11:44:14.058221"
}
STDOUT:
podman
podman-default-kube-network
systemd-quadlet-demo
TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:177
Saturday 15 February 2025  11:44:14 -0500 (0:00:00.439)       0:02:12.100 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - pods] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:187
Saturday 15 February 2025  11:44:14 -0500 (0:00:00.430)       0:02:12.530 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197
Saturday 15 February 2025  11:44:15 -0500 (0:00:00.410)       0:02:12.941 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "services": {
            "3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service": {
                "name": "3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service",
                "source": "systemd",
                "state": "stopped",
                "status": "failed"
            },
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "apt-daily.service": {
                "name": "apt-daily.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "audit-rules.service": {
                "name": "audit-rules.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "autofs.service": {
                "name": "autofs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "capsule@.service": {
                "name": "capsule@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "certmonger.service": {
                "name": "certmonger.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd-restricted.service": {
                "name": "chronyd-restricted.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-hotplugd.service": {
                "name": "cloud-init-hotplugd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "crond.service": {
                "name": "crond.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.fedoraproject.FirewallD1.service": {
                "name": "dbus-org.fedoraproject.FirewallD1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd.service": {
                "name": "dhcpcd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd@.service": {
                "name": "dhcpcd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-system-upgrade-cleanup.service": {
                "name": "dnf-system-upgrade-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "dnf-system-upgrade.service": {
                "name": "dnf-system-upgrade.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown-onfailure.service": {
                "name": "dracut-shutdown-onfailure.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ebtables.service": {
                "name": "ebtables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fips-crypto-policy-overlay.service": {
                "name": "fips-crypto-policy-overlay.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "firewalld.service": {
                "name": "firewalld.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "fsidd.service": {
                "name": "fsidd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "gssproxy.service": {
                "name": "gssproxy.service",
                "source": "systemd",
                "state": "running",
                "status": "disabled"
            },
            "hv_kvp_daemon.service": {
                "name": "hv_kvp_daemon.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ip6tables.service": {
                "name": "ip6tables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ipset.service": {
                "name": "ipset.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iptables.service": {
                "name": "iptables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "irqbalance.service": {
                "name": "irqbalance.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "kdump.service": {
                "name": "kdump.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "kvm_stat.service": {
                "name": "kvm_stat.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "logrotate.service": {
                "name": "logrotate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm-devices-import.service": {
                "name": "lvm-devices-import.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "microcode.service": {
                "name": "microcode.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@dm_mod.service": {
                "name": "modprobe@dm_mod.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@efi_pstore.service": {
                "name": "modprobe@efi_pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@loop.service": {
                "name": "modprobe@loop.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "netavark-dhcp-proxy.service": {
                "name": "netavark-dhcp-proxy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "netavark-firewalld-reload.service": {
                "name": "netavark-firewalld-reload.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-mountd.service": {
                "name": "nfs-mountd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-server.service": {
                "name": "nfs-server.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "nfs-utils.service": {
                "name": "nfs-utils.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfsdcld.service": {
                "name": "nfsdcld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nftables.service": {
                "name": "nftables.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nm-priv-helper.service": {
                "name": "nm-priv-helper.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pcscd.service": {
                "name": "pcscd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "podman-auto-update.service": {
                "name": "podman-auto-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-clean-transient.service": {
                "name": "podman-clean-transient.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-kube@.service": {
                "name": "podman-kube@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "podman-restart.service": {
                "name": "podman-restart.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman.service": {
                "name": "podman.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quadlet-demo-mysql-volume.service": {
                "name": "quadlet-demo-mysql-volume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "generated"
            },
            "quadlet-demo-mysql.service": {
                "name": "quadlet-demo-mysql.service",
                "source": "systemd",
                "state": "running",
                "status": "generated"
            },
            "quadlet-demo-network.service": {
                "name": "quadlet-demo-network.service",
                "source": "systemd",
                "state": "stopped",
                "status": "generated"
            },
            "quotaon-root.service": {
                "name": "quotaon-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "quotaon@.service": {
                "name": "quotaon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "restraintd.service": {
                "name": "restraintd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rngd.service": {
                "name": "rngd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpc-gssd.service": {
                "name": "rpc-gssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd.service": {
                "name": "rpc-statd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-svcgssd.service": {
                "name": "rpc-svcgssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rpcbind.service": {
                "name": "rpcbind.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpmdb-migrate.service": {
                "name": "rpmdb-migrate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rsyslog.service": {
                "name": "rsyslog.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "selinux-check-proper-disable.service": {
                "name": "selinux-check-proper-disable.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ssh-host-keys-migration.service": {
                "name": "ssh-host-keys-migration.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-unix-local@.service": {
                "name": "sshd-unix-local@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd-vsock@.service": {
                "name": "sshd-vsock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-battery-check.service": {
                "name": "systemd-battery-check.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-bless-boot.service": {
                "name": "systemd-bless-boot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-boot-random-seed.service": {
                "name": "systemd-boot-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-boot-update.service": {
                "name": "systemd-boot-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-bootctl@.service": {
                "name": "systemd-bootctl@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-confext.service": {
                "name": "systemd-confext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-creds@.service": {
                "name": "systemd-creds@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-growfs-root.service": {
                "name": "systemd-growfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-growfs@.service": {
                "name": "systemd-growfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-clear.service": {
                "name": "systemd-hibernate-clear.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate-resume.service": {
                "name": "systemd-hibernate-resume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald-sync@.service": {
                "name": "systemd-journald-sync@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-networkd-wait-online.service": {
                "name": "systemd-networkd-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-oomd.service": {
                "name": "systemd-oomd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-pcrextend@.service": {
                "name": "systemd-pcrextend@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrfs-root.service": {
                "name": "systemd-pcrfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pcrfs@.service": {
                "name": "systemd-pcrfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrlock-file-system.service": {
                "name": "systemd-pcrlock-file-system.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-code.service": {
                "name": "systemd-pcrlock-firmware-code.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-config.service": {
                "name": "systemd-pcrlock-firmware-config.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-machine-id.service": {
                "name": "systemd-pcrlock-machine-id.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-make-policy.service": {
                "name": "systemd-pcrlock-make-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-authority.service": {
                "name": "systemd-pcrlock-secureboot-authority.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-policy.service": {
                "name": "systemd-pcrlock-secureboot-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock@.service": {
                "name": "systemd-pcrlock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrmachine.service": {
                "name": "systemd-pcrmachine.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-initrd.service": {
                "name": "systemd-pcrphase-initrd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-sysinit.service": {
                "name": "systemd-pcrphase-sysinit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase.service": {
                "name": "systemd-pcrphase.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-quotacheck-root.service": {
                "name": "systemd-quotacheck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-quotacheck@.service": {
                "name": "systemd-quotacheck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-soft-reboot.service": {
                "name": "systemd-soft-reboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-sysext@.service": {
                "name": "systemd-sysext@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-sysupdate-reboot.service": {
                "name": "systemd-sysupdate-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysupdate.service": {
                "name": "systemd-sysupdate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev-early.service": {
                "name": "systemd-tmpfiles-setup-dev-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup-early.service": {
                "name": "systemd-tpm2-setup-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup.service": {
                "name": "systemd-tpm2-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-load-credentials.service": {
                "name": "systemd-udev-load-credentials.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-userdbd.service": {
                "name": "systemd-userdbd.service",
                "source": "systemd",
                "state": "running",
                "status": "indirect"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "ypbind.service": {
                "name": "ypbind.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            }
        }
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:17 -0500 (0:00:02.069)       0:02:15.011 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:44:17 -0500 (0:00:00.041)       0:02:15.052 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "[Install]\nWantedBy=default.target\n\n[Container]\nImage=quay.io/linux-system-roles/mysql:5.6\nContainerName=quadlet-demo-mysql\nVolume=quadlet-demo-mysql.volume:/var/lib/mysql\nVolume=/tmp/quadlet_demo:/var/lib/quadlet_demo:Z\nNetwork=quadlet-demo.network\nSecret=mysql-root-password-container,type=env,target=MYSQL_ROOT_PASSWORD\nHealthCmd=/bin/true\nHealthOnFailure=kill\n",
        "__podman_quadlet_template_src": "quadlet-demo-mysql.container.j2"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:44:17 -0500 (0:00:00.251)       0:02:15.303 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "absent",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:44:17 -0500 (0:00:00.062)       0:02:15.366 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_str",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:44:17 -0500 (0:00:00.057)       0:02:15.423 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo-mysql",
        "__podman_quadlet_type": "container",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:44:17 -0500 (0:00:00.072)       0:02:15.495 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:44:17 -0500 (0:00:00.078)       0:02:15.574 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:44:17 -0500 (0:00:00.042)       0:02:15.617 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:44:17 -0500 (0:00:00.040)       0:02:15.658 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:44:17 -0500 (0:00:00.052)       0:02:15.710 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.441)       0:02:16.152 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.062)       0:02:16.214 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.063)       0:02:16.277 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.062)       0:02:16.340 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.060)       0:02:16.400 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.063)       0:02:16.464 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.065)       0:02:16.530 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.056)       0:02:16.587 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.120)       0:02:16.708 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [
            "quay.io/linux-system-roles/mysql:5.6"
        ],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "quadlet-demo-mysql.service",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.073)       0:02:16.781 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.041)       0:02:16.822 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:44:18 -0500 (0:00:00.034)       0:02:16.857 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [
            "quay.io/linux-system-roles/mysql:5.6"
        ],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo-mysql.container",
        "__podman_volumes": [
            "/tmp/quadlet_demo"
        ]
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:44:19 -0500 (0:00:00.080)       0:02:16.937 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:44:19 -0500 (0:00:00.041)       0:02:16.979 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 15 February 2025  11:44:19 -0500 (0:00:00.114)       0:02:17.093 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 15 February 2025  11:44:19 -0500 (0:00:00.042)       0:02:17.136 ***** 
changed: [managed-node1] => {
    "changed": true,
    "enabled": false,
    "failed_when_result": false,
    "name": "quadlet-demo-mysql.service",
    "state": "stopped",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0",
        "ActiveEnterTimestamp": "Sat 2025-02-15 11:42:50 EST",
        "ActiveEnterTimestampMonotonic": "965845646",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "active",
        "After": "system.slice basic.target sysinit.target quadlet-demo-network.service tmp.mount systemd-journald.socket -.mount quadlet-demo-mysql-volume.service network-online.target",
        "AllowIsolate": "no",
        "AssertResult": "yes",
        "AssertTimestamp": "Sat 2025-02-15 11:42:50 EST",
        "AssertTimestampMonotonic": "965568490",
        "Before": "multi-user.target shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "2653556000",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Sat 2025-02-15 11:42:50 EST",
        "ConditionTimestampMonotonic": "965568487",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "shutdown.target",
        "ControlGroup": "/system.slice/quadlet-demo-mysql.service",
        "ControlGroupId": "155322",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "yes",
        "DelegateControllers": "cpu cpuset io memory pids",
        "Description": "quadlet-demo-mysql.service",
        "DevicePolicy": "auto",
        "DynamicUser": "no",
        "EffectiveCPUs": "0-1",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveMemoryNodes": "0",
        "EffectiveTasksMax": "22347",
        "Environment": "PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainHandoffTimestampMonotonic": "0",
        "ExecMainPID": "68602",
        "ExecMainStartTimestamp": "Sat 2025-02-15 11:42:50 EST",
        "ExecMainStartTimestampMonotonic": "965790001",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman run --name quadlet-demo-mysql --cidfile=/run/quadlet-demo-mysql.cid --replace --rm --cgroups=split --network systemd-quadlet-demo --sdnotify=conmon -d -v systemd-quadlet-demo-mysql:/var/lib/mysql -v /tmp/quadlet_demo:/var/lib/quadlet_demo:Z --secret mysql-root-password-container,type=env,target=MYSQL_ROOT_PASSWORD --health-cmd /bin/true --health-on-failure kill quay.io/linux-system-roles/mysql:5.6 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman run --name quadlet-demo-mysql --cidfile=/run/quadlet-demo-mysql.cid --replace --rm --cgroups=split --network systemd-quadlet-demo --sdnotify=conmon -d -v systemd-quadlet-demo-mysql:/var/lib/mysql -v /tmp/quadlet_demo:/var/lib/quadlet_demo:Z --secret mysql-root-password-container,type=env,target=MYSQL_ROOT_PASSWORD --health-cmd /bin/true --health-on-failure kill quay.io/linux-system-roles/mysql:5.6 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStop": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/quadlet-demo-mysql.cid ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStopEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/quadlet-demo-mysql.cid ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStopPost": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/quadlet-demo-mysql.cid ; ignore_errors=yes ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStopPostEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/quadlet-demo-mysql.cid ; flags=ignore-failure ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/run/systemd/generator/quadlet-demo-mysql.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "quadlet-demo-mysql.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestamp": "Sat 2025-02-15 11:42:50 EST",
        "InactiveExitTimestampMonotonic": "965580147",
        "InvocationID": "23afb805cec04b52b9596b61b5405878",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "mixed",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "68602",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "2674552832",
        "MemoryCurrent": "600612864",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "641531904",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "0",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "0",
        "MemoryZSwapCurrent": "0",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "quadlet-demo-mysql.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "all",
        "OOMPolicy": "continue",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "no",
        "ProtectControlGroups": "no",
        "ProtectControlGroupsEx": "no",
        "ProtectHome": "no",
        "ProtectHostname": "no",
        "ProtectKernelLogs": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "quadlet-demo-network.service quadlet-demo-mysql-volume.service sysinit.target system.slice -.mount",
        "RequiresMountsFor": "/run/containers /tmp/quadlet_demo",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "SourcePath": "/etc/containers/systemd/quadlet-demo-mysql.container",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestamp": "Sat 2025-02-15 11:42:50 EST",
        "StateChangeTimestampMonotonic": "965845646",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "running",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogIdentifier": "quadlet-demo-mysql",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "23",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "notify",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "generated",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "Wants": "network-online.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "0"
    }
}
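As an aside for readers sifting through property dumps like the one above: `systemctl show <unit>` emits the same data as plain `KEY=VALUE` lines, which a minimal sketch can turn into a dict for inspection (the sample values below are taken from the dump above; the helper name is illustrative, not part of the role).

```python
# Minimal sketch: parse `systemctl show <unit>`-style KEY=VALUE lines
# (the same properties shown in the dump above) into a dict.
def parse_unit_properties(text):
    props = {}
    for line in text.splitlines():
        if "=" in line:
            # partition keeps any "=" inside the value intact
            key, _, value = line.partition("=")
            props[key] = value
    return props

# Sample lines lifted from the quadlet-demo-mysql.service dump above
sample = "SubState=running\nRestart=no\nTasksMax=22347"
props = parse_unit_properties(sample)
```

This is how the role's own test assertions can check, for example, that `SubState` is `running` after the quadlet unit starts.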
TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 15 February 2025  11:44:21 -0500 (0:00:02.221)       0:02:19.358 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637768.9770527,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 8,
        "charset": "us-ascii",
        "checksum": "ca62b2ad3cc9afb5b5371ebbf797b9bc4fd7edd4",
        "ctime": 1739637768.9790525,
        "dev": 51714,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 528482536,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "text/plain",
        "mode": "0644",
        "mtime": 1739637768.7070515,
        "nlink": 1,
        "path": "/etc/containers/systemd/quadlet-demo-mysql.container",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 363,
        "uid": 0,
        "version": "2603552311",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
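The `stat` result above reports `atime`/`ctime`/`mtime` as Unix epoch floats, which are awkward to read directly. A minimal sketch for converting one (using the `mtime` value from the block above; the helper name is illustrative):

```python
from datetime import datetime, timezone

# Minimal sketch: convert an Ansible stat epoch float to a readable
# UTC timestamp string.
def epoch_to_utc(ts):
    return datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()

# mtime of /etc/containers/systemd/quadlet-demo-mysql.container above
stamp = epoch_to_utc(1739637768.7070515)
```

The result lands on 2025-02-15 UTC, consistent with the EST timestamps in the task headers.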
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 15 February 2025  11:44:21 -0500 (0:00:00.394)       0:02:19.753 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Slurp quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
Saturday 15 February 2025  11:44:21 -0500 (0:00:00.086)       0:02:19.840 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12
Saturday 15 February 2025  11:44:22 -0500 (0:00:00.384)       0:02:20.225 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44
Saturday 15 February 2025  11:44:22 -0500 (0:00:00.055)       0:02:20.280 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Reset raw variable] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52
Saturday 15 February 2025  11:44:22 -0500 (0:00:00.037)       0:02:20.317 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_raw": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 15 February 2025  11:44:22 -0500 (0:00:00.139)       0:02:20.456 ***** 
changed: [managed-node1] => {
    "changed": true,
    "path": "/etc/containers/systemd/quadlet-demo-mysql.container",
    "state": "absent"
}
TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 15 February 2025  11:44:22 -0500 (0:00:00.397)       0:02:20.854 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 15 February 2025  11:44:23 -0500 (0:00:00.754)       0:02:21.608 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:99
Saturday 15 February 2025  11:44:24 -0500 (0:00:00.433)       0:02:22.042 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:24 -0500 (0:00:00.050)       0:02:22.092 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_parsed": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:120
Saturday 15 February 2025  11:44:24 -0500 (0:00:00.037)       0:02:22.130 ***** 
changed: [managed-node1] => {
    "changed": true,
    "cmd": [
        "podman",
        "image",
        "prune",
        "--all",
        "-f"
    ],
    "delta": "0:00:00.253924",
    "end": "2025-02-15 11:44:24.796090",
    "rc": 0,
    "start": "2025-02-15 11:44:24.542166"
}
STDOUT:
dd3b2a5dcb48ff61113592ed5ddd762581be4387c7bc552375a2159422aa6bf5
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:131
Saturday 15 February 2025  11:44:24 -0500 (0:00:00.660)       0:02:22.791 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:44:24 -0500 (0:00:00.078)       0:02:22.869 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:44:25 -0500 (0:00:00.045)       0:02:22.914 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:44:25 -0500 (0:00:00.041)       0:02:22.955 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:141
Saturday 15 February 2025  11:44:25 -0500 (0:00:00.035)       0:02:22.991 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "images",
        "-n"
    ],
    "delta": "0:00:00.029319",
    "end": "2025-02-15 11:44:25.419387",
    "rc": 0,
    "start": "2025-02-15 11:44:25.390068"
}
STDOUT:
quay.io/libpod/registry  2.8.2       0030ba3d620c  18 months ago  24.6 MB
TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:150
Saturday 15 February 2025  11:44:25 -0500 (0:00:00.421)       0:02:23.412 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "volume",
        "ls",
        "-n"
    ],
    "delta": "0:00:00.028362",
    "end": "2025-02-15 11:44:25.847817",
    "rc": 0,
    "start": "2025-02-15 11:44:25.819455"
}
STDOUT:
local       2fefbc9190adbe5ffeef28f6e938304e8cedd541e0d51e013b7101e905a15702
local       systemd-quadlet-demo-mysql
TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:159
Saturday 15 February 2025  11:44:25 -0500 (0:00:00.427)       0:02:23.840 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "ps",
        "--noheading"
    ],
    "delta": "0:00:00.032934",
    "end": "2025-02-15 11:44:26.278952",
    "rc": 0,
    "start": "2025-02-15 11:44:26.246018"
}
STDOUT:
a5dec710fc02  quay.io/libpod/registry:2.8.2  /etc/docker/regis...  6 minutes ago  Up 6 minutes  127.0.0.1:5000->5000/tcp  podman_registry
TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:168
Saturday 15 February 2025  11:44:26 -0500 (0:00:00.441)       0:02:24.281 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "network",
        "ls",
        "-n",
        "-q"
    ],
    "delta": "0:00:00.026799",
    "end": "2025-02-15 11:44:26.733914",
    "rc": 0,
    "start": "2025-02-15 11:44:26.707115"
}
STDOUT:
podman
podman-default-kube-network
systemd-quadlet-demo
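The `podman network ls -n -q` stdout above is one network name per line; quadlet-generated networks carry a `systemd-` prefix, so filtering on it is a quick way to spot what the role still has to clean up. A minimal sketch over the exact output shown above:

```python
# Minimal sketch: pick out quadlet-managed networks (systemd- prefix)
# from `podman network ls -n -q` output, one name per line.
output = "podman\npodman-default-kube-network\nsystemd-quadlet-demo"
quadlet_networks = [n for n in output.splitlines() if n.startswith("systemd-")]
```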
TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:177
Saturday 15 February 2025  11:44:26 -0500 (0:00:00.439)       0:02:24.721 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - pods] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:187
Saturday 15 February 2025  11:44:27 -0500 (0:00:00.478)       0:02:25.200 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197
Saturday 15 February 2025  11:44:27 -0500 (0:00:00.412)       0:02:25.612 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "apt-daily.service": {
                "name": "apt-daily.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "audit-rules.service": {
                "name": "audit-rules.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "autofs.service": {
                "name": "autofs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "capsule@.service": {
                "name": "capsule@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "certmonger.service": {
                "name": "certmonger.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd-restricted.service": {
                "name": "chronyd-restricted.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-hotplugd.service": {
                "name": "cloud-init-hotplugd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "crond.service": {
                "name": "crond.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.fedoraproject.FirewallD1.service": {
                "name": "dbus-org.fedoraproject.FirewallD1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd.service": {
                "name": "dhcpcd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd@.service": {
                "name": "dhcpcd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-system-upgrade-cleanup.service": {
                "name": "dnf-system-upgrade-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "dnf-system-upgrade.service": {
                "name": "dnf-system-upgrade.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown-onfailure.service": {
                "name": "dracut-shutdown-onfailure.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ebtables.service": {
                "name": "ebtables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fips-crypto-policy-overlay.service": {
                "name": "fips-crypto-policy-overlay.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "firewalld.service": {
                "name": "firewalld.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "fsidd.service": {
                "name": "fsidd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "gssproxy.service": {
                "name": "gssproxy.service",
                "source": "systemd",
                "state": "running",
                "status": "disabled"
            },
            "hv_kvp_daemon.service": {
                "name": "hv_kvp_daemon.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ip6tables.service": {
                "name": "ip6tables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ipset.service": {
                "name": "ipset.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iptables.service": {
                "name": "iptables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "irqbalance.service": {
                "name": "irqbalance.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "kdump.service": {
                "name": "kdump.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "kvm_stat.service": {
                "name": "kvm_stat.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "logrotate.service": {
                "name": "logrotate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm-devices-import.service": {
                "name": "lvm-devices-import.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "microcode.service": {
                "name": "microcode.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@dm_mod.service": {
                "name": "modprobe@dm_mod.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@efi_pstore.service": {
                "name": "modprobe@efi_pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@loop.service": {
                "name": "modprobe@loop.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "netavark-dhcp-proxy.service": {
                "name": "netavark-dhcp-proxy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "netavark-firewalld-reload.service": {
                "name": "netavark-firewalld-reload.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-mountd.service": {
                "name": "nfs-mountd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-server.service": {
                "name": "nfs-server.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "nfs-utils.service": {
                "name": "nfs-utils.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfsdcld.service": {
                "name": "nfsdcld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nftables.service": {
                "name": "nftables.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nm-priv-helper.service": {
                "name": "nm-priv-helper.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pcscd.service": {
                "name": "pcscd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "podman-auto-update.service": {
                "name": "podman-auto-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-clean-transient.service": {
                "name": "podman-clean-transient.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-kube@.service": {
                "name": "podman-kube@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "podman-restart.service": {
                "name": "podman-restart.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman.service": {
                "name": "podman.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quadlet-demo-mysql-volume.service": {
                "name": "quadlet-demo-mysql-volume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "generated"
            },
            "quadlet-demo-network.service": {
                "name": "quadlet-demo-network.service",
                "source": "systemd",
                "state": "stopped",
                "status": "generated"
            },
            "quotaon-root.service": {
                "name": "quotaon-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "quotaon@.service": {
                "name": "quotaon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "restraintd.service": {
                "name": "restraintd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rngd.service": {
                "name": "rngd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpc-gssd.service": {
                "name": "rpc-gssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd.service": {
                "name": "rpc-statd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-svcgssd.service": {
                "name": "rpc-svcgssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rpcbind.service": {
                "name": "rpcbind.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpmdb-migrate.service": {
                "name": "rpmdb-migrate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rsyslog.service": {
                "name": "rsyslog.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "selinux-check-proper-disable.service": {
                "name": "selinux-check-proper-disable.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ssh-host-keys-migration.service": {
                "name": "ssh-host-keys-migration.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-unix-local@.service": {
                "name": "sshd-unix-local@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd-vsock@.service": {
                "name": "sshd-vsock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-battery-check.service": {
                "name": "systemd-battery-check.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-bless-boot.service": {
                "name": "systemd-bless-boot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-boot-random-seed.service": {
                "name": "systemd-boot-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-boot-update.service": {
                "name": "systemd-boot-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-bootctl@.service": {
                "name": "systemd-bootctl@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-confext.service": {
                "name": "systemd-confext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-creds@.service": {
                "name": "systemd-creds@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-growfs-root.service": {
                "name": "systemd-growfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-growfs@.service": {
                "name": "systemd-growfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-clear.service": {
                "name": "systemd-hibernate-clear.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate-resume.service": {
                "name": "systemd-hibernate-resume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald-sync@.service": {
                "name": "systemd-journald-sync@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-networkd-wait-online.service": {
                "name": "systemd-networkd-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-oomd.service": {
                "name": "systemd-oomd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-pcrextend@.service": {
                "name": "systemd-pcrextend@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrfs-root.service": {
                "name": "systemd-pcrfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pcrfs@.service": {
                "name": "systemd-pcrfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrlock-file-system.service": {
                "name": "systemd-pcrlock-file-system.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-code.service": {
                "name": "systemd-pcrlock-firmware-code.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-config.service": {
                "name": "systemd-pcrlock-firmware-config.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-machine-id.service": {
                "name": "systemd-pcrlock-machine-id.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-make-policy.service": {
                "name": "systemd-pcrlock-make-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-authority.service": {
                "name": "systemd-pcrlock-secureboot-authority.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-policy.service": {
                "name": "systemd-pcrlock-secureboot-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock@.service": {
                "name": "systemd-pcrlock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrmachine.service": {
                "name": "systemd-pcrmachine.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-initrd.service": {
                "name": "systemd-pcrphase-initrd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-sysinit.service": {
                "name": "systemd-pcrphase-sysinit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase.service": {
                "name": "systemd-pcrphase.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-quotacheck-root.service": {
                "name": "systemd-quotacheck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-quotacheck@.service": {
                "name": "systemd-quotacheck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-soft-reboot.service": {
                "name": "systemd-soft-reboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-sysext@.service": {
                "name": "systemd-sysext@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-sysupdate-reboot.service": {
                "name": "systemd-sysupdate-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysupdate.service": {
                "name": "systemd-sysupdate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev-early.service": {
                "name": "systemd-tmpfiles-setup-dev-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup-early.service": {
                "name": "systemd-tpm2-setup-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup.service": {
                "name": "systemd-tpm2-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-load-credentials.service": {
                "name": "systemd-udev-load-credentials.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-userdbd.service": {
                "name": "systemd-userdbd.service",
                "source": "systemd",
                "state": "running",
                "status": "indirect"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "ypbind.service": {
                "name": "ypbind.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            }
        }
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:29 -0500 (0:00:02.094)       0:02:27.707 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:44:29 -0500 (0:00:00.035)       0:02:27.743 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "quadlet-demo-mysql.volume",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "[Volume]",
        "__podman_quadlet_template_src": ""
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:44:29 -0500 (0:00:00.050)       0:02:27.794 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "absent",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:44:29 -0500 (0:00:00.046)       0:02:27.840 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:44:29 -0500 (0:00:00.039)       0:02:27.880 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo-mysql",
        "__podman_quadlet_type": "volume",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.054)       0:02:27.934 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.093)       0:02:28.027 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.053)       0:02:28.081 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.049)       0:02:28.131 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.060)       0:02:28.191 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.412)       0:02:28.603 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.042)       0:02:28.646 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.050)       0:02:28.696 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.116)       0:02:28.813 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.037)       0:02:28.850 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:44:30 -0500 (0:00:00.039)       0:02:28.890 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.039)       0:02:28.930 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.042)       0:02:28.972 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.040)       0:02:29.013 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "quadlet-demo-mysql-volume.service",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.088)       0:02:29.102 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.047)       0:02:29.149 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.042)       0:02:29.192 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo-mysql.volume",
        "__podman_volumes": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.085)       0:02:29.277 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.043)       0:02:29.321 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.084)       0:02:29.406 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 15 February 2025  11:44:31 -0500 (0:00:00.042)       0:02:29.448 ***** 
changed: [managed-node1] => {
    "changed": true,
    "enabled": false,
    "failed_when_result": false,
    "name": "quadlet-demo-mysql-volume.service",
    "state": "stopped",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0",
        "ActiveEnterTimestamp": "Sat 2025-02-15 11:42:38 EST",
        "ActiveEnterTimestampMonotonic": "953669880",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "active",
        "After": "network-online.target basic.target system.slice systemd-journald.socket sysinit.target -.mount",
        "AllowIsolate": "no",
        "AssertResult": "yes",
        "AssertTimestamp": "Sat 2025-02-15 11:42:38 EST",
        "AssertTimestampMonotonic": "953618205",
        "Before": "shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "39777000",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Sat 2025-02-15 11:42:38 EST",
        "ConditionTimestampMonotonic": "953618202",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "shutdown.target",
        "ControlGroupId": "154770",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "quadlet-demo-mysql-volume.service",
        "DevicePolicy": "auto",
        "DynamicUser": "no",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveTasksMax": "22347",
        "ExecMainCode": "1",
        "ExecMainExitTimestamp": "Sat 2025-02-15 11:42:38 EST",
        "ExecMainExitTimestampMonotonic": "953669682",
        "ExecMainHandoffTimestamp": "Sat 2025-02-15 11:42:38 EST",
        "ExecMainHandoffTimestampMonotonic": "953631510",
        "ExecMainPID": "67251",
        "ExecMainStartTimestamp": "Sat 2025-02-15 11:42:38 EST",
        "ExecMainStartTimestampMonotonic": "953618915",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman volume create --ignore systemd-quadlet-demo-mysql ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman volume create --ignore systemd-quadlet-demo-mysql ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/run/systemd/generator/quadlet-demo-mysql-volume.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "quadlet-demo-mysql-volume.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestamp": "Sat 2025-02-15 11:42:38 EST",
        "InactiveExitTimestampMonotonic": "953619460",
        "InvocationID": "1a9aa0f3ece745ab8490e7873ff36040",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "control-group",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "0",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "3150659584",
        "MemoryCurrent": "[not set]",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "16338944",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "[not set]",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "0",
        "MemoryZSwapCurrent": "[not set]",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "quadlet-demo-mysql-volume.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "no",
        "ProtectControlGroups": "no",
        "ProtectControlGroupsEx": "no",
        "ProtectHome": "no",
        "ProtectHostname": "no",
        "ProtectKernelLogs": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "yes",
        "RemoveIPC": "no",
        "Requires": "system.slice sysinit.target -.mount",
        "RequiresMountsFor": "/run/containers",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "SourcePath": "/etc/containers/systemd/quadlet-demo-mysql.volume",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestamp": "Sat 2025-02-15 11:42:38 EST",
        "StateChangeTimestampMonotonic": "953669880",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "exited",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogIdentifier": "quadlet-demo-mysql-volume",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "[not set]",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "infinity",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "oneshot",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "generated",
        "UtmpMode": "init",
        "Wants": "network-online.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "0"
    }
}
TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 15 February 2025  11:44:32 -0500 (0:00:00.822)       0:02:30.270 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637757.010005,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 8,
        "charset": "us-ascii",
        "checksum": "585f8cbdf0ec73000f9227dcffbef71e9552ea4a",
        "ctime": 1739637757.0120049,
        "dev": 51714,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 218104032,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "text/plain",
        "mode": "0644",
        "mtime": 1739637756.7330039,
        "nlink": 1,
        "path": "/etc/containers/systemd/quadlet-demo-mysql.volume",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 9,
        "uid": 0,
        "version": "1498893021",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 15 February 2025  11:44:32 -0500 (0:00:00.414)       0:02:30.685 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Slurp quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
Saturday 15 February 2025  11:44:32 -0500 (0:00:00.205)       0:02:30.891 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12
Saturday 15 February 2025  11:44:33 -0500 (0:00:00.404)       0:02:31.295 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44
Saturday 15 February 2025  11:44:33 -0500 (0:00:00.090)       0:02:31.386 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Reset raw variable] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52
Saturday 15 February 2025  11:44:33 -0500 (0:00:00.064)       0:02:31.450 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_raw": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 15 February 2025  11:44:33 -0500 (0:00:00.067)       0:02:31.518 ***** 
changed: [managed-node1] => {
    "changed": true,
    "path": "/etc/containers/systemd/quadlet-demo-mysql.volume",
    "state": "absent"
}
TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 15 February 2025  11:44:34 -0500 (0:00:00.421)       0:02:31.939 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 15 February 2025  11:44:34 -0500 (0:00:00.769)       0:02:32.709 ***** 
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:99
Saturday 15 February 2025  11:44:35 -0500 (0:00:00.484)       0:02:33.194 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:35 -0500 (0:00:00.049)       0:02:33.244 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_parsed": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:120
Saturday 15 February 2025  11:44:35 -0500 (0:00:00.036)       0:02:33.280 ***** 
changed: [managed-node1] => {
    "changed": true,
    "cmd": [
        "podman",
        "image",
        "prune",
        "--all",
        "-f"
    ],
    "delta": "0:00:00.028442",
    "end": "2025-02-15 11:44:35.721900",
    "rc": 0,
    "start": "2025-02-15 11:44:35.693458"
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:131
Saturday 15 February 2025  11:44:35 -0500 (0:00:00.451)       0:02:33.731 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:44:35 -0500 (0:00:00.132)       0:02:33.864 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:44:36 -0500 (0:00:00.060)       0:02:33.925 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:44:36 -0500 (0:00:00.059)       0:02:33.984 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:141
Saturday 15 February 2025  11:44:36 -0500 (0:00:00.060)       0:02:34.045 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "images",
        "-n"
    ],
    "delta": "0:00:00.029973",
    "end": "2025-02-15 11:44:36.501835",
    "rc": 0,
    "start": "2025-02-15 11:44:36.471862"
}
STDOUT:
quay.io/libpod/registry  2.8.2       0030ba3d620c  18 months ago  24.6 MB
TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:150
Saturday 15 February 2025  11:44:36 -0500 (0:00:00.527)       0:02:34.573 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "volume",
        "ls",
        "-n"
    ],
    "delta": "0:00:00.027979",
    "end": "2025-02-15 11:44:37.000301",
    "rc": 0,
    "start": "2025-02-15 11:44:36.972322"
}
STDOUT:
local       2fefbc9190adbe5ffeef28f6e938304e8cedd541e0d51e013b7101e905a15702
TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:159
Saturday 15 February 2025  11:44:37 -0500 (0:00:00.417)       0:02:34.990 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "ps",
        "--noheading"
    ],
    "delta": "0:00:00.032296",
    "end": "2025-02-15 11:44:37.426116",
    "rc": 0,
    "start": "2025-02-15 11:44:37.393820"
}
STDOUT:
a5dec710fc02  quay.io/libpod/registry:2.8.2  /etc/docker/regis...  6 minutes ago  Up 6 minutes  127.0.0.1:5000->5000/tcp  podman_registry
TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:168
Saturday 15 February 2025  11:44:37 -0500 (0:00:00.442)       0:02:35.433 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "network",
        "ls",
        "-n",
        "-q"
    ],
    "delta": "0:00:00.026920",
    "end": "2025-02-15 11:44:37.885730",
    "rc": 0,
    "start": "2025-02-15 11:44:37.858810"
}
STDOUT:
podman
podman-default-kube-network
systemd-quadlet-demo
TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:177
Saturday 15 February 2025  11:44:37 -0500 (0:00:00.475)       0:02:35.908 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - pods] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:187
Saturday 15 February 2025  11:44:38 -0500 (0:00:00.470)       0:02:36.378 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197
Saturday 15 February 2025  11:44:38 -0500 (0:00:00.471)       0:02:36.850 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "apt-daily.service": {
                "name": "apt-daily.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "audit-rules.service": {
                "name": "audit-rules.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "autofs.service": {
                "name": "autofs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "capsule@.service": {
                "name": "capsule@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "certmonger.service": {
                "name": "certmonger.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd-restricted.service": {
                "name": "chronyd-restricted.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-hotplugd.service": {
                "name": "cloud-init-hotplugd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "crond.service": {
                "name": "crond.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.fedoraproject.FirewallD1.service": {
                "name": "dbus-org.fedoraproject.FirewallD1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd.service": {
                "name": "dhcpcd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd@.service": {
                "name": "dhcpcd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-system-upgrade-cleanup.service": {
                "name": "dnf-system-upgrade-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "dnf-system-upgrade.service": {
                "name": "dnf-system-upgrade.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown-onfailure.service": {
                "name": "dracut-shutdown-onfailure.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ebtables.service": {
                "name": "ebtables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fips-crypto-policy-overlay.service": {
                "name": "fips-crypto-policy-overlay.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "firewalld.service": {
                "name": "firewalld.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "fsidd.service": {
                "name": "fsidd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "gssproxy.service": {
                "name": "gssproxy.service",
                "source": "systemd",
                "state": "running",
                "status": "disabled"
            },
            "hv_kvp_daemon.service": {
                "name": "hv_kvp_daemon.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ip6tables.service": {
                "name": "ip6tables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ipset.service": {
                "name": "ipset.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iptables.service": {
                "name": "iptables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "irqbalance.service": {
                "name": "irqbalance.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "kdump.service": {
                "name": "kdump.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "kvm_stat.service": {
                "name": "kvm_stat.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "logrotate.service": {
                "name": "logrotate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm-devices-import.service": {
                "name": "lvm-devices-import.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "microcode.service": {
                "name": "microcode.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@dm_mod.service": {
                "name": "modprobe@dm_mod.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@efi_pstore.service": {
                "name": "modprobe@efi_pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@loop.service": {
                "name": "modprobe@loop.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "netavark-dhcp-proxy.service": {
                "name": "netavark-dhcp-proxy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "netavark-firewalld-reload.service": {
                "name": "netavark-firewalld-reload.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-mountd.service": {
                "name": "nfs-mountd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-server.service": {
                "name": "nfs-server.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "nfs-utils.service": {
                "name": "nfs-utils.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfsdcld.service": {
                "name": "nfsdcld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nftables.service": {
                "name": "nftables.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nm-priv-helper.service": {
                "name": "nm-priv-helper.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pcscd.service": {
                "name": "pcscd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "podman-auto-update.service": {
                "name": "podman-auto-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-clean-transient.service": {
                "name": "podman-clean-transient.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-kube@.service": {
                "name": "podman-kube@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "podman-restart.service": {
                "name": "podman-restart.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman.service": {
                "name": "podman.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quadlet-demo-network.service": {
                "name": "quadlet-demo-network.service",
                "source": "systemd",
                "state": "stopped",
                "status": "generated"
            },
            "quotaon-root.service": {
                "name": "quotaon-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "quotaon@.service": {
                "name": "quotaon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "restraintd.service": {
                "name": "restraintd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rngd.service": {
                "name": "rngd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpc-gssd.service": {
                "name": "rpc-gssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd.service": {
                "name": "rpc-statd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-svcgssd.service": {
                "name": "rpc-svcgssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rpcbind.service": {
                "name": "rpcbind.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpmdb-migrate.service": {
                "name": "rpmdb-migrate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rsyslog.service": {
                "name": "rsyslog.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "selinux-check-proper-disable.service": {
                "name": "selinux-check-proper-disable.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ssh-host-keys-migration.service": {
                "name": "ssh-host-keys-migration.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-unix-local@.service": {
                "name": "sshd-unix-local@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd-vsock@.service": {
                "name": "sshd-vsock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-battery-check.service": {
                "name": "systemd-battery-check.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-bless-boot.service": {
                "name": "systemd-bless-boot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-boot-random-seed.service": {
                "name": "systemd-boot-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-boot-update.service": {
                "name": "systemd-boot-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-bootctl@.service": {
                "name": "systemd-bootctl@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-confext.service": {
                "name": "systemd-confext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-creds@.service": {
                "name": "systemd-creds@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-growfs-root.service": {
                "name": "systemd-growfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-growfs@.service": {
                "name": "systemd-growfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-clear.service": {
                "name": "systemd-hibernate-clear.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate-resume.service": {
                "name": "systemd-hibernate-resume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald-sync@.service": {
                "name": "systemd-journald-sync@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-networkd-wait-online.service": {
                "name": "systemd-networkd-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-oomd.service": {
                "name": "systemd-oomd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-pcrextend@.service": {
                "name": "systemd-pcrextend@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrfs-root.service": {
                "name": "systemd-pcrfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pcrfs@.service": {
                "name": "systemd-pcrfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrlock-file-system.service": {
                "name": "systemd-pcrlock-file-system.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-code.service": {
                "name": "systemd-pcrlock-firmware-code.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-config.service": {
                "name": "systemd-pcrlock-firmware-config.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-machine-id.service": {
                "name": "systemd-pcrlock-machine-id.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-make-policy.service": {
                "name": "systemd-pcrlock-make-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-authority.service": {
                "name": "systemd-pcrlock-secureboot-authority.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-policy.service": {
                "name": "systemd-pcrlock-secureboot-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock@.service": {
                "name": "systemd-pcrlock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrmachine.service": {
                "name": "systemd-pcrmachine.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-initrd.service": {
                "name": "systemd-pcrphase-initrd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-sysinit.service": {
                "name": "systemd-pcrphase-sysinit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase.service": {
                "name": "systemd-pcrphase.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-quotacheck-root.service": {
                "name": "systemd-quotacheck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-quotacheck@.service": {
                "name": "systemd-quotacheck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-soft-reboot.service": {
                "name": "systemd-soft-reboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-sysext@.service": {
                "name": "systemd-sysext@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-sysupdate-reboot.service": {
                "name": "systemd-sysupdate-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysupdate.service": {
                "name": "systemd-sysupdate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev-early.service": {
                "name": "systemd-tmpfiles-setup-dev-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup-early.service": {
                "name": "systemd-tpm2-setup-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup.service": {
                "name": "systemd-tpm2-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-load-credentials.service": {
                "name": "systemd-udev-load-credentials.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-userdbd.service": {
                "name": "systemd-userdbd.service",
                "source": "systemd",
                "state": "running",
                "status": "indirect"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "ypbind.service": {
                "name": "ypbind.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            }
        }
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:42 -0500 (0:00:03.125)       0:02:39.975 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 15 February 2025  11:44:42 -0500 (0:00:00.042)       0:02:40.018 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_file_src": "quadlet-demo.network",
        "__podman_quadlet_spec": {},
        "__podman_quadlet_str": "[Network]\nSubnet=192.168.30.0/24\nGateway=192.168.30.1\nLabel=app=wordpress",
        "__podman_quadlet_template_src": ""
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 15 February 2025  11:44:42 -0500 (0:00:00.057)       0:02:40.076 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_continue_if_pull_fails": false,
        "__podman_pull_image": true,
        "__podman_state": "absent",
        "__podman_systemd_unit_scope": "",
        "__podman_user": "root"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 15 February 2025  11:44:42 -0500 (0:00:00.056)       0:02:40.132 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_quadlet_file_src",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 15 February 2025  11:44:42 -0500 (0:00:00.037)       0:02:40.169 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_name": "quadlet-demo",
        "__podman_quadlet_type": "network",
        "__podman_rootless": false
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 15 February 2025  11:44:42 -0500 (0:00:00.054)       0:02:40.223 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 15 February 2025  11:44:42 -0500 (0:00:00.068)       0:02:40.292 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 15 February 2025  11:44:42 -0500 (0:00:00.047)       0:02:40.339 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 15 February 2025  11:44:42 -0500 (0:00:00.155)       0:02:40.495 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_group": "0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:31
Saturday 15 February 2025  11:44:42 -0500 (0:00:00.079)       0:02:40.574 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637455.0608227,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "89ab10a2a8fa81bcc0c1df0058f200469ce46f97",
        "ctime": 1739637449.0297961,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 9120230,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1730678400.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15744,
        "uid": 0,
        "version": "2319697789",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subuids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:42
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.452)       0:02:41.026 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Check with getsubids for user subgids] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:47
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.062)       0:02:41.089 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:52
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.062)       0:02:41.152 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_user not in [\"root\", \"0\"]",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:65
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.064)       0:02:41.216 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:70
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.061)       0:02:41.277 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:75
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.060)       0:02:41.337 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:85
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.061)       0:02:41.399 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Fail if user not in subgid file] ******
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:92
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.061)       0:02:41.461 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.059)       0:02:41.520 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_activate_systemd_unit": true,
        "__podman_images_found": [],
        "__podman_kube_yamls_raw": "",
        "__podman_service_name": "quadlet-demo-network.service",
        "__podman_systemd_scope": "system",
        "__podman_user_home_dir": "/root",
        "__podman_xdg_runtime_dir": "/run/user/0"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.104)       0:02:41.625 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_path": "/etc/containers/systemd"
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.066)       0:02:41.691 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.052)       0:02:41.744 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_images": [],
        "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo.network",
        "__podman_volumes": []
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:105
Saturday 15 February 2025  11:44:43 -0500 (0:00:00.129)       0:02:41.873 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:112
Saturday 15 February 2025  11:44:44 -0500 (0:00:00.073)       0:02:41.947 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 15 February 2025  11:44:44 -0500 (0:00:00.237)       0:02:42.185 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 15 February 2025  11:44:44 -0500 (0:00:00.037)       0:02:42.222 ***** 
changed: [managed-node1] => {
    "changed": true,
    "enabled": false,
    "failed_when_result": false,
    "name": "quadlet-demo-network.service",
    "state": "stopped",
    "status": {
        "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0",
        "ActiveEnterTimestamp": "Sat 2025-02-15 11:42:33 EST",
        "ActiveEnterTimestampMonotonic": "948973259",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "active",
        "After": "system.slice network-online.target basic.target -.mount sysinit.target systemd-journald.socket",
        "AllowIsolate": "no",
        "AssertResult": "yes",
        "AssertTimestamp": "Sat 2025-02-15 11:42:33 EST",
        "AssertTimestampMonotonic": "948930032",
        "Before": "shutdown.target",
        "BindLogSockets": "no",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "CPUAccounting": "yes",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "38085000",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanLiveMount": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore",
        "CleanResult": "success",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Sat 2025-02-15 11:42:33 EST",
        "ConditionTimestampMonotonic": "948930028",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "shutdown.target",
        "ControlGroupId": "154731",
        "ControlPID": "0",
        "CoredumpFilter": "0x33",
        "CoredumpReceive": "no",
        "DebugInvocation": "no",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "DefaultStartupMemoryLow": "0",
        "Delegate": "no",
        "Description": "quadlet-demo-network.service",
        "DevicePolicy": "auto",
        "DynamicUser": "no",
        "EffectiveMemoryHigh": "3698229248",
        "EffectiveMemoryMax": "3698229248",
        "EffectiveTasksMax": "22347",
        "ExecMainCode": "1",
        "ExecMainExitTimestamp": "Sat 2025-02-15 11:42:33 EST",
        "ExecMainExitTimestampMonotonic": "948973018",
        "ExecMainHandoffTimestamp": "Sat 2025-02-15 11:42:33 EST",
        "ExecMainHandoffTimestampMonotonic": "948941510",
        "ExecMainPID": "66424",
        "ExecMainStartTimestamp": "Sat 2025-02-15 11:42:33 EST",
        "ExecMainStartTimestampMonotonic": "948930778",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman network create --ignore --subnet 192.168.30.0/24 --gateway 192.168.30.1 --label app=wordpress systemd-quadlet-demo ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman network create --ignore --subnet 192.168.30.0/24 --gateway 192.168.30.1 --label app=wordpress systemd-quadlet-demo ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
        "ExitType": "main",
        "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FileDescriptorStorePreserve": "restart",
        "FinalKillSignal": "9",
        "FragmentPath": "/run/systemd/generator/quadlet-demo-network.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOReadBytes": "[not set]",
        "IOReadOperations": "[not set]",
        "IOSchedulingClass": "2",
        "IOSchedulingPriority": "4",
        "IOWeight": "[not set]",
        "IOWriteBytes": "[not set]",
        "IOWriteOperations": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "[no data]",
        "IPEgressPackets": "[no data]",
        "IPIngressBytes": "[no data]",
        "IPIngressPackets": "[no data]",
        "Id": "quadlet-demo-network.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestamp": "Sat 2025-02-15 11:42:33 EST",
        "InactiveExitTimestampMonotonic": "948931299",
        "InvocationID": "8405c19c0bbc49f194e87ad950c85609",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "control-group",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "infinity",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "8388608",
        "LimitMEMLOCKSoft": "8388608",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "524288",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "13967",
        "LimitNPROCSoft": "13967",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "13967",
        "LimitSIGPENDINGSoft": "13967",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LiveMountResult": "success",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "0",
        "ManagedOOMMemoryPressure": "auto",
        "ManagedOOMMemoryPressureDurationUSec": "[not set]",
        "ManagedOOMMemoryPressureLimit": "0",
        "ManagedOOMPreference": "none",
        "ManagedOOMSwap": "auto",
        "MemoryAccounting": "yes",
        "MemoryAvailable": "3196116992",
        "MemoryCurrent": "[not set]",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryKSM": "no",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemoryPeak": "16297984",
        "MemoryPressureThresholdUSec": "200ms",
        "MemoryPressureWatch": "auto",
        "MemorySwapCurrent": "[not set]",
        "MemorySwapMax": "infinity",
        "MemorySwapPeak": "0",
        "MemoryZSwapCurrent": "[not set]",
        "MemoryZSwapMax": "infinity",
        "MemoryZSwapWriteback": "yes",
        "MountAPIVFS": "no",
        "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAPolicy": "n/a",
        "Names": "quadlet-demo-network.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMPolicy": "stop",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "OnSuccessJobMode": "fail",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateIPC": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivatePIDs": "no",
        "PrivateTmp": "no",
        "PrivateTmpEx": "no",
        "PrivateUsers": "no",
        "PrivateUsersEx": "no",
        "ProcSubset": "all",
        "ProtectClock": "no",
        "ProtectControlGroups": "no",
        "ProtectControlGroupsEx": "no",
        "ProtectHome": "no",
        "ProtectHostname": "no",
        "ProtectKernelLogs": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectProc": "default",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "ReloadResult": "success",
        "ReloadSignal": "1",
        "RemainAfterExit": "yes",
        "RemoveIPC": "no",
        "Requires": "-.mount sysinit.target system.slice",
        "RequiresMountsFor": "/run/containers",
        "Restart": "no",
        "RestartKillSignal": "15",
        "RestartMaxDelayUSec": "infinity",
        "RestartMode": "normal",
        "RestartSteps": "0",
        "RestartUSec": "100ms",
        "RestartUSecNext": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RootEphemeral": "no",
        "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "RuntimeRandomizedExtraUSec": "0",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "SetLoginEnvironment": "no",
        "Slice": "system.slice",
        "SourcePath": "/etc/containers/systemd/quadlet-demo.network",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StartupMemoryHigh": "infinity",
        "StartupMemoryLow": "0",
        "StartupMemoryMax": "infinity",
        "StartupMemorySwapMax": "infinity",
        "StartupMemoryZSwapMax": "infinity",
        "StateChangeTimestamp": "Sat 2025-02-15 11:42:33 EST",
        "StateChangeTimestampMonotonic": "948973259",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "exited",
        "SuccessAction": "none",
        "SurviveFinalKillSignal": "no",
        "SyslogFacility": "3",
        "SyslogIdentifier": "quadlet-demo-network",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "2147483646",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "[not set]",
        "TasksMax": "22347",
        "TimeoutAbortUSec": "1min 30s",
        "TimeoutCleanUSec": "infinity",
        "TimeoutStartFailureMode": "terminate",
        "TimeoutStartUSec": "infinity",
        "TimeoutStopFailureMode": "terminate",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "oneshot",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "generated",
        "UtmpMode": "init",
        "Wants": "network-online.target",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "0"
    }
}
TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 15 February 2025  11:44:45 -0500 (0:00:00.813)       0:02:43.036 ***** 
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1739637752.2849863,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 8,
        "charset": "us-ascii",
        "checksum": "e57c08d49aff4bae8daab138d913aeddaa8682a0",
        "ctime": 1739637752.286986,
        "dev": 51714,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 171966675,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "text/plain",
        "mode": "0644",
        "mtime": 1739637751.8169842,
        "nlink": 1,
        "path": "/etc/containers/systemd/quadlet-demo.network",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 74,
        "uid": 0,
        "version": "1483552506",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 15 February 2025  11:44:45 -0500 (0:00:00.392)       0:02:43.429 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Slurp quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
Saturday 15 February 2025  11:44:45 -0500 (0:00:00.069)       0:02:43.499 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12
Saturday 15 February 2025  11:44:45 -0500 (0:00:00.389)       0:02:43.888 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44
Saturday 15 February 2025  11:44:46 -0500 (0:00:00.064)       0:02:43.953 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Reset raw variable] *******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52
Saturday 15 February 2025  11:44:46 -0500 (0:00:00.044)       0:02:43.997 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_raw": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 15 February 2025  11:44:46 -0500 (0:00:00.049)       0:02:44.046 ***** 
changed: [managed-node1] => {
    "changed": true,
    "path": "/etc/containers/systemd/quadlet-demo.network",
    "state": "absent"
}
TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 15 February 2025  11:44:46 -0500 (0:00:00.386)       0:02:44.433 ***** 
ok: [managed-node1] => {
    "changed": false,
    "name": null,
    "status": {}
}
TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 15 February 2025  11:44:47 -0500 (0:00:00.767)       0:02:45.200 ***** 
changed: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:99
Saturday 15 February 2025  11:44:47 -0500 (0:00:00.460)       0:02:45.660 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:47 -0500 (0:00:00.056)       0:02:45.717 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_quadlet_parsed": null
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:120
Saturday 15 February 2025  11:44:47 -0500 (0:00:00.060)       0:02:45.777 ***** 
changed: [managed-node1] => {
    "changed": true,
    "cmd": [
        "podman",
        "image",
        "prune",
        "--all",
        "-f"
    ],
    "delta": "0:00:00.029001",
    "end": "2025-02-15 11:44:48.217043",
    "rc": 0,
    "start": "2025-02-15 11:44:48.188042"
}
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:131
Saturday 15 February 2025  11:44:48 -0500 (0:00:00.444)       0:02:46.222 ***** 
included: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed-node1
TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 15 February 2025  11:44:48 -0500 (0:00:00.172)       0:02:46.395 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 15 February 2025  11:44:48 -0500 (0:00:00.044)       0:02:46.439 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 15 February 2025  11:44:48 -0500 (0:00:00.037)       0:02:46.477 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_rootless | bool",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:141
Saturday 15 February 2025  11:44:48 -0500 (0:00:00.034)       0:02:46.512 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "images",
        "-n"
    ],
    "delta": "0:00:00.030617",
    "end": "2025-02-15 11:44:48.948666",
    "rc": 0,
    "start": "2025-02-15 11:44:48.918049"
}
STDOUT:
quay.io/libpod/registry  2.8.2       0030ba3d620c  18 months ago  24.6 MB
TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:150
Saturday 15 February 2025  11:44:49 -0500 (0:00:00.446)       0:02:46.958 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "volume",
        "ls",
        "-n"
    ],
    "delta": "0:00:00.027003",
    "end": "2025-02-15 11:44:49.419866",
    "rc": 0,
    "start": "2025-02-15 11:44:49.392863"
}
STDOUT:
local       2fefbc9190adbe5ffeef28f6e938304e8cedd541e0d51e013b7101e905a15702
TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:159
Saturday 15 February 2025  11:44:49 -0500 (0:00:00.479)       0:02:47.437 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "ps",
        "--noheading"
    ],
    "delta": "0:00:00.033299",
    "end": "2025-02-15 11:44:49.906546",
    "rc": 0,
    "start": "2025-02-15 11:44:49.873247"
}
STDOUT:
a5dec710fc02  quay.io/libpod/registry:2.8.2  /etc/docker/regis...  6 minutes ago  Up 6 minutes  127.0.0.1:5000->5000/tcp  podman_registry
TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:168
Saturday 15 February 2025  11:44:49 -0500 (0:00:00.456)       0:02:47.893 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "network",
        "ls",
        "-n",
        "-q"
    ],
    "delta": "0:00:00.028716",
    "end": "2025-02-15 11:44:50.323489",
    "rc": 0,
    "start": "2025-02-15 11:44:50.294773"
}
STDOUT:
podman
podman-default-kube-network
TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:177
Saturday 15 February 2025  11:44:50 -0500 (0:00:00.437)       0:02:48.330 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - pods] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:187
Saturday 15 February 2025  11:44:50 -0500 (0:00:00.448)       0:02:48.779 ***** 
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197
Saturday 15 February 2025  11:44:51 -0500 (0:00:00.430)       0:02:49.210 ***** 
ok: [managed-node1] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "apt-daily.service": {
                "name": "apt-daily.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "audit-rules.service": {
                "name": "audit-rules.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "autofs.service": {
                "name": "autofs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "capsule@.service": {
                "name": "capsule@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "certmonger.service": {
                "name": "certmonger.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd-restricted.service": {
                "name": "chronyd-restricted.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-hotplugd.service": {
                "name": "cloud-init-hotplugd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "crond.service": {
                "name": "crond.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.fedoraproject.FirewallD1.service": {
                "name": "dbus-org.fedoraproject.FirewallD1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd.service": {
                "name": "dhcpcd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dhcpcd@.service": {
                "name": "dhcpcd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-system-upgrade-cleanup.service": {
                "name": "dnf-system-upgrade-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "dnf-system-upgrade.service": {
                "name": "dnf-system-upgrade.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown-onfailure.service": {
                "name": "dracut-shutdown-onfailure.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ebtables.service": {
                "name": "ebtables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fips-crypto-policy-overlay.service": {
                "name": "fips-crypto-policy-overlay.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "firewalld.service": {
                "name": "firewalld.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "fsidd.service": {
                "name": "fsidd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "gssproxy.service": {
                "name": "gssproxy.service",
                "source": "systemd",
                "state": "running",
                "status": "disabled"
            },
            "hv_kvp_daemon.service": {
                "name": "hv_kvp_daemon.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ip6tables.service": {
                "name": "ip6tables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ipset.service": {
                "name": "ipset.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iptables.service": {
                "name": "iptables.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "irqbalance.service": {
                "name": "irqbalance.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "kdump.service": {
                "name": "kdump.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "kvm_stat.service": {
                "name": "kvm_stat.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "logrotate.service": {
                "name": "logrotate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm-devices-import.service": {
                "name": "lvm-devices-import.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "microcode.service": {
                "name": "microcode.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@dm_mod.service": {
                "name": "modprobe@dm_mod.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@efi_pstore.service": {
                "name": "modprobe@efi_pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@loop.service": {
                "name": "modprobe@loop.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "netavark-dhcp-proxy.service": {
                "name": "netavark-dhcp-proxy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "netavark-firewalld-reload.service": {
                "name": "netavark-firewalld-reload.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-mountd.service": {
                "name": "nfs-mountd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-server.service": {
                "name": "nfs-server.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "nfs-utils.service": {
                "name": "nfs-utils.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfsdcld.service": {
                "name": "nfsdcld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nftables.service": {
                "name": "nftables.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nm-priv-helper.service": {
                "name": "nm-priv-helper.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pcscd.service": {
                "name": "pcscd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "podman-auto-update.service": {
                "name": "podman-auto-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-clean-transient.service": {
                "name": "podman-clean-transient.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman-kube@.service": {
                "name": "podman-kube@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "podman-restart.service": {
                "name": "podman-restart.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "podman.service": {
                "name": "podman.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quotaon-root.service": {
                "name": "quotaon-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "quotaon@.service": {
                "name": "quotaon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "restraintd.service": {
                "name": "restraintd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rngd.service": {
                "name": "rngd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpc-gssd.service": {
                "name": "rpc-gssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd.service": {
                "name": "rpc-statd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-svcgssd.service": {
                "name": "rpc-svcgssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rpcbind.service": {
                "name": "rpcbind.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpmdb-migrate.service": {
                "name": "rpmdb-migrate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rsyslog.service": {
                "name": "rsyslog.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "selinux-check-proper-disable.service": {
                "name": "selinux-check-proper-disable.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ssh-host-keys-migration.service": {
                "name": "ssh-host-keys-migration.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-unix-local@.service": {
                "name": "sshd-unix-local@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd-vsock@.service": {
                "name": "sshd-vsock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-battery-check.service": {
                "name": "systemd-battery-check.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-bless-boot.service": {
                "name": "systemd-bless-boot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-boot-random-seed.service": {
                "name": "systemd-boot-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-boot-update.service": {
                "name": "systemd-boot-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-bootctl@.service": {
                "name": "systemd-bootctl@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-confext.service": {
                "name": "systemd-confext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-creds@.service": {
                "name": "systemd-creds@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-growfs-root.service": {
                "name": "systemd-growfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-growfs@.service": {
                "name": "systemd-growfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-clear.service": {
                "name": "systemd-hibernate-clear.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate-resume.service": {
                "name": "systemd-hibernate-resume.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald-sync@.service": {
                "name": "systemd-journald-sync@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-networkd-wait-online.service": {
                "name": "systemd-networkd-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-oomd.service": {
                "name": "systemd-oomd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-pcrextend@.service": {
                "name": "systemd-pcrextend@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrfs-root.service": {
                "name": "systemd-pcrfs-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pcrfs@.service": {
                "name": "systemd-pcrfs@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrlock-file-system.service": {
                "name": "systemd-pcrlock-file-system.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-code.service": {
                "name": "systemd-pcrlock-firmware-code.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-firmware-config.service": {
                "name": "systemd-pcrlock-firmware-config.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-machine-id.service": {
                "name": "systemd-pcrlock-machine-id.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-make-policy.service": {
                "name": "systemd-pcrlock-make-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-authority.service": {
                "name": "systemd-pcrlock-secureboot-authority.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock-secureboot-policy.service": {
                "name": "systemd-pcrlock-secureboot-policy.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-pcrlock@.service": {
                "name": "systemd-pcrlock@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-pcrmachine.service": {
                "name": "systemd-pcrmachine.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-initrd.service": {
                "name": "systemd-pcrphase-initrd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase-sysinit.service": {
                "name": "systemd-pcrphase-sysinit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-pcrphase.service": {
                "name": "systemd-pcrphase.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-quotacheck-root.service": {
                "name": "systemd-quotacheck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-quotacheck@.service": {
                "name": "systemd-quotacheck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-soft-reboot.service": {
                "name": "systemd-soft-reboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-sysext@.service": {
                "name": "systemd-sysext@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-sysupdate-reboot.service": {
                "name": "systemd-sysupdate-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysupdate.service": {
                "name": "systemd-sysupdate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev-early.service": {
                "name": "systemd-tmpfiles-setup-dev-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup-early.service": {
                "name": "systemd-tpm2-setup-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tpm2-setup.service": {
                "name": "systemd-tpm2-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-load-credentials.service": {
                "name": "systemd-udev-load-credentials.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-userdbd.service": {
                "name": "systemd-userdbd.service",
                "source": "systemd",
                "state": "running",
                "status": "indirect"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "ypbind.service": {
                "name": "ypbind.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            }
        }
    },
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:116
Saturday 15 February 2025  11:44:53 -0500 (0:00:02.115)       0:02:51.326 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__podman_state != \"absent\"",
    "skip_reason": "Conditional result was False"
}
TASK [fedora.linux_system_roles.podman : Cancel linger] ************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:196
Saturday 15 February 2025  11:44:53 -0500 (0:00:00.057)       0:02:51.383 ***** 
skipping: [managed-node1] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}
TASK [fedora.linux_system_roles.podman : Handle credential files - absent] *****
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:202
Saturday 15 February 2025  11:44:53 -0500 (0:00:00.055)       0:02:51.439 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ********
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:211
Saturday 15 February 2025  11:44:53 -0500 (0:00:00.056)       0:02:51.495 ***** 
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
TASK [Ensure no resources] *****************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:188
Saturday 15 February 2025  11:44:53 -0500 (0:00:00.177)       0:02:51.673 ***** 
fatal: [managed-node1]: FAILED! => {
    "assertion": "__podman_test_debug_images.stdout == \"\"",
    "changed": false,
    "evaluated_to": false
}
MSG:
Assertion failed
TASK [Debug] *******************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:199
Saturday 15 February 2025  11:44:53 -0500 (0:00:00.066)       0:02:51.739 ***** 
ok: [managed-node1] => {
    "changed": false,
    "cmd": "exec 1>&2\nset -x\nset -o pipefail\nsystemctl list-units --plain -l --all | grep quadlet || :\nsystemctl list-unit-files --all | grep quadlet || :\nsystemctl list-units --plain --failed -l --all | grep quadlet || :\n",
    "delta": "0:00:00.378121",
    "end": "2025-02-15 11:44:54.527604",
    "rc": 0,
    "start": "2025-02-15 11:44:54.149483"
}
STDERR:
+ set -o pipefail
+ systemctl list-units --plain -l --all
+ grep quadlet
+ :
+ systemctl list-unit-files --all
+ grep quadlet
+ :
+ grep quadlet
+ systemctl list-units --plain --failed -l --all
+ :
TASK [Get journald] ************************************************************
task path: /tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:209
Saturday 15 February 2025  11:44:54 -0500 (0:00:00.773)       0:02:52.512 ***** 
fatal: [managed-node1]: FAILED! => {
    "changed": false,
    "cmd": [
        "journalctl",
        "-ex"
    ],
    "delta": "0:00:00.032661",
    "end": "2025-02-15 11:44:54.935902",
    "failed_when_result": true,
    "rc": 0,
    "start": "2025-02-15 11:44:54.903241"
}
STDOUT:
Feb 15 11:41:35 managed-node1 sudo[56249]: pam_unix(sudo:session): session opened for user auth_test_user1(uid=2001) by root(uid=0)
Feb 15 11:41:35 managed-node1 python3.12[56252]: ansible-containers.podman.podman_play Invoked with state=absent kube_file=/home/auth_test_user1/.config/containers/ansible-kubernetes.d/auth_test_1_kube.yml executable=podman annotation=None kube_file_content=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None debug=None quiet=None recreate=None userns=None log_level=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 15 11:41:35 managed-node1 python3.12[56252]: ansible-containers.podman.podman_play version: 5.3.1, kube file /home/auth_test_user1/.config/containers/ansible-kubernetes.d/auth_test_1_kube.yml
Feb 15 11:41:35 managed-node1 systemd[39720]: Started podman-56260.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 2520.
Feb 15 11:41:35 managed-node1 sudo[56249]: pam_unix(sudo:session): session closed for user auth_test_user1
Feb 15 11:41:36 managed-node1 python3.12[56398]: ansible-file Invoked with path=/home/auth_test_user1/.config/containers/ansible-kubernetes.d/auth_test_1_kube.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:41:37 managed-node1 python3.12[56529]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:41:38 managed-node1 python3.12[56662]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids auth_test_user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:38 managed-node1 python3.12[56794]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g auth_test_user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:39 managed-node1 python3.12[56926]: ansible-stat Invoked with path=/run/user/2001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:41:40 managed-node1 sudo[57101]:     root : TTY=pts/0 ; PWD=/root ; USER=auth_test_user1 ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isworjfwlotocrtbysathwcyfbunvzrp ; XDG_RUNTIME_DIR=/run/user/2001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1739637700.082501-18621-172744025236423/AnsiballZ_systemd.py'
Feb 15 11:41:40 managed-node1 sudo[57101]: pam_unix(sudo:session): session opened for user auth_test_user1(uid=2001) by root(uid=0)
Feb 15 11:41:40 managed-node1 python3.12[57104]: ansible-systemd Invoked with name=auth_test_1_quadlet.service scope=user state=stopped enabled=False force=True daemon_reload=False daemon_reexec=False no_block=False masked=None
Feb 15 11:41:40 managed-node1 systemd[39720]: Reload requested from client PID 57107 ('systemctl')...
Feb 15 11:41:40 managed-node1 systemd[39720]: Reloading...
Feb 15 11:41:40 managed-node1 systemd[39720]: Reloading finished in 47 ms.
Feb 15 11:41:40 managed-node1 sudo[57101]: pam_unix(sudo:session): session closed for user auth_test_user1
Feb 15 11:41:41 managed-node1 python3.12[57247]: ansible-stat Invoked with path=/home/auth_test_user1/.config/containers/systemd/auth_test_1_quadlet.container follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:41:42 managed-node1 python3.12[57511]: ansible-file Invoked with path=/home/auth_test_user1/.config/containers/systemd/auth_test_1_quadlet.container state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:41:42 managed-node1 sudo[57684]:     root : TTY=pts/0 ; PWD=/root ; USER=auth_test_user1 ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpgicuflssmzfvicgplnagtnglacwcoa ; XDG_RUNTIME_DIR=/run/user/2001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1739637702.4887562-18734-230170832246732/AnsiballZ_systemd.py'
Feb 15 11:41:42 managed-node1 sudo[57684]: pam_unix(sudo:session): session opened for user auth_test_user1(uid=2001) by root(uid=0)
Feb 15 11:41:43 managed-node1 python3.12[57687]: ansible-systemd Invoked with daemon_reload=True scope=user daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:41:43 managed-node1 systemd[39720]: Reload requested from client PID 57688 ('systemctl')...
Feb 15 11:41:43 managed-node1 systemd[39720]: Reloading...
Feb 15 11:41:43 managed-node1 systemd[39720]: Reloading finished in 45 ms.
Feb 15 11:41:43 managed-node1 sudo[57684]: pam_unix(sudo:session): session closed for user auth_test_user1
Feb 15 11:41:43 managed-node1 sudo[57870]:     root : TTY=pts/0 ; PWD=/root ; USER=auth_test_user1 ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulirhspahzrttotkcnpjazopwopfxqkc ; XDG_RUNTIME_DIR=/run/user/2001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1739637703.3652833-18780-133744402841072/AnsiballZ_command.py'
Feb 15 11:41:43 managed-node1 sudo[57870]: pam_unix(sudo:session): session opened for user auth_test_user1(uid=2001) by root(uid=0)
Feb 15 11:41:43 managed-node1 systemd[39720]: Started podman-57874.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 2524.
Feb 15 11:41:43 managed-node1 sudo[57870]: pam_unix(sudo:session): session closed for user auth_test_user1
Feb 15 11:41:45 managed-node1 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories...
░░ Subject: A start job for unit systemd-tmpfiles-clean.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit systemd-tmpfiles-clean.service has begun execution.
░░ 
░░ The job identifier is 7084.
Feb 15 11:41:45 managed-node1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit systemd-tmpfiles-clean.service has successfully entered the 'dead' state.
Feb 15 11:41:45 managed-node1 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories.
░░ Subject: A start job for unit systemd-tmpfiles-clean.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit systemd-tmpfiles-clean.service has finished successfully.
░░ 
░░ The job identifier is 7084.
Feb 15 11:41:45 managed-node1 python3.12[58015]: ansible-stat Invoked with path=/run/user/2001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:41:45 managed-node1 sudo[58191]:     root : TTY=pts/0 ; PWD=/root ; USER=auth_test_user1 ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjnbbyxfuupldfaimsrqavwhxgzyydep ; XDG_RUNTIME_DIR=/run/user/2001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1739637705.49974-18894-277687363707326/AnsiballZ_podman_container_info.py'
Feb 15 11:41:45 managed-node1 sudo[58191]: pam_unix(sudo:session): session opened for user auth_test_user1(uid=2001) by root(uid=0)
Feb 15 11:41:46 managed-node1 python3.12[58194]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Feb 15 11:41:46 managed-node1 systemd[39720]: Started podman-58195.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 2528.
Feb 15 11:41:46 managed-node1 sudo[58191]: pam_unix(sudo:session): session closed for user auth_test_user1
Feb 15 11:41:46 managed-node1 sudo[58374]:     root : TTY=pts/0 ; PWD=/root ; USER=auth_test_user1 ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvockmzthdksqucgovvyompmkxvxdpnf ; XDG_RUNTIME_DIR=/run/user/2001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1739637706.2257183-18932-156731950470627/AnsiballZ_command.py'
Feb 15 11:41:46 managed-node1 sudo[58374]: pam_unix(sudo:session): session opened for user auth_test_user1(uid=2001) by root(uid=0)
Feb 15 11:41:46 managed-node1 python3.12[58378]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:46 managed-node1 systemd[39720]: Started podman-58379.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 2532.
Feb 15 11:41:46 managed-node1 sudo[58374]: pam_unix(sudo:session): session closed for user auth_test_user1
Feb 15 11:41:46 managed-node1 sudo[58559]:     root : TTY=pts/0 ; PWD=/root ; USER=auth_test_user1 ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsaufjkbhuhofooznneebcjxsfnqaurn ; XDG_RUNTIME_DIR=/run/user/2001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1739637706.7190015-18950-226890045591740/AnsiballZ_command.py'
Feb 15 11:41:46 managed-node1 sudo[58559]: pam_unix(sudo:session): session opened for user auth_test_user1(uid=2001) by root(uid=0)
Feb 15 11:41:47 managed-node1 python3.12[58562]: ansible-ansible.legacy.command Invoked with _raw_params=podman secret ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:47 managed-node1 systemd[39720]: Started podman-58563.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 2536.
Feb 15 11:41:47 managed-node1 sudo[58559]: pam_unix(sudo:session): session closed for user auth_test_user1
Feb 15 11:41:47 managed-node1 python3.12[58702]: ansible-ansible.legacy.command Invoked with removes=/var/lib/systemd/linger/auth_test_user1 _raw_params=loginctl disable-linger auth_test_user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None stdin=None
Feb 15 11:41:47 managed-node1 systemd[1]: Stopping user@2001.service - User Manager for UID 2001...
░░ Subject: A stop job for unit user@2001.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user@2001.service has begun execution.
░░ 
░░ The job identifier is 7092.
Feb 15 11:41:47 managed-node1 systemd[39720]: Activating special unit exit.target...
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopping podman-pause-de828d78.scope...
░░ Subject: A stop job for unit UNIT has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has begun execution.
░░ 
░░ The job identifier is 2547.
Feb 15 11:41:47 managed-node1 systemd[39720]: Removed slice app-podman\x2dkube.slice - Slice /app/podman-kube.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2551 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: app-podman\x2dkube.slice: Consumed 34.303s CPU time, 72.3M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit UNIT completed and consumed the indicated resources.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped target default.target - Main User Target.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2550 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped target basic.target - Basic System.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2552 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped target paths.target - Paths.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2553 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped target sockets.target - Sockets.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2560 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped target timers.target - Timers.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2545 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped grub-boot-success.timer - Mark boot as successful after the user session has run 2 minutes.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2561 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2557 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopping dbus-broker.service - D-Bus User Message Bus...
░░ Subject: A stop job for unit UNIT has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has begun execution.
░░ 
░░ The job identifier is 2548.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped systemd-tmpfiles-setup.service - Create User Files and Directories.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2554 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped podman-pause-de828d78.scope.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2547 and the job result is done.
Feb 15 11:41:47 managed-node1 dbus-broker[39933]: Dispatched 18186 messages @ 5(±11)μs / message.
░░ Subject: Dispatched 18186 messages
░░ Defined-By: dbus-broker
░░ Support: https://groups.google.com/forum/#!forum/bus1-devel
░░ 
░░ This message is printed by dbus-broker when shutting down. It includes metric
░░ information collected during the runtime of dbus-broker.
░░ 
░░ The message lists the number of dispatched messages
░░ (in this case 18186) as well as the mean time to
░░ handling a single message. The time measurements exclude the time spent on
░░ writing to and reading from the kernel.
Feb 15 11:41:47 managed-node1 systemd[39720]: Stopped dbus-broker.service - D-Bus User Message Bus.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2548 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Removed slice session.slice - User Core Session Slice.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2559 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Removed slice user.slice - Slice /user.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2546 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: user.slice: Consumed 7.019s CPU time, 64.5M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit UNIT completed and consumed the indicated resources.
Feb 15 11:41:47 managed-node1 systemd[39720]: Closed dbus.socket - D-Bus User Message Bus Socket.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2556 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: Removed slice app.slice - User Application Slice.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 2555 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[39720]: app.slice: Consumed 34.331s CPU time, 72.4M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit UNIT completed and consumed the indicated resources.
Feb 15 11:41:47 managed-node1 systemd[39720]: Reached target shutdown.target - Shutdown.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 2543.
Feb 15 11:41:47 managed-node1 systemd[39720]: Finished systemd-exit.service - Exit the Session.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 2541.
Feb 15 11:41:47 managed-node1 systemd[39720]: Reached target exit.target - Exit the Session.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 2540.
Feb 15 11:41:47 managed-node1 systemd[1]: user@2001.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit user@2001.service has successfully entered the 'dead' state.
Feb 15 11:41:47 managed-node1 systemd[1]: Stopped user@2001.service - User Manager for UID 2001.
░░ Subject: A stop job for unit user@2001.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user@2001.service has finished.
░░ 
░░ The job identifier is 7092 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[1]: user@2001.service: Consumed 44.463s CPU time, 93.8M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit user@2001.service completed and consumed the indicated resources.
Feb 15 11:41:47 managed-node1 systemd[1]: Stopping user-runtime-dir@2001.service - User Runtime Directory /run/user/2001...
░░ Subject: A stop job for unit user-runtime-dir@2001.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user-runtime-dir@2001.service has begun execution.
░░ 
░░ The job identifier is 7091.
Feb 15 11:41:47 managed-node1 systemd[1]: run-user-2001.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit run-user-2001.mount has successfully entered the 'dead' state.
Feb 15 11:41:47 managed-node1 systemd[1]: user-runtime-dir@2001.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit user-runtime-dir@2001.service has successfully entered the 'dead' state.
Feb 15 11:41:47 managed-node1 systemd[1]: Stopped user-runtime-dir@2001.service - User Runtime Directory /run/user/2001.
░░ Subject: A stop job for unit user-runtime-dir@2001.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user-runtime-dir@2001.service has finished.
░░ 
░░ The job identifier is 7091 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[1]: Removed slice user-2001.slice - User Slice of UID 2001.
░░ Subject: A stop job for unit user-2001.slice has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user-2001.slice has finished.
░░ 
░░ The job identifier is 7093 and the job result is done.
Feb 15 11:41:47 managed-node1 systemd[1]: user-2001.slice: Consumed 44.491s CPU time, 93.8M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit user-2001.slice completed and consumed the indicated resources.
Feb 15 11:41:47 managed-node1 systemd-logind[654]: Removed session 7.
░░ Subject: Session 7 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A session with the ID 7 has been terminated.
Feb 15 11:41:48 managed-node1 python3.12[58839]: ansible-ansible.legacy.command Invoked with _raw_params=loginctl show-user --value -p State auth_test_user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:48 managed-node1 python3.12[58971]: ansible-ansible.legacy.systemd Invoked with name=systemd-logind state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 15 11:41:48 managed-node1 systemd[1]: Starting dnf-makecache.service - dnf makecache...
░░ Subject: A start job for unit dnf-makecache.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit dnf-makecache.service has begun execution.
░░ 
░░ The job identifier is 7095.
Feb 15 11:41:48 managed-node1 systemd[1]: Stopping systemd-logind.service - User Login Management...
░░ Subject: A stop job for unit systemd-logind.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit systemd-logind.service has begun execution.
░░ 
░░ The job identifier is 7173.
Feb 15 11:41:48 managed-node1 systemd[1]: systemd-logind.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit systemd-logind.service has successfully entered the 'dead' state.
Feb 15 11:41:48 managed-node1 systemd[1]: Stopped systemd-logind.service - User Login Management.
░░ Subject: A stop job for unit systemd-logind.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit systemd-logind.service has finished.
░░ 
░░ The job identifier is 7173 and the job result is done.
Feb 15 11:41:49 managed-node1 dnf[58973]: Beaker Client - RedHatEnterpriseLinux9           13 kB/s | 1.5 kB     00:00
Feb 15 11:41:49 managed-node1 dnf[58973]: Beaker harness                                   87 kB/s | 1.3 kB     00:00
Feb 15 11:41:49 managed-node1 dnf[58973]: Copr repo for beakerlib-libraries owned by bgon  50 kB/s | 1.8 kB     00:00
Feb 15 11:41:49 managed-node1 dnf[58973]: CentOS Stream 10 - BaseOS                        66 kB/s | 6.1 kB     00:00
Feb 15 11:41:49 managed-node1 dnf[58973]: CentOS Stream 10 - AppStream                    128 kB/s | 6.2 kB     00:00
Feb 15 11:41:49 managed-node1 python3.12[59122]: ansible-ansible.legacy.command Invoked with _raw_params=loginctl show-user --value -p State auth_test_user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:49 managed-node1 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
░░ Subject: A start job for unit modprobe@drm.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit modprobe@drm.service has begun execution.
░░ 
░░ The job identifier is 7253.
Feb 15 11:41:49 managed-node1 systemd[1]: modprobe@drm.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit modprobe@drm.service has successfully entered the 'dead' state.
Feb 15 11:41:49 managed-node1 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
░░ Subject: A start job for unit modprobe@drm.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit modprobe@drm.service has finished successfully.
░░ 
░░ The job identifier is 7253.
Feb 15 11:41:49 managed-node1 systemd[1]: Starting systemd-logind.service - User Login Management...
░░ Subject: A start job for unit systemd-logind.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit systemd-logind.service has begun execution.
░░ 
░░ The job identifier is 7174.
Feb 15 11:41:49 managed-node1 systemd-logind[59128]: New seat seat0.
░░ Subject: A new seat seat0 is now available
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new seat seat0 has been configured and is now available.
Feb 15 11:41:49 managed-node1 systemd-logind[59128]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 15 11:41:49 managed-node1 systemd-logind[59128]: Watching system buttons on /dev/input/event1 (Sleep Button)
Feb 15 11:41:49 managed-node1 systemd-logind[59128]: Watching system buttons on /dev/input/event2 (AT Translated Set 2 keyboard)
Feb 15 11:41:49 managed-node1 systemd[1]: Started systemd-logind.service - User Login Management.
░░ Subject: A start job for unit systemd-logind.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit systemd-logind.service has finished successfully.
░░ 
░░ The job identifier is 7174.
Feb 15 11:41:49 managed-node1 dnf[58973]: CentOS Stream 10 - HighAvailability              30 kB/s | 6.4 kB     00:00
Feb 15 11:41:49 managed-node1 dnf[58973]: CentOS Stream 10 - Extras packages              253 kB/s | 6.6 kB     00:00
Feb 15 11:41:49 managed-node1 dnf[58973]: Metadata cache created.
Feb 15 11:41:49 managed-node1 systemd[1]: dnf-makecache.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit dnf-makecache.service has successfully entered the 'dead' state.
Feb 15 11:41:49 managed-node1 systemd[1]: Finished dnf-makecache.service - dnf makecache.
░░ Subject: A start job for unit dnf-makecache.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit dnf-makecache.service has finished successfully.
░░ 
░░ The job identifier is 7095.
Feb 15 11:41:50 managed-node1 python3.12[59268]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:41:51 managed-node1 python3.12[59401]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids auth_test_user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:51 managed-node1 python3.12[59533]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g auth_test_user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:54 managed-node1 python3.12[59927]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:41:54 managed-node1 python3.12[60060]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids auth_test_user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:55 managed-node1 python3.12[60192]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g auth_test_user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:41:58 managed-node1 python3.12[60979]: ansible-user Invoked with name=auth_test_user1 state=absent non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on managed-node1 update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 15 11:41:58 managed-node1 userdel[60981]: delete user 'auth_test_user1'
Feb 15 11:41:58 managed-node1 userdel[60981]: removed group 'auth_test_user1' owned by 'auth_test_user1'
Feb 15 11:41:58 managed-node1 userdel[60981]: removed shadow group 'auth_test_user1' owned by 'auth_test_user1'
Feb 15 11:41:58 managed-node1 python3.12[61112]: ansible-file Invoked with path=/home/auth_test_user1 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:41:59 managed-node1 systemd[1]: Stopping session-3.scope - Session 3 of User root...
░░ Subject: A stop job for unit session-3.scope has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit session-3.scope has begun execution.
░░ 
░░ The job identifier is 7336.
Feb 15 11:41:59 managed-node1 systemd[1]: Stopping session-6.scope - Session 6 of User root...
░░ Subject: A stop job for unit session-6.scope has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit session-6.scope has begun execution.
░░ 
░░ The job identifier is 7337.
Feb 15 11:41:59 managed-node1 sshd-session[4430]: error: mm_reap: preauth child terminated by signal 15
Feb 15 11:41:59 managed-node1 sshd-session[6626]: error: mm_reap: preauth child terminated by signal 15
Feb 15 11:41:59 managed-node1 sshd-session[6626]: pam_systemd(sshd:session): Failed to release session: No session '6' known
Feb 15 11:41:59 managed-node1 sshd-session[6626]: pam_unix(sshd:session): session closed for user root
Feb 15 11:41:59 managed-node1 sshd-session[4430]: pam_systemd(sshd:session): Failed to release session: No session '3' known
Feb 15 11:41:59 managed-node1 sshd-session[4430]: pam_unix(sshd:session): session closed for user root
Feb 15 11:41:59 managed-node1 systemd[1]: session-6.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-6.scope has successfully entered the 'dead' state.
Feb 15 11:41:59 managed-node1 systemd[1]: Stopped session-6.scope - Session 6 of User root.
░░ Subject: A stop job for unit session-6.scope has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit session-6.scope has finished.
░░ 
░░ The job identifier is 7337 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[1]: session-6.scope: Consumed 3min 22.390s CPU time, 437.7M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-6.scope completed and consumed the indicated resources.
Feb 15 11:41:59 managed-node1 systemd[1]: session-3.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-3.scope has successfully entered the 'dead' state.
Feb 15 11:41:59 managed-node1 systemd[1]: Stopped session-3.scope - Session 3 of User root.
░░ Subject: A stop job for unit session-3.scope has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit session-3.scope has finished.
░░ 
░░ The job identifier is 7336 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[1]: session-3.scope: Consumed 3.185s CPU time, 86.7M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-3.scope completed and consumed the indicated resources.
Feb 15 11:41:59 managed-node1 systemd[1]: Stopping user@0.service - User Manager for UID 0...
░░ Subject: A stop job for unit user@0.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user@0.service has begun execution.
░░ 
░░ The job identifier is 7338.
Feb 15 11:41:59 managed-node1 systemd[4438]: Activating special unit exit.target...
Feb 15 11:41:59 managed-node1 systemd[4438]: Removed slice background.slice - User Background Tasks Slice.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 27 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Stopped target default.target - Main User Target.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 31 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Stopped target basic.target - Basic System.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 35 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Stopped target paths.target - Paths.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 24 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Stopped target sockets.target - Sockets.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 21 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Stopped target timers.target - Timers.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 29 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Stopped systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 33 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Closed dbus.socket - D-Bus User Message Bus Socket.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 25 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Stopped systemd-tmpfiles-setup.service - Create User Files and Directories.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 26 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Removed slice app.slice - User Application Slice.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit UNIT has finished.
░░ 
░░ The job identifier is 34 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[4438]: Reached target shutdown.target - Shutdown.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 20.
Feb 15 11:41:59 managed-node1 systemd[4438]: Finished systemd-exit.service - Exit the Session.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 18.
Feb 15 11:41:59 managed-node1 systemd[4438]: Reached target exit.target - Exit the Session.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 17.
Feb 15 11:41:59 managed-node1 systemd[1]: user@0.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit user@0.service has successfully entered the 'dead' state.
Feb 15 11:41:59 managed-node1 systemd[1]: Stopped user@0.service - User Manager for UID 0.
░░ Subject: A stop job for unit user@0.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user@0.service has finished.
░░ 
░░ The job identifier is 7338 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[1]: Stopping user-runtime-dir@0.service - User Runtime Directory /run/user/0...
░░ Subject: A stop job for unit user-runtime-dir@0.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user-runtime-dir@0.service has begun execution.
░░ 
░░ The job identifier is 7335.
Feb 15 11:41:59 managed-node1 systemd[1]: run-user-0.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit run-user-0.mount has successfully entered the 'dead' state.
Feb 15 11:41:59 managed-node1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit user-runtime-dir@0.service has successfully entered the 'dead' state.
Feb 15 11:41:59 managed-node1 systemd[1]: Stopped user-runtime-dir@0.service - User Runtime Directory /run/user/0.
░░ Subject: A stop job for unit user-runtime-dir@0.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user-runtime-dir@0.service has finished.
░░ 
░░ The job identifier is 7335 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[1]: Removed slice user-0.slice - User Slice of UID 0.
░░ Subject: A stop job for unit user-0.slice has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit user-0.slice has finished.
░░ 
░░ The job identifier is 7339 and the job result is done.
Feb 15 11:41:59 managed-node1 systemd[1]: user-0.slice: Consumed 3min 25.934s CPU time, 503.4M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit user-0.slice completed and consumed the indicated resources.
Feb 15 11:41:59 managed-node1 sshd-session[61227]: Accepted publickey for root from 10.31.42.96 port 49556 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Feb 15 11:41:59 managed-node1 systemd[1]: Created slice user-0.slice - User Slice of UID 0.
░░ Subject: A start job for unit user-0.slice has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit user-0.slice has finished successfully.
░░ 
░░ The job identifier is 7419.
Feb 15 11:41:59 managed-node1 systemd[1]: Starting user-runtime-dir@0.service - User Runtime Directory /run/user/0...
░░ Subject: A start job for unit user-runtime-dir@0.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit user-runtime-dir@0.service has begun execution.
░░ 
░░ The job identifier is 7341.
Feb 15 11:41:59 managed-node1 systemd-logind[59128]: New session 8 of user root.
░░ Subject: A new session 8 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new session with the ID 8 has been created for the user root.
░░ 
░░ The leading process of the session is 61227.
Feb 15 11:41:59 managed-node1 systemd[1]: Finished user-runtime-dir@0.service - User Runtime Directory /run/user/0.
░░ Subject: A start job for unit user-runtime-dir@0.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit user-runtime-dir@0.service has finished successfully.
░░ 
░░ The job identifier is 7341.
Feb 15 11:41:59 managed-node1 systemd[1]: Starting user@0.service - User Manager for UID 0...
░░ Subject: A start job for unit user@0.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit user@0.service has begun execution.
░░ 
░░ The job identifier is 7421.
Feb 15 11:41:59 managed-node1 systemd-logind[59128]: New session 9 of user root.
░░ Subject: A new session 9 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new session with the ID 9 has been created for the user root.
░░ 
░░ The leading process of the session is 61233.
Feb 15 11:41:59 managed-node1 (systemd)[61233]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 15 11:41:59 managed-node1 systemd[61233]: Queued start job for default target default.target.
Feb 15 11:41:59 managed-node1 systemd[61233]: Created slice app.slice - User Application Slice.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 9.
Feb 15 11:41:59 managed-node1 systemd[61233]: grub-boot-success.timer - Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 6.
Feb 15 11:41:59 managed-node1 systemd[61233]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 7.
Feb 15 11:41:59 managed-node1 systemd[61233]: Reached target paths.target - Paths.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 3.
Feb 15 11:41:59 managed-node1 systemd[61233]: Reached target timers.target - Timers.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 5.
Feb 15 11:41:59 managed-node1 systemd[61233]: Starting dbus.socket - D-Bus User Message Bus Socket...
░░ Subject: A start job for unit UNIT has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has begun execution.
░░ 
░░ The job identifier is 12.
Feb 15 11:41:59 managed-node1 systemd[61233]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories...
░░ Subject: A start job for unit UNIT has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has begun execution.
░░ 
░░ The job identifier is 8.
Feb 15 11:41:59 managed-node1 systemd[61233]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 8.
Feb 15 11:41:59 managed-node1 systemd[61233]: Listening on dbus.socket - D-Bus User Message Bus Socket.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 12.
Feb 15 11:41:59 managed-node1 systemd[61233]: Reached target sockets.target - Sockets.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 11.
Feb 15 11:41:59 managed-node1 systemd[61233]: Reached target basic.target - Basic System.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 2.
Feb 15 11:41:59 managed-node1 systemd[61233]: Reached target default.target - Main User Target.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit UNIT has finished successfully.
░░ 
░░ The job identifier is 1.
Feb 15 11:41:59 managed-node1 systemd[1]: Started user@0.service - User Manager for UID 0.
░░ Subject: A start job for unit user@0.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit user@0.service has finished successfully.
░░ 
░░ The job identifier is 7421.
Feb 15 11:41:59 managed-node1 systemd[61233]: Startup finished in 110ms.
░░ Subject: User manager start-up is now complete
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The user manager instance for user 0 has been started. All services queued
░░ for starting have been started. Note that other services might still be starting
░░ up or be started at any later time.
░░ 
░░ Startup of the manager took 110511 microseconds.
Feb 15 11:41:59 managed-node1 systemd[1]: Started session-8.scope - Session 8 of User root.
░░ Subject: A start job for unit session-8.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit session-8.scope has finished successfully.
░░ 
░░ The job identifier is 7502.
Feb 15 11:41:59 managed-node1 sshd-session[61227]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Feb 15 11:42:03 managed-node1 python3.12[61423]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 15 11:42:04 managed-node1 python3.12[61592]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:05 managed-node1 python3.12[61723]: ansible-ansible.legacy.dnf Invoked with name=['python3-pyasn1', 'python3-cryptography', 'python3-dbus'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 15 11:42:06 managed-node1 python3.12[61855]: ansible-ansible.legacy.dnf Invoked with name=['certmonger', 'python3-packaging'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 15 11:42:07 managed-node1 python3.12[61987]: ansible-file Invoked with name=/etc/certmonger//pre-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//pre-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:07 managed-node1 python3.12[62118]: ansible-file Invoked with name=/etc/certmonger//post-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//post-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:08 managed-node1 python3.12[62249]: ansible-ansible.legacy.systemd Invoked with name=certmonger state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 15 11:42:09 managed-node1 python3.12[62382]: ansible-fedora.linux_system_roles.certificate_request Invoked with name=quadlet_demo dns=['localhost'] directory=/etc/pki/tls wait=True ca=self-sign __header=#
                                                 # Ansible managed
                                                 #
                                                 # system_role:certificate
                                                  provider_config_directory=/etc/certmonger provider=certmonger key_usage=['digitalSignature', 'keyEncipherment'] extended_key_usage=['id-kp-serverAuth', 'id-kp-clientAuth'] auto_renew=True ip=None email=None common_name=None country=None state=None locality=None organization=None organizational_unit=None contact_email=None key_size=None owner=None group=None mode=None principal=None run_before=None run_after=None
Feb 15 11:42:09 managed-node1 certmonger[10295]: 2025-02-15 11:42:09 [10295] Wrote to /var/lib/certmonger/requests/20250215164209
Feb 15 11:42:09 managed-node1 certmonger[62397]: Certificate in file "/etc/pki/tls/certs/quadlet_demo.crt" issued by CA and saved.
Feb 15 11:42:09 managed-node1 certmonger[10295]: 2025-02-15 11:42:09 [10295] Wrote to /var/lib/certmonger/requests/20250215164209
Feb 15 11:42:10 managed-node1 python3.12[62528]: ansible-slurp Invoked with path=/etc/pki/tls/certs/quadlet_demo.crt src=/etc/pki/tls/certs/quadlet_demo.crt
Feb 15 11:42:10 managed-node1 python3.12[62659]: ansible-slurp Invoked with path=/etc/pki/tls/private/quadlet_demo.key src=/etc/pki/tls/private/quadlet_demo.key
Feb 15 11:42:10 managed-node1 python3.12[62790]: ansible-slurp Invoked with path=/etc/pki/tls/certs/quadlet_demo.crt src=/etc/pki/tls/certs/quadlet_demo.crt
Feb 15 11:42:11 managed-node1 python3.12[62921]: ansible-ansible.legacy.command Invoked with _raw_params=getcert stop-tracking -f /etc/pki/tls/certs/quadlet_demo.crt _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:42:11 managed-node1 certmonger[10295]: 2025-02-15 11:42:11 [10295] Wrote to /var/lib/certmonger/requests/20250215164209
Feb 15 11:42:12 managed-node1 python3.12[63053]: ansible-file Invoked with path=/etc/pki/tls/certs/quadlet_demo.crt state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:12 managed-node1 python3.12[63184]: ansible-file Invoked with path=/etc/pki/tls/private/quadlet_demo.key state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:12 managed-node1 python3.12[63315]: ansible-file Invoked with path=/etc/pki/tls/certs/quadlet_demo.crt state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:13 managed-node1 python3.12[63446]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:13 managed-node1 python3.12[63577]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:15 managed-node1 python3.12[63839]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:42:16 managed-node1 python3.12[63976]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Feb 15 11:42:16 managed-node1 python3.12[64108]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:18 managed-node1 python3.12[64241]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:19 managed-node1 python3.12[64372]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:19 managed-node1 python3.12[64503]: ansible-ansible.legacy.dnf Invoked with name=['firewalld'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 15 11:42:20 managed-node1 python3.12[64635]: ansible-systemd Invoked with name=firewalld masked=False daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Feb 15 11:42:21 managed-node1 python3.12[64768]: ansible-ansible.legacy.systemd Invoked with name=firewalld state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 15 11:42:21 managed-node1 systemd[1]: Reload requested from client PID 64771 ('systemctl') (unit session-8.scope)...
Feb 15 11:42:21 managed-node1 systemd[1]: Reloading...
Feb 15 11:42:21 managed-node1 systemd-rc-local-generator[64815]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:42:21 managed-node1 systemd[1]: Reloading finished in 205 ms.
Feb 15 11:42:21 managed-node1 systemd[1]: Starting firewalld.service - firewalld - dynamic firewall daemon...
░░ Subject: A start job for unit firewalld.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit firewalld.service has begun execution.
░░ 
░░ The job identifier is 7584.
Feb 15 11:42:21 managed-node1 systemd[1]: Started firewalld.service - firewalld - dynamic firewall daemon.
░░ Subject: A start job for unit firewalld.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit firewalld.service has finished successfully.
░░ 
░░ The job identifier is 7584.
Feb 15 11:42:21 managed-node1 kernel: Warning: Unmaintained driver is detected: ip_set
Feb 15 11:42:22 managed-node1 systemd[1]: Starting polkit.service - Authorization Manager...
░░ Subject: A start job for unit polkit.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit polkit.service has begun execution.
░░ 
░░ The job identifier is 7667.
Feb 15 11:42:22 managed-node1 polkitd[64921]: Started polkitd version 125
Feb 15 11:42:22 managed-node1 systemd[1]: Started polkit.service - Authorization Manager.
░░ Subject: A start job for unit polkit.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit polkit.service has finished successfully.
░░ 
░░ The job identifier is 7667.
Feb 15 11:42:22 managed-node1 python3.12[64994]: ansible-fedora.linux_system_roles.firewall_lib Invoked with port=['8000/tcp'] permanent=True runtime=True state=enabled __report_changed=True service=[] source_port=[] forward_port=[] rich_rule=[] source=[] interface=[] interface_pci_id=[] icmp_block=[] timeout=0 ipset_entries=[] protocol=[] helper_module=[] destination=[] firewalld_conf=None masquerade=None icmp_block_inversion=None target=None zone=None set_default_zone=None ipset=None ipset_type=None description=None short=None
Feb 15 11:42:23 managed-node1 python3.12[65125]: ansible-fedora.linux_system_roles.firewall_lib Invoked with port=['9000/tcp'] permanent=True runtime=True state=enabled __report_changed=True service=[] source_port=[] forward_port=[] rich_rule=[] source=[] interface=[] interface_pci_id=[] icmp_block=[] timeout=0 ipset_entries=[] protocol=[] helper_module=[] destination=[] firewalld_conf=None masquerade=None icmp_block_inversion=None target=None zone=None set_default_zone=None ipset=None ipset_type=None description=None short=None
Feb 15 11:42:23 managed-node1 rsyslogd[904]: imjournal: journal files changed, reloading...  [v8.2412.0-1.el10 try https://www.rsyslog.com/e/0 ]
Feb 15 11:42:23 managed-node1 rsyslogd[904]: imjournal: journal files changed, reloading...  [v8.2412.0-1.el10 try https://www.rsyslog.com/e/0 ]
Feb 15 11:42:30 managed-node1 python3.12[65735]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:31 managed-node1 python3.12[65868]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:31 managed-node1 python3.12[65999]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/quadlet-demo.network follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Feb 15 11:42:32 managed-node1 python3.12[66104]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1739637751.501507-21049-169973974175633/.source.network dest=/etc/containers/systemd/quadlet-demo.network owner=root group=0 mode=0644 _original_basename=quadlet-demo.network follow=False checksum=e57c08d49aff4bae8daab138d913aeddaa8682a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:32 managed-node1 python3.12[66235]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:42:32 managed-node1 systemd[1]: Reload requested from client PID 66236 ('systemctl') (unit session-8.scope)...
Feb 15 11:42:32 managed-node1 systemd[1]: Reloading...
Feb 15 11:42:33 managed-node1 systemd-rc-local-generator[66280]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:42:33 managed-node1 systemd[1]: Reloading finished in 206 ms.
Feb 15 11:42:33 managed-node1 python3.12[66420]: ansible-systemd Invoked with name=quadlet-demo-network.service scope=system state=started daemon_reload=False daemon_reexec=False no_block=False enabled=None force=None masked=None
Feb 15 11:42:33 managed-node1 systemd[1]: Starting quadlet-demo-network.service...
░░ Subject: A start job for unit quadlet-demo-network.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit quadlet-demo-network.service has begun execution.
░░ 
░░ The job identifier is 7746.
Feb 15 11:42:33 managed-node1 quadlet-demo-network[66424]: systemd-quadlet-demo
Feb 15 11:42:33 managed-node1 systemd[1]: Finished quadlet-demo-network.service.
░░ Subject: A start job for unit quadlet-demo-network.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit quadlet-demo-network.service has finished successfully.
░░ 
░░ The job identifier is 7746.
Feb 15 11:42:34 managed-node1 python3.12[66562]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:36 managed-node1 python3.12[66695]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:36 managed-node1 python3.12[66826]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/quadlet-demo-mysql.volume follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Feb 15 11:42:37 managed-node1 python3.12[66931]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1739637756.4084275-21235-278404765771819/.source.volume dest=/etc/containers/systemd/quadlet-demo-mysql.volume owner=root group=0 mode=0644 _original_basename=quadlet-demo-mysql.volume follow=False checksum=585f8cbdf0ec73000f9227dcffbef71e9552ea4a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:37 managed-node1 python3.12[67062]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:42:37 managed-node1 systemd[1]: Reload requested from client PID 67063 ('systemctl') (unit session-8.scope)...
Feb 15 11:42:37 managed-node1 systemd[1]: Reloading...
Feb 15 11:42:37 managed-node1 systemd-rc-local-generator[67107]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:42:37 managed-node1 systemd[1]: Reloading finished in 202 ms.
Feb 15 11:42:38 managed-node1 python3.12[67247]: ansible-systemd Invoked with name=quadlet-demo-mysql-volume.service scope=system state=started daemon_reload=False daemon_reexec=False no_block=False enabled=None force=None masked=None
Feb 15 11:42:38 managed-node1 systemd[1]: Starting quadlet-demo-mysql-volume.service...
░░ Subject: A start job for unit quadlet-demo-mysql-volume.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit quadlet-demo-mysql-volume.service has begun execution.
░░ 
░░ The job identifier is 7830.
Feb 15 11:42:38 managed-node1 podman[67251]: 2025-02-15 11:42:38.435732006 -0500 EST m=+0.026388714 volume create systemd-quadlet-demo-mysql
Feb 15 11:42:38 managed-node1 quadlet-demo-mysql-volume[67251]: systemd-quadlet-demo-mysql
Feb 15 11:42:38 managed-node1 systemd[1]: Finished quadlet-demo-mysql-volume.service.
░░ Subject: A start job for unit quadlet-demo-mysql-volume.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit quadlet-demo-mysql-volume.service has finished successfully.
░░ 
░░ The job identifier is 7830.
Feb 15 11:42:39 managed-node1 python3.12[67390]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:41 managed-node1 python3.12[67523]: ansible-file Invoked with path=/tmp/quadlet_demo state=directory owner=root group=root mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:47 managed-node1 podman[67662]: 2025-02-15 11:42:47.799636359 -0500 EST m=+5.724936962 image pull dd3b2a5dcb48ff61113592ed5ddd762581be4387c7bc552375a2159422aa6bf5 quay.io/linux-system-roles/mysql:5.6
Feb 15 11:42:48 managed-node1 python3.12[67990]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:48 managed-node1 python3.12[68121]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/quadlet-demo-mysql.container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Feb 15 11:42:48 managed-node1 python3.12[68226]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/systemd/quadlet-demo-mysql.container owner=root group=0 mode=0644 src=/root/.ansible/tmp/ansible-tmp-1739637768.4011524-21676-87794506354656/.source.container _original_basename=.9qwbcsja follow=False checksum=ca62b2ad3cc9afb5b5371ebbf797b9bc4fd7edd4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:49 managed-node1 python3.12[68357]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:42:49 managed-node1 systemd[1]: Reload requested from client PID 68358 ('systemctl') (unit session-8.scope)...
Feb 15 11:42:49 managed-node1 systemd[1]: Reloading...
Feb 15 11:42:49 managed-node1 systemd-rc-local-generator[68404]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:42:49 managed-node1 systemd[1]: Reloading finished in 206 ms.
Feb 15 11:42:50 managed-node1 python3.12[68542]: ansible-systemd Invoked with name=quadlet-demo-mysql.service scope=system state=started daemon_reload=False daemon_reexec=False no_block=False enabled=None force=None masked=None
Feb 15 11:42:50 managed-node1 systemd[1]: Starting quadlet-demo-mysql.service...
░░ Subject: A start job for unit quadlet-demo-mysql.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit quadlet-demo-mysql.service has begun execution.
░░ 
░░ The job identifier is 7914.
Feb 15 11:42:50 managed-node1 podman[68546]: 2025-02-15 11:42:50.4098477 -0500 EST m=+0.043914783 container create 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2 (image=quay.io/linux-system-roles/mysql:5.6, name=quadlet-demo-mysql, PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service)
Feb 15 11:42:50 managed-node1 kernel: podman2: port 1(veth1) entered blocking state
Feb 15 11:42:50 managed-node1 kernel: podman2: port 1(veth1) entered disabled state
Feb 15 11:42:50 managed-node1 kernel: veth1: entered allmulticast mode
Feb 15 11:42:50 managed-node1 kernel: veth1: entered promiscuous mode
Feb 15 11:42:50 managed-node1 kernel: podman2: port 1(veth1) entered blocking state
Feb 15 11:42:50 managed-node1 kernel: podman2: port 1(veth1) entered forwarding state
Feb 15 11:42:50 managed-node1 (udev-worker)[68557]: Network interface NamePolicy= disabled on kernel command line.
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4418] device (podman2): carrier: link connected
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4422] manager: (podman2): new Bridge device (/org/freedesktop/NetworkManager/Devices/9)
Feb 15 11:42:50 managed-node1 (udev-worker)[68558]: Network interface NamePolicy= disabled on kernel command line.
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4441] device (veth1): carrier: link connected
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4447] manager: (veth1): new Veth device (/org/freedesktop/NetworkManager/Devices/10)
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4758] device (podman2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4762] device (podman2): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4768] device (podman2): Activation: starting connection 'podman2' (6e662d28-97a3-423f-9e98-30158975289f)
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4769] device (podman2): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4771] device (podman2): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4781] device (podman2): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.4786] device (podman2): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 15 11:42:50 managed-node1 podman[68546]: 2025-02-15 11:42:50.390614529 -0500 EST m=+0.024681845 image pull dd3b2a5dcb48ff61113592ed5ddd762581be4387c7bc552375a2159422aa6bf5 quay.io/linux-system-roles/mysql:5.6
Feb 15 11:42:50 managed-node1 systemd[1]: Starting NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service...
░░ Subject: A start job for unit NetworkManager-dispatcher.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit NetworkManager-dispatcher.service has begun execution.
░░ 
░░ The job identifier is 8000.
Feb 15 11:42:50 managed-node1 systemd[1]: Started NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service.
░░ Subject: A start job for unit NetworkManager-dispatcher.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit NetworkManager-dispatcher.service has finished successfully.
░░ 
░░ The job identifier is 8000.
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.5114] device (podman2): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.5116] device (podman2): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 15 11:42:50 managed-node1 NetworkManager[733]: <info>  [1739637770.5121] device (podman2): Activation: successful, device activated.
Feb 15 11:42:50 managed-node1 systemd[1]: Started run-p68595-i68895.scope - [systemd-run] /usr/libexec/podman/aardvark-dns --config /run/containers/networks/aardvark-dns -p 53 run.
░░ Subject: A start job for unit run-p68595-i68895.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit run-p68595-i68895.scope has finished successfully.
░░ 
░░ The job identifier is 8079.
Feb 15 11:42:50 managed-node1 systemd[1]: Started 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.timer - [systemd-run] /usr/bin/podman healthcheck run 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2.
░░ Subject: A start job for unit 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.timer has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.timer has finished successfully.
░░ 
░░ The job identifier is 8085.
Feb 15 11:42:50 managed-node1 podman[68546]: 2025-02-15 11:42:50.61097696 -0500 EST m=+0.245044089 container init 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2 (image=quay.io/linux-system-roles/mysql:5.6, name=quadlet-demo-mysql, PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service)
Feb 15 11:42:50 managed-node1 systemd[1]: Started quadlet-demo-mysql.service.
░░ Subject: A start job for unit quadlet-demo-mysql.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit quadlet-demo-mysql.service has finished successfully.
░░ 
░░ The job identifier is 7914.
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50+00:00 [Note] [Entrypoint]: Entrypoint script for MySQL Server 5.6.51-1debian9 started.
Feb 15 11:42:50 managed-node1 podman[68546]: 2025-02-15 11:42:50.6385512 -0500 EST m=+0.272618417 container start 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2 (image=quay.io/linux-system-roles/mysql:5.6, name=quadlet-demo-mysql, PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service)
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68546]: 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50+00:00 [Note] [Entrypoint]: Switching to dedicated user 'mysql'
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50+00:00 [Note] [Entrypoint]: Entrypoint script for MySQL Server 5.6.51-1debian9 started.
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50+00:00 [Note] [Entrypoint]: Initializing database files
Feb 15 11:42:50 managed-node1 podman[68611]: 2025-02-15 11:42:50.806422421 -0500 EST m=+0.145509769 container health_status 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2 (image=quay.io/linux-system-roles/mysql:5.6, name=quadlet-demo-mysql, health_status=healthy, health_failing_streak=0, health_log=, PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service)
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 0 [Warning] TIMESTAMP with implicit DEFAULT value is deprecated. Please use --explicit_defaults_for_timestamp server option (see documentation for more details).
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 0 [Note] Ignoring --secure-file-priv value as server is running with --bootstrap.
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 0 [Note] /usr/sbin/mysqld (mysqld 5.6.51) starting as process 42 ...
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Using atomics to ref count buffer pool pages
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: The InnoDB memory heap is disabled
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Memory barrier is not used
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Compressed tables use zlib 1.2.11
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Using Linux native AIO
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Using CPU crc32 instructions
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Initializing buffer pool, size = 128.0M
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Completed initialization of buffer pool
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: The first specified data file ./ibdata1 did not exist: a new database to be created!
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Setting file ./ibdata1 size to 12 MB
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Database physically writes the file full: wait...
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Setting log file ./ib_logfile101 size to 48 MB
Feb 15 11:42:50 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:50 42 [Note] InnoDB: Setting log file ./ib_logfile1 size to 48 MB
Feb 15 11:42:51 managed-node1 python3.12[68804]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: Renaming log file ./ib_logfile101 to ./ib_logfile0
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Warning] InnoDB: New log files created, LSN=45781
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: Doublewrite buffer not found: creating new
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: Doublewrite buffer created
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: 128 rollback segment(s) are active.
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Warning] InnoDB: Creating foreign key constraint system tables.
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: Foreign key constraint system tables created
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: Creating tablespace and datafile system tables.
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: Tablespace and datafile system tables created.
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: 5.6.51 started; log sequence number 0
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] RSA private key file not found: /var/lib/mysql//private_key.pem. Some authentication plugins will not work.
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] RSA public key file not found: /var/lib/mysql//public_key.pem. Some authentication plugins will not work.
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] Binlog end
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: FTS optimize thread exiting.
Feb 15 11:42:51 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:51 42 [Note] InnoDB: Starting shutdown...
Feb 15 11:42:53 managed-node1 python3.12[68948]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 42 [Note] InnoDB: Shutdown completed; log sequence number 1625977
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 0 [Warning] TIMESTAMP with implicit DEFAULT value is deprecated. Please use --explicit_defaults_for_timestamp server option (see documentation for more details).
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 0 [Note] Ignoring --secure-file-priv value as server is running with --bootstrap.
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 0 [Note] /usr/sbin/mysqld (mysqld 5.6.51) starting as process 65 ...
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Using atomics to ref count buffer pool pages
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: The InnoDB memory heap is disabled
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Memory barrier is not used
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Compressed tables use zlib 1.2.11
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Using Linux native AIO
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Using CPU crc32 instructions
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Initializing buffer pool, size = 128.0M
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Completed initialization of buffer pool
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Highest supported file format is Barracuda.
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: 128 rollback segment(s) are active.
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Waiting for purge to start
Feb 15 11:42:53 managed-node1 python3.12[69079]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/envoy-proxy-configmap.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: 5.6.51 started; log sequence number 1625977
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] RSA private key file not found: /var/lib/mysql//private_key.pem. Some authentication plugins will not work.
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] RSA public key file not found: /var/lib/mysql//public_key.pem. Some authentication plugins will not work.
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] Binlog end
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: FTS optimize thread exiting.
Feb 15 11:42:53 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:53 65 [Note] InnoDB: Starting shutdown...
Feb 15 11:42:54 managed-node1 python3.12[69208]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1739637773.3253858-21862-27786019115190/.source.yml dest=/etc/containers/systemd/envoy-proxy-configmap.yml owner=root group=0 mode=0644 _original_basename=envoy-proxy-configmap.yml follow=False checksum=d681c7d56f912150d041873e880818b22a90c188 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:54 managed-node1 python3.12[69339]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:42:54 managed-node1 systemd[1]: Reload requested from client PID 69340 ('systemctl') (unit session-8.scope)...
Feb 15 11:42:54 managed-node1 systemd[1]: Reloading...
Feb 15 11:42:54 managed-node1 systemd-rc-local-generator[69382]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:42:54 managed-node1 systemd[1]: Reloading finished in 218 ms.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 65 [Note] InnoDB: Shutdown completed; log sequence number 1625987
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: PLEASE REMEMBER TO SET A PASSWORD FOR THE MySQL root USER !
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: To do so, start the server, then issue the following commands:
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]:   /usr/bin/mysqladmin -u root password 'new-password'
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]:   /usr/bin/mysqladmin -u root -h 3aa715549e27 password 'new-password'
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: Alternatively you can run:
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]:   /usr/bin/mysql_secure_installation
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: which will also give you the option of removing the test
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: databases and anonymous user created by default.  This is
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: strongly recommended for production servers.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: See the manual for more instructions.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: Please report any problems at http://bugs.mysql.com/
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: The latest information about MySQL is available on the web at
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]:   http://www.mysql.com
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: Support MySQL by buying support/licenses at http://shop.mysql.com
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: Note: new default config file not created.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: Please make sure your config file is current
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: WARNING: Default config file /etc/mysql/my.cnf exists on the system
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: This file will be read by default by the MySQL server
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: If you do not want to use this, either remove it, or use the
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: --defaults-file argument to mysqld_safe when starting the server
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55+00:00 [Note] [Entrypoint]: Database files initialized
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55+00:00 [Note] [Entrypoint]: Starting temporary server
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55+00:00 [Note] [Entrypoint]: Waiting for server startup
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 0 [Warning] TIMESTAMP with implicit DEFAULT value is deprecated. Please use --explicit_defaults_for_timestamp server option (see documentation for more details).
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 0 [Note] mysqld (mysqld 5.6.51) starting as process 90 ...
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] Plugin 'FEDERATED' is disabled.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Using atomics to ref count buffer pool pages
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: The InnoDB memory heap is disabled
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Memory barrier is not used
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Compressed tables use zlib 1.2.11
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Using Linux native AIO
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Using CPU crc32 instructions
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Initializing buffer pool, size = 128.0M
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Completed initialization of buffer pool
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Highest supported file format is Barracuda.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: 128 rollback segment(s) are active.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: Waiting for purge to start
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] InnoDB: 5.6.51 started; log sequence number 1625987
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Warning] No existing UUID has been found, so we assume that this is the first time that this server has been started. Generating a new UUID: e82b556c-ebbb-11ef-840a-deda2c12aa68.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] RSA private key file not found: /var/lib/mysql//private_key.pem. Some authentication plugins will not work.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] RSA public key file not found: /var/lib/mysql//public_key.pem. Some authentication plugins will not work.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Warning] Insecure configuration for --pid-file: Location '/var/run/mysqld' in the path is accessible to all OS users. Consider choosing a different directory.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Warning] 'user' entry 'root@3aa715549e27' ignored in --skip-name-resolve mode.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Warning] 'user' entry '@3aa715549e27' ignored in --skip-name-resolve mode.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Warning] 'proxies_priv' entry '@ root@3aa715549e27' ignored in --skip-name-resolve mode.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] Event Scheduler: Loaded 0 events
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:55 90 [Note] mysqld: ready for connections.
Feb 15 11:42:55 managed-node1 quadlet-demo-mysql[68602]: Version: '5.6.51'  socket: '/var/run/mysqld/mysqld.sock'  port: 0  MySQL Community Server (GPL)
Feb 15 11:42:56 managed-node1 python3.12[69553]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:42:56 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:56+00:00 [Note] [Entrypoint]: Temporary server started.
Feb 15 11:42:56 managed-node1 quadlet-demo-mysql[68602]: Warning: Unable to load '/usr/share/zoneinfo/iso3166.tab' as time zone. Skipping it.
Feb 15 11:42:56 managed-node1 quadlet-demo-mysql[68602]: Warning: Unable to load '/usr/share/zoneinfo/leap-seconds.list' as time zone. Skipping it.
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: Warning: Unable to load '/usr/share/zoneinfo/zone.tab' as time zone. Skipping it.
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: Warning: Unable to load '/usr/share/zoneinfo/zone1970.tab' as time zone. Skipping it.
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Warning] 'proxies_priv' entry '@ root@3aa715549e27' ignored in --skip-name-resolve mode.
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57+00:00 [Note] [Entrypoint]: Stopping temporary server
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] mysqld: Normal shutdown
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Giving 0 client threads a chance to die gracefully
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Event Scheduler: Purging the queue. 0 events
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down slave threads
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Forcefully disconnecting 0 remaining clients
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Binlog end
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'partition'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'PERFORMANCE_SCHEMA'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_SYS_DATAFILES'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_SYS_TABLESPACES'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_SYS_FOREIGN_COLS'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_SYS_FOREIGN'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_SYS_FIELDS'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_SYS_COLUMNS'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_SYS_INDEXES'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_SYS_TABLESTATS'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_SYS_TABLES'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_FT_INDEX_TABLE'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_FT_INDEX_CACHE'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_FT_CONFIG'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_FT_BEING_DELETED'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_FT_DELETED'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_FT_DEFAULT_STOPWORD'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_METRICS'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_BUFFER_POOL_STATS'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_BUFFER_PAGE_LRU'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_BUFFER_PAGE'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_CMP_PER_INDEX_RESET'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_CMP_PER_INDEX'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_CMPMEM_RESET'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_CMPMEM'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_CMP_RESET'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_CMP'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_LOCK_WAITS'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_LOCKS'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'INNODB_TRX'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] Shutting down plugin 'InnoDB'
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] InnoDB: FTS optimize thread exiting.
Feb 15 11:42:57 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:57 90 [Note] InnoDB: Starting shutdown...
Feb 15 11:42:58 managed-node1 python3.12[69703]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:58 managed-node1 python3.12[69834]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/quadlet-demo.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Feb 15 11:42:59 managed-node1 python3.12[69939]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/systemd/quadlet-demo.yml owner=root group=0 mode=0644 src=/root/.ansible/tmp/ansible-tmp-1739637778.5718408-22093-261865611902226/.source.yml _original_basename=.4657999m follow=False checksum=998dccde0483b1654327a46ddd89cbaa47650370 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] InnoDB: Shutdown completed; log sequence number 1625997
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'BLACKHOLE'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'ARCHIVE'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'MRG_MYISAM'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'MyISAM'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'MEMORY'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'CSV'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'sha256_password'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'mysql_old_password'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'mysql_native_password'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] Shutting down plugin 'binlog'
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59 90 [Note] mysqld: Shutdown complete
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:59 managed-node1 python3.12[70070]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:42:59 managed-node1 systemd[1]: Reload requested from client PID 70071 ('systemctl') (unit session-8.scope)...
Feb 15 11:42:59 managed-node1 systemd[1]: Reloading...
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59+00:00 [Note] [Entrypoint]: Temporary server stopped
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:42:59+00:00 [Note] [Entrypoint]: MySQL init process done. Ready for start up.
Feb 15 11:42:59 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:42:59 managed-node1 systemd-rc-local-generator[70118]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 0 [Warning] TIMESTAMP with implicit DEFAULT value is deprecated. Please use --explicit_defaults_for_timestamp server option (see documentation for more details).
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 0 [Note] mysqld (mysqld 5.6.51) starting as process 1 ...
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] Plugin 'FEDERATED' is disabled.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Using atomics to ref count buffer pool pages
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: The InnoDB memory heap is disabled
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Memory barrier is not used
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Compressed tables use zlib 1.2.11
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Using Linux native AIO
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Using CPU crc32 instructions
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Initializing buffer pool, size = 128.0M
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Completed initialization of buffer pool
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Highest supported file format is Barracuda.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: 128 rollback segment(s) are active.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: Waiting for purge to start
Feb 15 11:43:00 managed-node1 systemd[1]: Reloading finished in 359 ms.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] InnoDB: 5.6.51 started; log sequence number 1625997
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] RSA private key file not found: /var/lib/mysql//private_key.pem. Some authentication plugins will not work.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] RSA public key file not found: /var/lib/mysql//public_key.pem. Some authentication plugins will not work.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] Server hostname (bind-address): '*'; port: 3306
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] IPv6 is available.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note]   - '::' resolves to '::';
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] Server socket created on IP: '::'.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Warning] Insecure configuration for --pid-file: Location '/var/run/mysqld' in the path is accessible to all OS users. Consider choosing a different directory.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Warning] 'proxies_priv' entry '@ root@3aa715549e27' ignored in --skip-name-resolve mode.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] Event Scheduler: Loaded 0 events
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:43:00 1 [Note] mysqld: ready for connections.
Feb 15 11:43:00 managed-node1 quadlet-demo-mysql[68602]: Version: '5.6.51'  socket: '/var/run/mysqld/mysqld.sock'  port: 3306  MySQL Community Server (GPL)
Feb 15 11:43:00 managed-node1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit NetworkManager-dispatcher.service has successfully entered the 'dead' state.
Feb 15 11:43:01 managed-node1 python3.12[70281]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:43:02 managed-node1 python3.12[70414]: ansible-slurp Invoked with path=/etc/containers/systemd/quadlet-demo.yml src=/etc/containers/systemd/quadlet-demo.yml
Feb 15 11:43:03 managed-node1 python3.12[70545]: ansible-file Invoked with path=/tmp/httpd3 state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:43:03 managed-node1 python3.12[70676]: ansible-file Invoked with path=/tmp/httpd3-create state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:43:16 managed-node1 podman[70817]: 2025-02-15 11:43:16.682040904 -0500 EST m=+12.432282479 image pull fcf3e41b8864a14d75a6d0627d3d02154e28a153aa57e8baa392cd744ffa0d0b quay.io/linux-system-roles/wordpress:4.8-apache
Feb 15 11:43:21 managed-node1 podman[71305]: 2025-02-15 11:43:21.0338671 -0500 EST m=+0.177784431 container health_status 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2 (image=quay.io/linux-system-roles/mysql:5.6, name=quadlet-demo-mysql, health_status=healthy, health_failing_streak=0, health_log=, PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service)
Feb 15 11:43:21 managed-node1 podman[71240]: 2025-02-15 11:43:21.619379653 -0500 EST m=+4.355962260 image pull 5af2585e22ed1562885d9407efab74010090427be79048c2cd6a226517cc1e1d quay.io/linux-system-roles/envoyproxy:v1.25.0
Feb 15 11:43:22 managed-node1 python3.12[71523]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:43:22 managed-node1 python3.12[71654]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/quadlet-demo.kube follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Feb 15 11:43:22 managed-node1 python3.12[71759]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1739637802.223046-23059-18128663460698/.source.kube dest=/etc/containers/systemd/quadlet-demo.kube owner=root group=0 mode=0644 _original_basename=quadlet-demo.kube follow=False checksum=7a5c73a5d935a42431c87bcdbeb8a04ed0909dc7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:43:23 managed-node1 python3.12[71890]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:43:23 managed-node1 systemd[1]: Reload requested from client PID 71891 ('systemctl') (unit session-8.scope)...
Feb 15 11:43:23 managed-node1 systemd[1]: Reloading...
Feb 15 11:43:23 managed-node1 systemd-rc-local-generator[71938]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:43:23 managed-node1 systemd[1]: Reloading finished in 214 ms.
Feb 15 11:43:24 managed-node1 python3.12[72075]: ansible-systemd Invoked with name=quadlet-demo.service scope=system state=started daemon_reload=False daemon_reexec=False no_block=False enabled=None force=None masked=None
Feb 15 11:43:24 managed-node1 systemd[1]: Starting quadlet-demo.service...
░░ Subject: A start job for unit quadlet-demo.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit quadlet-demo.service has begun execution.
░░ 
░░ The job identifier is 8319.
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: Pods stopped:
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: Pods removed:
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: Secrets removed:
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: Volumes removed:
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.454285059 -0500 EST m=+0.033127256 volume create wp-pv-claim
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.480096652 -0500 EST m=+0.058938983 container create 381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386 (image=localhost/podman-pause:5.3.1-1733097600, name=a96f3a51b8d1-service, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.48816024 -0500 EST m=+0.067002447 volume create envoy-proxy-config
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.49478512 -0500 EST m=+0.073627324 volume create envoy-certificates
Feb 15 11:43:24 managed-node1 systemd[1]: Created slice machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice - cgroup machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice.
░░ Subject: A start job for unit machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice has finished successfully.
░░ 
░░ The job identifier is 8406.
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.542028647 -0500 EST m=+0.120870856 container create a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08 (image=localhost/podman-pause:5.3.1-1733097600, name=5bc2e99d5832-infra, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service, io.buildah.version=1.38.0)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.547859456 -0500 EST m=+0.126701673 pod create 5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147 (image=, name=quadlet-demo)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.589288638 -0500 EST m=+0.168130844 container create 8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074 (image=quay.io/linux-system-roles/wordpress:4.8-apache, name=quadlet-demo-wordpress, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.617163033 -0500 EST m=+0.196005242 container create 0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b (image=quay.io/linux-system-roles/envoyproxy:v1.25.0, name=quadlet-demo-envoy, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.617514623 -0500 EST m=+0.196356841 container restart 381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386 (image=localhost/podman-pause:5.3.1-1733097600, name=a96f3a51b8d1-service, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.556340925 -0500 EST m=+0.135183388 image pull fcf3e41b8864a14d75a6d0627d3d02154e28a153aa57e8baa392cd744ffa0d0b quay.io/linux-system-roles/wordpress:4.8-apache
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.59275652 -0500 EST m=+0.171598871 image pull 5af2585e22ed1562885d9407efab74010090427be79048c2cd6a226517cc1e1d quay.io/linux-system-roles/envoyproxy:v1.25.0
Feb 15 11:43:24 managed-node1 systemd[1]: Started libpod-381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386.scope - libcrun container.
░░ Subject: A start job for unit libpod-381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit libpod-381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386.scope has finished successfully.
░░ 
░░ The job identifier is 8412.
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.665587545 -0500 EST m=+0.244429884 container init 381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386 (image=localhost/podman-pause:5.3.1-1733097600, name=a96f3a51b8d1-service, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.668078994 -0500 EST m=+0.246921367 container start 381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386 (image=localhost/podman-pause:5.3.1-1733097600, name=a96f3a51b8d1-service, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 kernel: podman2: port 2(veth2) entered blocking state
Feb 15 11:43:24 managed-node1 kernel: podman2: port 2(veth2) entered disabled state
Feb 15 11:43:24 managed-node1 kernel: veth2: entered allmulticast mode
Feb 15 11:43:24 managed-node1 kernel: veth2: entered promiscuous mode
Feb 15 11:43:24 managed-node1 kernel: podman2: port 2(veth2) entered blocking state
Feb 15 11:43:24 managed-node1 kernel: podman2: port 2(veth2) entered forwarding state
Feb 15 11:43:24 managed-node1 kernel: podman2: port 2(veth2) entered disabled state
Feb 15 11:43:24 managed-node1 kernel: podman2: port 2(veth2) entered blocking state
Feb 15 11:43:24 managed-node1 kernel: podman2: port 2(veth2) entered forwarding state
Feb 15 11:43:24 managed-node1 (udev-worker)[72094]: Network interface NamePolicy= disabled on kernel command line.
Feb 15 11:43:24 managed-node1 NetworkManager[733]:   [1739637804.7005] manager: (veth2): new Veth device (/org/freedesktop/NetworkManager/Devices/11)
Feb 15 11:43:24 managed-node1 NetworkManager[733]:   [1739637804.7036] device (veth2): carrier: link connected
Feb 15 11:43:24 managed-node1 systemd[1]: Started libpod-a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08.scope - libcrun container.
░░ Subject: A start job for unit libpod-a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit libpod-a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08.scope has finished successfully.
░░ 
░░ The job identifier is 8418.
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.771294382 -0500 EST m=+0.350136714 container init a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08 (image=localhost/podman-pause:5.3.1-1733097600, name=5bc2e99d5832-infra, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service, io.buildah.version=1.38.0)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.774069097 -0500 EST m=+0.352911349 container start a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08 (image=localhost/podman-pause:5.3.1-1733097600, name=5bc2e99d5832-infra, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service, io.buildah.version=1.38.0)
Feb 15 11:43:24 managed-node1 systemd[1]: Started libpod-8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074.scope - libcrun container.
░░ Subject: A start job for unit libpod-8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit libpod-8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074.scope has finished successfully.
░░ 
░░ The job identifier is 8425.
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.825719992 -0500 EST m=+0.404562227 container init 8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074 (image=quay.io/linux-system-roles/wordpress:4.8-apache, name=quadlet-demo-wordpress, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.8283805 -0500 EST m=+0.407222804 container start 8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074 (image=quay.io/linux-system-roles/wordpress:4.8-apache, name=quadlet-demo-wordpress, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 quadlet-demo-wordpress[72128]: WordPress not found in /var/www/html - copying now...
Feb 15 11:43:24 managed-node1 systemd[1]: Started libpod-0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b.scope - libcrun container.
░░ Subject: A start job for unit libpod-0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit libpod-0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b.scope has finished successfully.
░░ 
░░ The job identifier is 8432.
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.89851689 -0500 EST m=+0.477359169 container init 0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b (image=quay.io/linux-system-roles/envoyproxy:v1.25.0, name=quadlet-demo-envoy, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.901581125 -0500 EST m=+0.480423449 container start 0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b (image=quay.io/linux-system-roles/envoyproxy:v1.25.0, name=quadlet-demo-envoy, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:24 managed-node1 podman[72079]: 2025-02-15 11:43:24.908769706 -0500 EST m=+0.487611910 pod start 5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147 (image=, name=quadlet-demo)
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: Volumes:
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: wp-pv-claim
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: Pod:
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: 5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: Containers:
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: 8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074
Feb 15 11:43:24 managed-node1 quadlet-demo[72079]: 0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b
Feb 15 11:43:24 managed-node1 systemd[1]: Started quadlet-demo.service.
░░ Subject: A start job for unit quadlet-demo.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit quadlet-demo.service has finished successfully.
░░ 
░░ The job identifier is 8319.
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:404] initializing epoch 0 (base id=0, hot restart version=11.104)
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:406] statically linked extensions:
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.compression.compressor: envoy.compression.brotli.compressor, envoy.compression.gzip.compressor, envoy.compression.zstd.compressor
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.config.validators: envoy.config.validators.minimum_clusters, envoy.config.validators.minimum_clusters_validator
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.matching.action: envoy.matching.actions.format_string, filter-chain-name
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.transport_sockets.downstream: envoy.transport_sockets.alts, envoy.transport_sockets.quic, envoy.transport_sockets.raw_buffer, envoy.transport_sockets.starttls, envoy.transport_sockets.tap, envoy.transport_sockets.tcp_stats, envoy.transport_sockets.tls, raw_buffer, starttls, tls
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.transport_sockets.upstream: envoy.transport_sockets.alts, envoy.transport_sockets.http_11_proxy, envoy.transport_sockets.internal_upstream, envoy.transport_sockets.quic, envoy.transport_sockets.raw_buffer, envoy.transport_sockets.starttls, envoy.transport_sockets.tap, envoy.transport_sockets.tcp_stats, envoy.transport_sockets.tls, envoy.transport_sockets.upstream_proxy_protocol, raw_buffer, starttls, tls
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.filters.http.upstream: envoy.buffer, envoy.filters.http.admission_control, envoy.filters.http.buffer, envoy.filters.http.upstream_codec
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.access_loggers.extension_filters: envoy.access_loggers.extension_filters.cel
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.tls.cert_validator: envoy.tls.cert_validator.default, envoy.tls.cert_validator.spiffe
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.filters.udp_listener: envoy.filters.udp.dns_filter, envoy.filters.udp_listener.udp_proxy
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.listener_manager_impl: envoy.listener_manager_impl.default
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.internal_redirect_predicates: envoy.internal_redirect_predicates.allow_listed_routes, envoy.internal_redirect_predicates.previous_routes, envoy.internal_redirect_predicates.safe_cross_scheme
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.rate_limit_descriptors: envoy.rate_limit_descriptors.expr
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.http.stateful_header_formatters: envoy.http.stateful_header_formatters.preserve_case, preserve_case
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.tracers: envoy.dynamic.ot, envoy.tracers.datadog, envoy.tracers.dynamic_ot, envoy.tracers.opencensus, envoy.tracers.opentelemetry, envoy.tracers.skywalking, envoy.tracers.xray, envoy.tracers.zipkin, envoy.zipkin
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.retry_priorities: envoy.retry_priorities.previous_priorities
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.dubbo_proxy.protocols: dubbo
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.request_id: envoy.request_id.uuid
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.route.early_data_policy: envoy.route.early_data_policy.default
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.thrift_proxy.transports: auto, framed, header, unframed
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.http.stateful_session: envoy.http.stateful_session.cookie, envoy.http.stateful_session.header
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.resource_monitors: envoy.resource_monitors.fixed_heap, envoy.resource_monitors.injected_resource
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.path.match: envoy.path.match.uri_template.uri_template_matcher
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.quic.proof_source: envoy.quic.proof_source.filter_chain
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.matching.network.input: envoy.matching.inputs.application_protocol, envoy.matching.inputs.destination_ip, envoy.matching.inputs.destination_port, envoy.matching.inputs.direct_source_ip, envoy.matching.inputs.dns_san, envoy.matching.inputs.server_name, envoy.matching.inputs.source_ip, envoy.matching.inputs.source_port, envoy.matching.inputs.source_type, envoy.matching.inputs.subject, envoy.matching.inputs.transport_protocol, envoy.matching.inputs.uri_san
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.access_loggers: envoy.access_loggers.file, envoy.access_loggers.http_grpc, envoy.access_loggers.open_telemetry, envoy.access_loggers.stderr, envoy.access_loggers.stdout, envoy.access_loggers.tcp_grpc, envoy.access_loggers.wasm, envoy.file_access_log, envoy.http_grpc_access_log, envoy.open_telemetry_access_log, envoy.stderr_access_log, envoy.stdout_access_log, envoy.tcp_grpc_access_log, envoy.wasm_access_log
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.955][1][info][main] [source/server/server.cc:408]   envoy.compression.decompressor: envoy.compression.brotli.decompressor, envoy.compression.gzip.decompressor, envoy.compression.zstd.decompressor
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.matching.network.custom_matchers: envoy.matching.custom_matchers.trie_matcher
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.resolvers: envoy.ip
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.stats_sinks: envoy.dog_statsd, envoy.graphite_statsd, envoy.metrics_service, envoy.stat_sinks.dog_statsd, envoy.stat_sinks.graphite_statsd, envoy.stat_sinks.hystrix, envoy.stat_sinks.metrics_service, envoy.stat_sinks.statsd, envoy.stat_sinks.wasm, envoy.statsd
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.connection_handler: envoy.connection_handler.default
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.matching.input_matchers: envoy.matching.matchers.consistent_hashing, envoy.matching.matchers.ip
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.thrift_proxy.protocols: auto, binary, binary/non-strict, compact, twitter
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.dubbo_proxy.filters: envoy.filters.dubbo.router
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.guarddog_actions: envoy.watchdog.abort_action, envoy.watchdog.profile_action
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.matching.http.custom_matchers: envoy.matching.custom_matchers.trie_matcher
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.http.original_ip_detection: envoy.http.original_ip_detection.custom_header, envoy.http.original_ip_detection.xff
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.quic.server.crypto_stream: envoy.quic.crypto_stream.server.quiche
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.rbac.matchers: envoy.rbac.matchers.upstream_ip_port
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.regex_engines: envoy.regex_engines.google_re2
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.956][1][info][main] [source/server/server.cc:408]   envoy.http.early_header_mutation: envoy.http.early_header_mutation.header_mutation
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.udp_packet_writer: envoy.udp_packet_writer.default, envoy.udp_packet_writer.gso
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.quic.connection_id_generator: envoy.quic.deterministic_connection_id_generator
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.bootstrap: envoy.bootstrap.internal_listener, envoy.bootstrap.wasm, envoy.extensions.network.socket_interface.default_socket_interface
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.matching.common_inputs: envoy.matching.common_inputs.environment_variable
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.health_checkers: envoy.health_checkers.redis, envoy.health_checkers.thrift
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.wasm.runtime: envoy.wasm.runtime.null, envoy.wasm.runtime.v8
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.filters.network: envoy.echo, envoy.ext_authz, envoy.filters.network.connection_limit, envoy.filters.network.direct_response, envoy.filters.network.dubbo_proxy, envoy.filters.network.echo, envoy.filters.network.ext_authz, envoy.filters.network.http_connection_manager, envoy.filters.network.local_ratelimit, envoy.filters.network.mongo_proxy, envoy.filters.network.ratelimit, envoy.filters.network.rbac, envoy.filters.network.redis_proxy, envoy.filters.network.sni_cluster, envoy.filters.network.sni_dynamic_forward_proxy, envoy.filters.network.tcp_proxy, envoy.filters.network.thrift_proxy, envoy.filters.network.wasm, envoy.filters.network.zookeeper_proxy, envoy.http_connection_manager, envoy.mongo_proxy, envoy.ratelimit, envoy.redis_proxy, envoy.tcp_proxy
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.common.key_value: envoy.key_value.file_based
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.matching.http.input: envoy.matching.inputs.destination_ip, envoy.matching.inputs.destination_port, envoy.matching.inputs.direct_source_ip, envoy.matching.inputs.dns_san, envoy.matching.inputs.request_headers, envoy.matching.inputs.request_trailers, envoy.matching.inputs.response_headers, envoy.matching.inputs.response_trailers, envoy.matching.inputs.server_name, envoy.matching.inputs.source_ip, envoy.matching.inputs.source_port, envoy.matching.inputs.source_type, envoy.matching.inputs.status_code_class_input, envoy.matching.inputs.status_code_input, envoy.matching.inputs.subject, envoy.matching.inputs.uri_san
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.clusters: envoy.cluster.eds, envoy.cluster.logical_dns, envoy.cluster.original_dst, envoy.cluster.static, envoy.cluster.strict_dns, envoy.clusters.aggregate, envoy.clusters.dynamic_forward_proxy, envoy.clusters.redis
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.upstreams: envoy.filters.connection_pools.tcp.generic
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.thrift_proxy.filters: envoy.filters.thrift.header_to_metadata, envoy.filters.thrift.payload_to_metadata, envoy.filters.thrift.rate_limit, envoy.filters.thrift.router
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.filters.http: envoy.bandwidth_limit, envoy.buffer, envoy.cors, envoy.csrf, envoy.ext_authz, envoy.ext_proc, envoy.fault, envoy.filters.http.adaptive_concurrency, envoy.filters.http.admission_control, envoy.filters.http.alternate_protocols_cache, envoy.filters.http.aws_lambda, envoy.filters.http.aws_request_signing, envoy.filters.http.bandwidth_limit, envoy.filters.http.buffer, envoy.filters.http.cache, envoy.filters.http.cdn_loop, envoy.filters.http.composite, envoy.filters.http.compressor, envoy.filters.http.cors, envoy.filters.http.csrf, envoy.filters.http.custom_response, envoy.filters.http.decompressor, envoy.filters.http.dynamic_forward_proxy, envoy.filters.http.ext_authz, envoy.filters.http.ext_proc, envoy.filters.http.fault, envoy.filters.http.file_system_buffer, envoy.filters.http.gcp_authn, envoy.filters.http.grpc_http1_bridge, envoy.filters.http.grpc_http1_reverse_bridge, envoy.filters.http.grpc_json_transcoder, envoy.filters.http.grpc_stats, envoy.filters.http.grpc_web, envoy.filters.http.header_to_metadata, envoy.filters.http.health_check, envoy.filters.http.ip_tagging, envoy.filters.http.jwt_authn, envoy.filters.http.local_ratelimit, envoy.filters.http.lua, envoy.filters.http.match_delegate, envoy.filters.http.oauth2, envoy.filters.http.on_demand, envoy.filters.http.original_src, envoy.filters.http.rate_limit_quota, envoy.filters.http.ratelimit, envoy.filters.http.rbac, envoy.filters.http.router, envoy.filters.http.set_metadata, envoy.filters.http.stateful_session, envoy.filters.http.tap, envoy.filters.http.wasm, envoy.grpc_http1_bridge, envoy.grpc_json_transcoder, envoy.grpc_web, envoy.health_check, envoy.ip_tagging, envoy.local_rate_limit, envoy.lua, envoy.rate_limit, envoy.router
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.http.custom_response: envoy.extensions.http.custom_response.local_response_policy, envoy.extensions.http.custom_response.redirect_policy
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.filters.listener: envoy.filters.listener.http_inspector, envoy.filters.listener.original_dst, envoy.filters.listener.original_src, envoy.filters.listener.proxy_protocol, envoy.filters.listener.tls_inspector, envoy.listener.http_inspector, envoy.listener.original_dst, envoy.listener.original_src, envoy.listener.proxy_protocol, envoy.listener.tls_inspector
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.formatter: envoy.formatter.metadata, envoy.formatter.req_without_query
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.upstream_options: envoy.extensions.upstreams.http.v3.HttpProtocolOptions, envoy.extensions.upstreams.tcp.v3.TcpProtocolOptions, envoy.upstreams.http.http_protocol_options, envoy.upstreams.tcp.tcp_protocol_options
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.dubbo_proxy.serializers: dubbo.hessian2
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.http.cache: envoy.extensions.http.cache.file_system_http_cache, envoy.extensions.http.cache.simple
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.network.dns_resolver: envoy.network.dns_resolver.cares, envoy.network.dns_resolver.getaddrinfo
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.retry_host_predicates: envoy.retry_host_predicates.omit_canary_hosts, envoy.retry_host_predicates.omit_host_metadata, envoy.retry_host_predicates.previous_hosts
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.http.header_validators: envoy.http.header_validators.envoy_default
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.grpc_credentials: envoy.grpc_credentials.aws_iam, envoy.grpc_credentials.default, envoy.grpc_credentials.file_based_metadata
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.load_balancing_policies: envoy.load_balancing_policies.least_request, envoy.load_balancing_policies.random, envoy.load_balancing_policies.round_robin
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   network.connection.client: default, envoy_internal
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   envoy.path.rewrite: envoy.path.rewrite.uri_template.uri_template_rewriter
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.957][1][info][main] [source/server/server.cc:408]   quic.http_server_connection: quic.http_server_connection.default
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.964][1][info][main] [source/server/server.cc:456] HTTP header map info:
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.967][1][info][main] [source/server/server.cc:459]   request header map: 672 bytes: :authority,:method,:path,:protocol,:scheme,accept,accept-encoding,access-control-request-headers,access-control-request-method,access-control-request-private-network,authentication,authorization,cache-control,cdn-loop,connection,content-encoding,content-length,content-type,expect,grpc-accept-encoding,grpc-timeout,if-match,if-modified-since,if-none-match,if-range,if-unmodified-since,keep-alive,origin,pragma,proxy-connection,proxy-status,referer,te,transfer-encoding,upgrade,user-agent,via,x-client-trace-id,x-envoy-attempt-count,x-envoy-decorator-operation,x-envoy-downstream-service-cluster,x-envoy-downstream-service-node,x-envoy-expected-rq-timeout-ms,x-envoy-external-address,x-envoy-force-trace,x-envoy-hedge-on-per-try-timeout,x-envoy-internal,x-envoy-ip-tags,x-envoy-is-timeout-retry,x-envoy-max-retries,x-envoy-original-path,x-envoy-original-url,x-envoy-retriable-header-names,x-envoy-retriable-status-codes,x-envoy-retry-grpc-on,x-envoy-retry-on,x-envoy-upstream-alt-stat-name,x-envoy-upstream-rq-per-try-timeout-ms,x-envoy-upstream-rq-timeout-alt-response,x-envoy-upstream-rq-timeout-ms,x-envoy-upstream-stream-duration-ms,x-forwarded-client-cert,x-forwarded-for,x-forwarded-host,x-forwarded-port,x-forwarded-proto,x-ot-span-context,x-request-id
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.967][1][info][main] [source/server/server.cc:459]   request trailer map: 120 bytes: 
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.967][1][info][main] [source/server/server.cc:459]   response header map: 432 bytes: :status,access-control-allow-credentials,access-control-allow-headers,access-control-allow-methods,access-control-allow-origin,access-control-allow-private-network,access-control-expose-headers,access-control-max-age,age,cache-control,connection,content-encoding,content-length,content-type,date,etag,expires,grpc-message,grpc-status,keep-alive,last-modified,location,proxy-connection,proxy-status,server,transfer-encoding,upgrade,vary,via,x-envoy-attempt-count,x-envoy-decorator-operation,x-envoy-degraded,x-envoy-immediate-health-check-fail,x-envoy-ratelimited,x-envoy-upstream-canary,x-envoy-upstream-healthchecked-cluster,x-envoy-upstream-service-time,x-request-id
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.967][1][info][main] [source/server/server.cc:459]   response trailer map: 144 bytes: grpc-message,grpc-status
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.974][1][info][main] [source/server/server.cc:819] runtime: {}
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.975][1][info][admin] [source/server/admin/admin.cc:67] admin address: 0.0.0.0:9901
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.975][1][info][config] [source/server/configuration_impl.cc:131] loading tracing configuration
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.975][1][info][config] [source/server/configuration_impl.cc:91] loading 0 static secret(s)
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.976][1][info][config] [source/server/configuration_impl.cc:97] loading 1 cluster(s)
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.977][1][info][config] [source/server/configuration_impl.cc:101] loading 1 listener(s)
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.982][1][info][config] [source/server/configuration_impl.cc:113] loading stats configuration
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.982][1][info][runtime] [source/common/runtime/runtime_impl.cc:463] RTDS has finished initialization
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.982][1][info][upstream] [source/common/upstream/cluster_manager_impl.cc:226] cm init: all clusters initialized
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.982][1][warning][main] [source/server/server.cc:794] there is no configured limit to the number of allowed active connections. Set a limit via the runtime key overload.global_downstream_max_connections
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.983][1][info][main] [source/server/server.cc:896] all clusters initialized. initializing init manager
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.983][1][info][config] [source/extensions/listener_managers/listener_manager/listener_manager_impl.cc:852] all dependencies initialized. starting workers
Feb 15 11:43:24 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:24.984][1][info][main] [source/server/server.cc:915] starting main dispatch loop
Feb 15 11:43:25 managed-node1 quadlet-demo-wordpress[72128]: Complete! WordPress has been successfully copied to /var/www/html
Feb 15 11:43:25 managed-node1 python3.12[72358]: ansible-ansible.legacy.command Invoked with _raw_params=ls -alrtF /etc/containers/systemd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:26 managed-node1 python3.12[72582]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:26 managed-node1 quadlet-demo-wordpress[72128]: AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 192.168.30.3. Set the 'ServerName' directive globally to suppress this message
Feb 15 11:43:26 managed-node1 quadlet-demo-wordpress[72128]: AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 192.168.30.3. Set the 'ServerName' directive globally to suppress this message
Feb 15 11:43:26 managed-node1 quadlet-demo-wordpress[72128]: [Sat Feb 15 16:43:26.394280 2025] [mpm_prefork:notice] [pid 1] AH00163: Apache/2.4.10 (Debian) PHP/5.6.32 configured -- resuming normal operations
Feb 15 11:43:26 managed-node1 quadlet-demo-wordpress[72128]: [Sat Feb 15 16:43:26.394597 2025] [core:notice] [pid 1] AH00094: Command line: 'apache2 -D FOREGROUND'
Feb 15 11:43:26 managed-node1 python3.12[72738]: ansible-ansible.legacy.command Invoked with _raw_params=podman volume ls _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:27 managed-node1 python3.12[72877]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod ps --ctr-ids --ctr-names --ctr-status _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:27 managed-node1 python3.12[73016]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail; systemctl list-units | grep quadlet _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:28 managed-node1 python3.12[73150]: ansible-get_url Invoked with url=https://localhost:8000 dest=/run/out mode=0600 validate_certs=False force=False http_agent=ansible-httpget use_proxy=True force_basic_auth=False use_gssapi=False backup=False checksum= timeout=10 unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None headers=None tmp_dest=None ciphers=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:43:28 managed-node1 quadlet-demo-wordpress[72128]: 127.0.0.1 - - [15/Feb/2025:16:43:28 +0000] "GET / HTTP/1.1" 302 324 "-" "ansible-httpget"
Feb 15 11:43:29 managed-node1 quadlet-demo-wordpress[72128]: 127.0.0.1 - - [15/Feb/2025:16:43:28 +0000] "GET /wp-admin/install.php HTTP/1.1" 200 11984 "-" "ansible-httpget"
Feb 15 11:43:29 managed-node1 python3.12[73286]: ansible-ansible.legacy.command Invoked with _raw_params=cat /run/out _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:30 managed-node1 python3.12[73418]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:30 managed-node1 python3.12[73558]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod ps --ctr-ids --ctr-names --ctr-status _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:31 managed-node1 python3.12[73697]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail; systemctl list-units --all | grep quadlet _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:31 managed-node1 python3.12[73831]: ansible-ansible.legacy.command Invoked with _raw_params=ls -alrtF /etc/systemd/system _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:34 managed-node1 python3.12[74094]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:35 managed-node1 python3.12[74231]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:43:37 managed-node1 python3.12[74364]: ansible-ansible.legacy.dnf Invoked with name=['firewalld'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 15 11:43:38 managed-node1 python3.12[74496]: ansible-systemd Invoked with name=firewalld masked=False daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Feb 15 11:43:38 managed-node1 python3.12[74629]: ansible-ansible.legacy.systemd Invoked with name=firewalld state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 15 11:43:39 managed-node1 python3.12[74762]: ansible-fedora.linux_system_roles.firewall_lib Invoked with port=['8000/tcp'] permanent=True runtime=True state=enabled __report_changed=True service=[] source_port=[] forward_port=[] rich_rule=[] source=[] interface=[] interface_pci_id=[] icmp_block=[] timeout=0 ipset_entries=[] protocol=[] helper_module=[] destination=[] firewalld_conf=None masquerade=None icmp_block_inversion=None target=None zone=None set_default_zone=None ipset=None ipset_type=None description=None short=None
Feb 15 11:43:40 managed-node1 python3.12[74893]: ansible-fedora.linux_system_roles.firewall_lib Invoked with port=['9000/tcp'] permanent=True runtime=True state=enabled __report_changed=True service=[] source_port=[] forward_port=[] rich_rule=[] source=[] interface=[] interface_pci_id=[] icmp_block=[] timeout=0 ipset_entries=[] protocol=[] helper_module=[] destination=[] firewalld_conf=None masquerade=None icmp_block_inversion=None target=None zone=None set_default_zone=None ipset=None ipset_type=None description=None short=None
Feb 15 11:43:46 managed-node1 python3.12[75439]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:43:47 managed-node1 python3.12[75572]: ansible-systemd Invoked with name=quadlet-demo.service scope=system state=stopped enabled=False force=True daemon_reload=False daemon_reexec=False no_block=False masked=None
Feb 15 11:43:47 managed-node1 systemd[1]: Reload requested from client PID 75575 ('systemctl') (unit session-8.scope)...
Feb 15 11:43:47 managed-node1 systemd[1]: Reloading...
Feb 15 11:43:47 managed-node1 systemd-rc-local-generator[75613]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:43:47 managed-node1 systemd[1]: Reloading finished in 228 ms.
Feb 15 11:43:47 managed-node1 systemd[1]: Starting logrotate.service - Rotate log files...
░░ Subject: A start job for unit logrotate.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit logrotate.service has begun execution.
░░ 
░░ The job identifier is 8439.
Feb 15 11:43:47 managed-node1 systemd[1]: Stopping quadlet-demo.service...
░░ Subject: A stop job for unit quadlet-demo.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit quadlet-demo.service has begun execution.
░░ 
░░ The job identifier is 8517.
Feb 15 11:43:47 managed-node1 systemd[1]: libpod-381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit libpod-381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386.scope has successfully entered the 'dead' state.
Feb 15 11:43:47 managed-node1 conmon[72088]: conmon 381dcd9d839fd1347d6f : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386.scope/container/memory.events
Feb 15 11:43:47 managed-node1 podman[75632]: 2025-02-15 11:43:47.840876656 -0500 EST m=+0.036950554 container died 381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386 (image=localhost/podman-pause:5.3.1-1733097600, name=a96f3a51b8d1-service, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:47 managed-node1 systemd[1]: logrotate.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit logrotate.service has successfully entered the 'dead' state.
Feb 15 11:43:47 managed-node1 systemd[1]: Finished logrotate.service - Rotate log files.
░░ Subject: A start job for unit logrotate.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit logrotate.service has finished successfully.
░░ 
░░ The job identifier is 8439.
Feb 15 11:43:47 managed-node1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386-userdata-shm.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit var-lib-containers-storage-overlay\x2dcontainers-381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386-userdata-shm.mount has successfully entered the 'dead' state.
Feb 15 11:43:47 managed-node1 systemd[1]: var-lib-containers-storage-overlay-33f384f3e733096cffc835ab2c8e27ed0eb68d2050221d8c3bd803f8d9ed817e-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit var-lib-containers-storage-overlay-33f384f3e733096cffc835ab2c8e27ed0eb68d2050221d8c3bd803f8d9ed817e-merged.mount has successfully entered the 'dead' state.
Feb 15 11:43:47 managed-node1 podman[75632]: 2025-02-15 11:43:47.961827773 -0500 EST m=+0.157901615 container cleanup 381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386 (image=localhost/podman-pause:5.3.1-1733097600, name=a96f3a51b8d1-service, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.003780869 -0500 EST m=+0.024667521 pod stop 5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147 (image=, name=quadlet-demo)
Feb 15 11:43:48 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:48.036][1][warning][main] [source/server/server.cc:854] caught ENVOY_SIGTERM
Feb 15 11:43:48 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:48.036][1][info][main] [source/server/server.cc:985] shutting down server instance
Feb 15 11:43:48 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:48.036][1][info][main] [source/server/server.cc:920] main dispatch loop exited
Feb 15 11:43:48 managed-node1 quadlet-demo-envoy[72136]: [2025-02-15 16:43:48.038][1][info][main] [source/server/server.cc:972] exiting
Feb 15 11:43:48 managed-node1 systemd[1]: libpod-a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit libpod-a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08.scope has successfully entered the 'dead' state.
Feb 15 11:43:48 managed-node1 systemd[1]: libpod-0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit libpod-0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b.scope has successfully entered the 'dead' state.
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.059474493 -0500 EST m=+0.080361206 container died 0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b (image=quay.io/linux-system-roles/envoyproxy:v1.25.0, name=quadlet-demo-envoy, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.078365019 -0500 EST m=+0.099251715 container died a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08 (image=localhost/podman-pause:5.3.1-1733097600, name=5bc2e99d5832-infra, PODMAN_SYSTEMD_UNIT=quadlet-demo.service, io.buildah.version=1.38.0)
Feb 15 11:43:48 managed-node1 kernel: podman2: port 2(veth2) entered disabled state
Feb 15 11:43:48 managed-node1 kernel: veth2 (unregistering): left allmulticast mode
Feb 15 11:43:48 managed-node1 kernel: veth2 (unregistering): left promiscuous mode
Feb 15 11:43:48 managed-node1 kernel: podman2: port 2(veth2) entered disabled state
Feb 15 11:43:48 managed-node1 quadlet-demo-wordpress[72128]: [Sat Feb 15 16:43:48.100362 2025] [mpm_prefork:notice] [pid 1] AH00169: caught SIGTERM, shutting down
Feb 15 11:43:48 managed-node1 systemd[1]: var-lib-containers-storage-overlay-2db7685769f998ba63c1e2109f05c8eb97105d04dacc98c77beeedbd1cfbcf1f-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit var-lib-containers-storage-overlay-2db7685769f998ba63c1e2109f05c8eb97105d04dacc98c77beeedbd1cfbcf1f-merged.mount has successfully entered the 'dead' state.
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.125342088 -0500 EST m=+0.146229090 container cleanup 0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b (image=quay.io/linux-system-roles/envoyproxy:v1.25.0, name=quadlet-demo-envoy, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:48 managed-node1 systemd[1]: libpod-8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit libpod-8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074.scope has successfully entered the 'dead' state.
Feb 15 11:43:48 managed-node1 systemd[1]: libpod-8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074.scope: Consumed 1.132s CPU time, 88.4M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit libpod-8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074.scope completed and consumed the indicated resources.
Feb 15 11:43:48 managed-node1 conmon[72128]: conmon 8454ef009c748bf82c4e : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice/libpod-8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074.scope/container/memory.events
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.13892336 -0500 EST m=+0.159810025 container died 8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074 (image=quay.io/linux-system-roles/wordpress:4.8-apache, name=quadlet-demo-wordpress, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:48 managed-node1 systemd[1]: run-netns-netns\x2d92efa637\x2da076\x2d0041\x2d1154\x2dee3b372b1384.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit run-netns-netns\x2d92efa637\x2da076\x2d0041\x2d1154\x2dee3b372b1384.mount has successfully entered the 'dead' state.
Feb 15 11:43:48 managed-node1 systemd[1]: var-lib-containers-storage-overlay-07621a5a1f20ba7e7aeac628c64e67b022400aff5a899ed9ab94774033382c94-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit var-lib-containers-storage-overlay-07621a5a1f20ba7e7aeac628c64e67b022400aff5a899ed9ab94774033382c94-merged.mount has successfully entered the 'dead' state.
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.199764355 -0500 EST m=+0.220650969 container cleanup a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08 (image=localhost/podman-pause:5.3.1-1733097600, name=5bc2e99d5832-infra, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service, io.buildah.version=1.38.0)
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.211582895 -0500 EST m=+0.232469454 container cleanup 8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074 (image=quay.io/linux-system-roles/wordpress:4.8-apache, name=quadlet-demo-wordpress, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:48 managed-node1 systemd[1]: Removed slice machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice - cgroup machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice.
░░ Subject: A stop job for unit machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice has finished.
░░ 
░░ The job identifier is 8518 and the job result is done.
Feb 15 11:43:48 managed-node1 systemd[1]: machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice: Consumed 1.236s CPU time, 111.4M memory peak, 26M written to disk.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice completed and consumed the indicated resources.
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.240911041 -0500 EST m=+0.261797633 container remove 8454ef009c748bf82c4e7d4be52839e2342c033266a80f9f990e28abce0b6074 (image=quay.io/linux-system-roles/wordpress:4.8-apache, name=quadlet-demo-wordpress, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.262778326 -0500 EST m=+0.283664921 container remove 0cdb8c8a720852b5d4be1bb96e20f8c3acaf098b559f0bce677ee6f7a7af8d7b (image=quay.io/linux-system-roles/envoyproxy:v1.25.0, name=quadlet-demo-envoy, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.290063159 -0500 EST m=+0.310949744 container remove a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08 (image=localhost/podman-pause:5.3.1-1733097600, name=5bc2e99d5832-infra, pod_id=5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147, PODMAN_SYSTEMD_UNIT=quadlet-demo.service, io.buildah.version=1.38.0)
Feb 15 11:43:48 managed-node1 systemd[1]: machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice: Failed to open /run/systemd/transient/machine-libpod_pod_5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147.slice: No such file or directory
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.299157884 -0500 EST m=+0.320044437 pod remove 5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147 (image=, name=quadlet-demo)
Feb 15 11:43:48 managed-node1 podman[75645]: 2025-02-15 11:43:48.320892161 -0500 EST m=+0.341778754 container remove 381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386 (image=localhost/podman-pause:5.3.1-1733097600, name=a96f3a51b8d1-service, PODMAN_SYSTEMD_UNIT=quadlet-demo.service)
Feb 15 11:43:48 managed-node1 quadlet-demo[75645]: Pods stopped:
Feb 15 11:43:48 managed-node1 quadlet-demo[75645]: 5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147
Feb 15 11:43:48 managed-node1 quadlet-demo[75645]: Pods removed:
Feb 15 11:43:48 managed-node1 quadlet-demo[75645]: 5bc2e99d583254662cd7801707ba0e98319f17c8b431a00c1d4ce65affed2147
Feb 15 11:43:48 managed-node1 quadlet-demo[75645]: Secrets removed:
Feb 15 11:43:48 managed-node1 quadlet-demo[75645]: Volumes removed:
Feb 15 11:43:48 managed-node1 quadlet-demo[75645]: time="2025-02-15T11:43:48-05:00" level=error msg="Checking whether service of container 381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386 can be stopped: no container with ID 381dcd9d839fd1347d6f6bc9dcfa3b1d749f692859ef84516f4557efbbba5386 found in database: no such container"
Feb 15 11:43:48 managed-node1 systemd[1]: quadlet-demo.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit quadlet-demo.service has successfully entered the 'dead' state.
Feb 15 11:43:48 managed-node1 systemd[1]: Stopped quadlet-demo.service.
░░ Subject: A stop job for unit quadlet-demo.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit quadlet-demo.service has finished.
░░ 
░░ The job identifier is 8517 and the job result is done.
Feb 15 11:43:48 managed-node1 python3.12[75821]: ansible-stat Invoked with path=/etc/containers/systemd/quadlet-demo.kube follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:43:48 managed-node1 systemd[1]: var-lib-containers-storage-overlay-0f9d58b2dd8f05cb1b506f2a3d8d3663d53c73b0d4332ae52c15209696b47dd8-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit var-lib-containers-storage-overlay-0f9d58b2dd8f05cb1b506f2a3d8d3663d53c73b0d4332ae52c15209696b47dd8-merged.mount has successfully entered the 'dead' state.
Feb 15 11:43:48 managed-node1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08-userdata-shm.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit var-lib-containers-storage-overlay\x2dcontainers-a5c265751b66e04753cdac215bb8d1502763e027fb55a49dc90d47167dcf3b08-userdata-shm.mount has successfully entered the 'dead' state.
Feb 15 11:43:49 managed-node1 python3.12[76085]: ansible-file Invoked with path=/etc/containers/systemd/quadlet-demo.kube state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:43:50 managed-node1 python3.12[76216]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:43:50 managed-node1 systemd[1]: Reload requested from client PID 76217 ('systemctl') (unit session-8.scope)...
Feb 15 11:43:50 managed-node1 systemd[1]: Reloading...
Feb 15 11:43:50 managed-node1 systemd-rc-local-generator[76252]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:43:50 managed-node1 systemd[1]: Reloading finished in 212 ms.
Feb 15 11:43:51 managed-node1 python3.12[76401]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune --all -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:51 managed-node1 podman[76402]: 2025-02-15 11:43:51.135439939 -0500 EST m=+0.039875155 image untag 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f localhost:5000/libpod/testimage:20210610
Feb 15 11:43:51 managed-node1 podman[76402]: 2025-02-15 11:43:51.135445852 -0500 EST m=+0.039881108 image untag 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Feb 15 11:43:51 managed-node1 podman[76402]: 2025-02-15 11:43:51.115099416 -0500 EST m=+0.019534403 image remove 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f 
Feb 15 11:43:51 managed-node1 podman[76402]: 2025-02-15 11:43:51.145028824 -0500 EST m=+0.049463830 image untag 54c8b4fe9ef10b679d92adad3fdcaa4ca5ba12c8f6858ab62e334a2608edc257 localhost/podman-pause:5.3.1-1733097600
Feb 15 11:43:51 managed-node1 podman[76402]: 2025-02-15 11:43:51.13545453 -0500 EST m=+0.039889615 image remove 54c8b4fe9ef10b679d92adad3fdcaa4ca5ba12c8f6858ab62e334a2608edc257 
Feb 15 11:43:51 managed-node1 podman[76402]: 2025-02-15 11:43:51.694907901 -0500 EST m=+0.599342885 image untag fcf3e41b8864a14d75a6d0627d3d02154e28a153aa57e8baa392cd744ffa0d0b quay.io/linux-system-roles/wordpress:4.8-apache
Feb 15 11:43:51 managed-node1 podman[76402]: 2025-02-15 11:43:51.14503744 -0500 EST m=+0.049472402 image remove fcf3e41b8864a14d75a6d0627d3d02154e28a153aa57e8baa392cd744ffa0d0b 
Feb 15 11:43:51 managed-node1 podman[76402]: 2025-02-15 11:43:51.845429252 -0500 EST m=+0.749864420 image untag 5af2585e22ed1562885d9407efab74010090427be79048c2cd6a226517cc1e1d quay.io/linux-system-roles/envoyproxy:v1.25.0
Feb 15 11:43:51 managed-node1 podman[76402]: 2025-02-15 11:43:51.69491959 -0500 EST m=+0.599354548 image remove 5af2585e22ed1562885d9407efab74010090427be79048c2cd6a226517cc1e1d 
Feb 15 11:43:52 managed-node1 podman[76459]: 2025-02-15 11:43:52.295189018 -0500 EST m=+0.065232261 container health_status 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2 (image=quay.io/linux-system-roles/mysql:5.6, name=quadlet-demo-mysql, health_status=healthy, health_failing_streak=1, health_log=, PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service)
Feb 15 11:43:52 managed-node1 systemd[1]: 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service: Main process exited, code=exited, status=125/n/a
░░ Subject: Unit process exited
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ An ExecStart= process belonging to unit 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service has exited.
░░ 
░░ The process' exit code is 'exited' and its exit status is 125.
Feb 15 11:43:52 managed-node1 systemd[1]: 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service: Failed with result 'exit-code'.
░░ Subject: Unit failed
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.service has entered the 'failed' state with result 'exit-code'.
Feb 15 11:43:52 managed-node1 python3.12[76548]: ansible-ansible.legacy.command Invoked with _raw_params=podman images -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:53 managed-node1 python3.12[76686]: ansible-ansible.legacy.command Invoked with _raw_params=podman volume ls -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:53 managed-node1 python3.12[76825]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --noheading _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:53 managed-node1 python3.12[76963]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:43:55 managed-node1 python3.12[77376]: ansible-service_facts Invoked
Feb 15 11:43:57 managed-node1 python3.12[77616]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:43:59 managed-node1 python3.12[77749]: ansible-stat Invoked with path=/etc/containers/systemd/quadlet-demo.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:44:00 managed-node1 python3.12[78013]: ansible-file Invoked with path=/etc/containers/systemd/quadlet-demo.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:44:00 managed-node1 python3.12[78144]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:44:00 managed-node1 systemd[1]: Reload requested from client PID 78145 ('systemctl') (unit session-8.scope)...
Feb 15 11:44:00 managed-node1 systemd[1]: Reloading...
Feb 15 11:44:00 managed-node1 systemd-rc-local-generator[78183]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:44:00 managed-node1 systemd[1]: Reloading finished in 211 ms.
Feb 15 11:44:01 managed-node1 podman[78330]: 2025-02-15 11:44:01.371827209 -0500 EST m=+0.025924776 volume remove envoy-proxy-config
Feb 15 11:44:01 managed-node1 podman[78469]: 2025-02-15 11:44:01.756489397 -0500 EST m=+0.025895964 volume remove envoy-certificates
Feb 15 11:44:02 managed-node1 podman[78608]: 2025-02-15 11:44:02.175516737 -0500 EST m=+0.061098800 volume remove wp-pv-claim
Feb 15 11:44:02 managed-node1 python3.12[78746]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune --all -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:03 managed-node1 python3.12[78885]: ansible-ansible.legacy.command Invoked with _raw_params=podman images -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:03 managed-node1 python3.12[79023]: ansible-ansible.legacy.command Invoked with _raw_params=podman volume ls -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:04 managed-node1 python3.12[79162]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --noheading _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:04 managed-node1 python3.12[79300]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:05 managed-node1 python3.12[79714]: ansible-service_facts Invoked
Feb 15 11:44:08 managed-node1 python3.12[79954]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:44:09 managed-node1 python3.12[80087]: ansible-stat Invoked with path=/etc/containers/systemd/envoy-proxy-configmap.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:44:10 managed-node1 python3.12[80351]: ansible-file Invoked with path=/etc/containers/systemd/envoy-proxy-configmap.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:44:11 managed-node1 python3.12[80482]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:44:11 managed-node1 systemd[1]: Reload requested from client PID 80483 ('systemctl') (unit session-8.scope)...
Feb 15 11:44:11 managed-node1 systemd[1]: Reloading...
Feb 15 11:44:11 managed-node1 systemd-rc-local-generator[80527]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:44:11 managed-node1 systemd[1]: Reloading finished in 210 ms.
Feb 15 11:44:12 managed-node1 python3.12[80668]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune --all -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:12 managed-node1 python3.12[80807]: ansible-ansible.legacy.command Invoked with _raw_params=podman images -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:13 managed-node1 python3.12[80945]: ansible-ansible.legacy.command Invoked with _raw_params=podman volume ls -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:13 managed-node1 python3.12[81084]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --noheading _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:14 managed-node1 python3.12[81222]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:15 managed-node1 python3.12[81637]: ansible-service_facts Invoked
Feb 15 11:44:18 managed-node1 python3.12[81877]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:44:19 managed-node1 python3.12[82011]: ansible-systemd Invoked with name=quadlet-demo-mysql.service scope=system state=stopped enabled=False force=True daemon_reload=False daemon_reexec=False no_block=False masked=None
Feb 15 11:44:19 managed-node1 systemd[1]: Reload requested from client PID 82014 ('systemctl') (unit session-8.scope)...
Feb 15 11:44:19 managed-node1 systemd[1]: Reloading...
Feb 15 11:44:19 managed-node1 systemd-rc-local-generator[82059]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:44:19 managed-node1 systemd[1]: Reloading finished in 211 ms.
Feb 15 11:44:19 managed-node1 systemd[1]: Stopping quadlet-demo-mysql.service...
░░ Subject: A stop job for unit quadlet-demo-mysql.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit quadlet-demo-mysql.service has begun execution.
░░ 
░░ The job identifier is 8597.
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] mysqld: Normal shutdown
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Giving 0 client threads a chance to die gracefully
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Event Scheduler: Purging the queue. 0 events
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down slave threads
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Forcefully disconnecting 0 remaining clients
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Binlog end
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'partition'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'PERFORMANCE_SCHEMA'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_SYS_DATAFILES'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_SYS_TABLESPACES'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_SYS_FOREIGN_COLS'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_SYS_FOREIGN'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_SYS_FIELDS'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_SYS_COLUMNS'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_SYS_INDEXES'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_SYS_TABLESTATS'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_SYS_TABLES'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_FT_INDEX_TABLE'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_FT_INDEX_CACHE'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_FT_CONFIG'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_FT_BEING_DELETED'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_FT_DELETED'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_FT_DEFAULT_STOPWORD'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_METRICS'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_BUFFER_POOL_STATS'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_BUFFER_PAGE_LRU'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_BUFFER_PAGE'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_CMP_PER_INDEX_RESET'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_CMP_PER_INDEX'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_CMPMEM_RESET'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_CMPMEM'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_CMP_RESET'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_CMP'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_LOCK_WAITS'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_LOCKS'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'INNODB_TRX'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] Shutting down plugin 'InnoDB'
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] InnoDB: FTS optimize thread exiting.
Feb 15 11:44:19 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:19 1 [Note] InnoDB: Starting shutdown...
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] InnoDB: Shutdown completed; log sequence number 1626007
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'BLACKHOLE'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'ARCHIVE'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'MRG_MYISAM'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'MyISAM'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'MEMORY'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'CSV'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'sha256_password'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'mysql_old_password'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'mysql_native_password'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] Shutting down plugin 'binlog'
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 2025-02-15 16:44:21 1 [Note] mysqld: Shutdown complete
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[68602]: 
Feb 15 11:44:21 managed-node1 podman[82070]: 2025-02-15 11:44:21.1386753 -0500 EST m=+1.174228535 container died 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2 (image=quay.io/linux-system-roles/mysql:5.6, name=quadlet-demo-mysql, PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service)
Feb 15 11:44:21 managed-node1 systemd[1]: 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.timer: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.timer has successfully entered the 'dead' state.
Feb 15 11:44:21 managed-node1 systemd[1]: Stopped 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.timer - [systemd-run] /usr/bin/podman healthcheck run 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2.
░░ Subject: A stop job for unit 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.timer has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-6a1e826a610e9a27.timer has finished.
░░ 
░░ The job identifier is 8598 and the job result is done.
Feb 15 11:44:21 managed-node1 kernel: podman2: port 1(veth1) entered disabled state
Feb 15 11:44:21 managed-node1 kernel: veth1 (unregistering): left allmulticast mode
Feb 15 11:44:21 managed-node1 kernel: veth1 (unregistering): left promiscuous mode
Feb 15 11:44:21 managed-node1 kernel: podman2: port 1(veth1) entered disabled state
Feb 15 11:44:21 managed-node1 systemd[1]: run-p68595-i68895.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit run-p68595-i68895.scope has successfully entered the 'dead' state.
Feb 15 11:44:21 managed-node1 NetworkManager[733]:   [1739637861.2008] device (podman2): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 15 11:44:21 managed-node1 systemd[1]: Starting NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service...
░░ Subject: A start job for unit NetworkManager-dispatcher.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit NetworkManager-dispatcher.service has begun execution.
░░ 
░░ The job identifier is 8600.
Feb 15 11:44:21 managed-node1 systemd[1]: Started NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service.
░░ Subject: A start job for unit NetworkManager-dispatcher.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit NetworkManager-dispatcher.service has finished successfully.
░░ 
░░ The job identifier is 8600.
Feb 15 11:44:21 managed-node1 systemd[1]: run-netns-netns\x2d62e44cf3\x2d48a4\x2daeaa\x2df87f\x2dd232a8a5ffed.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit run-netns-netns\x2d62e44cf3\x2d48a4\x2daeaa\x2df87f\x2dd232a8a5ffed.mount has successfully entered the 'dead' state.
Feb 15 11:44:21 managed-node1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-userdata-shm.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit var-lib-containers-storage-overlay\x2dcontainers-3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2-userdata-shm.mount has successfully entered the 'dead' state.
Feb 15 11:44:21 managed-node1 systemd[1]: var-lib-containers-storage-overlay-0b23a57f41ef8e040143c03a17f1414d2cb2e7456a5dec261b80613000ca1e6b-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit var-lib-containers-storage-overlay-0b23a57f41ef8e040143c03a17f1414d2cb2e7456a5dec261b80613000ca1e6b-merged.mount has successfully entered the 'dead' state.
Feb 15 11:44:21 managed-node1 podman[82070]: 2025-02-15 11:44:21.297860569 -0500 EST m=+1.333413813 container remove 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2 (image=quay.io/linux-system-roles/mysql:5.6, name=quadlet-demo-mysql, PODMAN_SYSTEMD_UNIT=quadlet-demo-mysql.service)
Feb 15 11:44:21 managed-node1 quadlet-demo-mysql[82070]: 3aa715549e276fe24eb41aa89bc30b540eac890aee56d61ffaf5f101952cefb2
Feb 15 11:44:21 managed-node1 systemd[1]: quadlet-demo-mysql.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit quadlet-demo-mysql.service has successfully entered the 'dead' state.
Feb 15 11:44:21 managed-node1 systemd[1]: Stopped quadlet-demo-mysql.service.
░░ Subject: A stop job for unit quadlet-demo-mysql.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit quadlet-demo-mysql.service has finished.
░░ 
░░ The job identifier is 8597 and the job result is done.
Feb 15 11:44:21 managed-node1 systemd[1]: quadlet-demo-mysql.service: Consumed 2.841s CPU time, 611.8M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit quadlet-demo-mysql.service completed and consumed the indicated resources.
Feb 15 11:44:21 managed-node1 python3.12[82248]: ansible-stat Invoked with path=/etc/containers/systemd/quadlet-demo-mysql.container follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:44:22 managed-node1 python3.12[82512]: ansible-file Invoked with path=/etc/containers/systemd/quadlet-demo-mysql.container state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:44:23 managed-node1 python3.12[82644]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:44:23 managed-node1 systemd[1]: Reload requested from client PID 82645 ('systemctl') (unit session-8.scope)...
Feb 15 11:44:23 managed-node1 systemd[1]: Reloading...
Feb 15 11:44:23 managed-node1 systemd-rc-local-generator[82686]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:44:23 managed-node1 systemd[1]: Reloading finished in 204 ms.
Feb 15 11:44:24 managed-node1 python3.12[82969]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune --all -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:24 managed-node1 podman[82970]: 2025-02-15 11:44:24.793085513 -0500 EST m=+0.242022814 image untag dd3b2a5dcb48ff61113592ed5ddd762581be4387c7bc552375a2159422aa6bf5 quay.io/linux-system-roles/mysql:5.6
Feb 15 11:44:24 managed-node1 podman[82970]: 2025-02-15 11:44:24.568066137 -0500 EST m=+0.017003531 image remove dd3b2a5dcb48ff61113592ed5ddd762581be4387c7bc552375a2159422aa6bf5 
Feb 15 11:44:25 managed-node1 python3.12[83108]: ansible-ansible.legacy.command Invoked with _raw_params=podman images -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:25 managed-node1 python3.12[83246]: ansible-ansible.legacy.command Invoked with _raw_params=podman volume ls -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:26 managed-node1 python3.12[83384]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --noheading _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:26 managed-node1 python3.12[83523]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:28 managed-node1 python3.12[83939]: ansible-service_facts Invoked
Feb 15 11:44:30 managed-node1 python3.12[84178]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:44:31 managed-node1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit NetworkManager-dispatcher.service has successfully entered the 'dead' state.
Feb 15 11:44:31 managed-node1 python3.12[84312]: ansible-systemd Invoked with name=quadlet-demo-mysql-volume.service scope=system state=stopped enabled=False force=True daemon_reload=False daemon_reexec=False no_block=False masked=None
Feb 15 11:44:32 managed-node1 systemd[1]: Reload requested from client PID 84315 ('systemctl') (unit session-8.scope)...
Feb 15 11:44:32 managed-node1 systemd[1]: Reloading...
Feb 15 11:44:32 managed-node1 systemd-rc-local-generator[84358]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:44:32 managed-node1 systemd[1]: Reloading finished in 205 ms.
Feb 15 11:44:32 managed-node1 systemd[1]: quadlet-demo-mysql-volume.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit quadlet-demo-mysql-volume.service has successfully entered the 'dead' state.
Feb 15 11:44:32 managed-node1 systemd[1]: Stopped quadlet-demo-mysql-volume.service.
░░ Subject: A stop job for unit quadlet-demo-mysql-volume.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit quadlet-demo-mysql-volume.service has finished.
░░ 
░░ The job identifier is 8679 and the job result is done.
Feb 15 11:44:32 managed-node1 python3.12[84501]: ansible-stat Invoked with path=/etc/containers/systemd/quadlet-demo-mysql.volume follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:44:33 managed-node1 python3.12[84765]: ansible-file Invoked with path=/etc/containers/systemd/quadlet-demo-mysql.volume state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:44:34 managed-node1 python3.12[84896]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:44:34 managed-node1 systemd[1]: Reload requested from client PID 84897 ('systemctl') (unit session-8.scope)...
Feb 15 11:44:34 managed-node1 systemd[1]: Reloading...
Feb 15 11:44:34 managed-node1 systemd-rc-local-generator[84935]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:44:34 managed-node1 systemd[1]: Reloading finished in 203 ms.
Feb 15 11:44:35 managed-node1 podman[85082]: 2025-02-15 11:44:35.20195288 -0500 EST m=+0.027739564 volume remove systemd-quadlet-demo-mysql
Feb 15 11:44:35 managed-node1 python3.12[85221]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune --all -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:36 managed-node1 python3.12[85360]: ansible-ansible.legacy.command Invoked with _raw_params=podman images -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:36 managed-node1 python3.12[85499]: ansible-ansible.legacy.command Invoked with _raw_params=podman volume ls -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:37 managed-node1 python3.12[85638]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --noheading _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:37 managed-node1 python3.12[85776]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:39 managed-node1 python3.12[86190]: ansible-service_facts Invoked
Feb 15 11:44:42 managed-node1 python3.12[86430]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:44:44 managed-node1 python3.12[86563]: ansible-systemd Invoked with name=quadlet-demo-network.service scope=system state=stopped enabled=False force=True daemon_reload=False daemon_reexec=False no_block=False masked=None
Feb 15 11:44:44 managed-node1 systemd[1]: Reload requested from client PID 86566 ('systemctl') (unit session-8.scope)...
Feb 15 11:44:44 managed-node1 systemd[1]: Reloading...
Feb 15 11:44:44 managed-node1 systemd-rc-local-generator[86610]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:44:44 managed-node1 systemd[1]: Reloading finished in 203 ms.
Feb 15 11:44:45 managed-node1 systemd[1]: quadlet-demo-network.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit quadlet-demo-network.service has successfully entered the 'dead' state.
Feb 15 11:44:45 managed-node1 systemd[1]: Stopped quadlet-demo-network.service.
░░ Subject: A stop job for unit quadlet-demo-network.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A stop job for unit quadlet-demo-network.service has finished.
░░ 
░░ The job identifier is 8680 and the job result is done.
Feb 15 11:44:45 managed-node1 python3.12[86752]: ansible-stat Invoked with path=/etc/containers/systemd/quadlet-demo.network follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 15 11:44:46 managed-node1 python3.12[87016]: ansible-file Invoked with path=/etc/containers/systemd/quadlet-demo.network state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 15 11:44:46 managed-node1 python3.12[87147]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Feb 15 11:44:46 managed-node1 systemd[1]: Reload requested from client PID 87148 ('systemctl') (unit session-8.scope)...
Feb 15 11:44:46 managed-node1 systemd[1]: Reloading...
Feb 15 11:44:47 managed-node1 systemd-rc-local-generator[87186]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 15 11:44:47 managed-node1 systemd[1]: Reloading finished in 199 ms.
Feb 15 11:44:48 managed-node1 python3.12[87470]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune --all -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:48 managed-node1 python3.12[87609]: ansible-ansible.legacy.command Invoked with _raw_params=podman images -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:49 managed-node1 python3.12[87748]: ansible-ansible.legacy.command Invoked with _raw_params=podman volume ls -n _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:49 managed-node1 python3.12[87886]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --noheading _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:50 managed-node1 python3.12[88025]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:51 managed-node1 python3.12[88440]: ansible-service_facts Invoked
Feb 15 11:44:54 managed-node1 python3.12[88680]: ansible-ansible.legacy.command Invoked with _raw_params=exec 1>&2
                                                 set -x
                                                 set -o pipefail
                                                 systemctl list-units --plain -l --all | grep quadlet || :
                                                 systemctl list-unit-files --all | grep quadlet || :
                                                 systemctl list-units --plain --failed -l --all | grep quadlet || :
                                                  _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 15 11:44:54 managed-node1 python3.12[88818]: ansible-ansible.legacy.command Invoked with _raw_params=journalctl -ex _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
PLAY RECAP *********************************************************************
managed-node1              : ok=411  changed=44   unreachable=0    failed=1    skipped=443  rescued=1    ignored=0   
TASKS RECAP ********************************************************************
Saturday 15 February 2025  11:44:55 -0500 (0:00:00.448)       0:02:52.961 ***** 
=============================================================================== 
fedora.linux_system_roles.podman : Ensure container images are present -- 17.88s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18 
fedora.linux_system_roles.podman : Ensure container images are present --- 6.40s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18 
fedora.linux_system_roles.podman : For testing and debugging - services --- 3.13s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197 
fedora.linux_system_roles.podman : For testing and debugging - services --- 2.23s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197 
fedora.linux_system_roles.podman : Stop and disable service ------------- 2.22s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12 
fedora.linux_system_roles.podman : For testing and debugging - services --- 2.12s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197 
fedora.linux_system_roles.podman : For testing and debugging - services --- 2.09s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197 
fedora.linux_system_roles.podman : For testing and debugging - services --- 2.07s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197 
fedora.linux_system_roles.podman : For testing and debugging - services --- 2.07s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:197 
Gathering Facts --------------------------------------------------------- 1.52s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:9 
Check web --------------------------------------------------------------- 1.44s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:121 
fedora.linux_system_roles.podman : Stop and disable service ------------- 1.42s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12 
fedora.linux_system_roles.certificate : Slurp the contents of the files --- 1.35s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:152 
fedora.linux_system_roles.podman : Remove volumes ----------------------- 1.25s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:99 
fedora.linux_system_roles.firewall : Configure firewall ----------------- 1.22s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:71 
fedora.linux_system_roles.podman : Prune images no longer in use -------- 1.18s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:120 
fedora.linux_system_roles.podman : Start service ------------------------ 1.16s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110 
fedora.linux_system_roles.certificate : Ensure certificate role dependencies are installed --- 1.16s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:5 
fedora.linux_system_roles.podman : Gather the package facts ------------- 1.12s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 
fedora.linux_system_roles.certificate : Remove files -------------------- 1.10s
/tmp/collections-WJe/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:181