Security auditing in the cloud often devolves into an exercise in “alert fatigue.” Traditional tools like Security Command Center or sprawling shell scripts produce massive CSV exports that are exhausting to parse and difficult to prioritize.

Enter the AI-driven approach. By using an agent like Gemini-CLI as an active “Security Co-pilot,” you can move away from static checklists toward an interactive, iterative discovery process. Gemini-CLI can ingest complex JSON outputs, understand IAM relationships contextually, and help you hunt down misconfigurations in real-time.

However, handing an AI the keys to your cloud infrastructure introduces new risks—specifically, the danger of AI “hallucinations” executing destructive commands. In this post, we’ll explore how to use Gemini-CLI safely to audit your GCP projects, leaning on GCP’s Resource Manager, Cloud Asset Inventory, and Privileged Access Manager (PAM) as guardrails.

1. Setting the Stage: Safety Guardrails with GCP PAM

The golden rule of AI infrastructure management: Never give your AI agent permanent Owner or Editor permissions.

If an LLM hallucinates a gcloud command or misunderstands a prompt, it could accidentally delete a critical production bucket or open a firewall to the world. To solve this, we separate the Discovery Phase from the Remediation Phase using Privileged Access Manager (PAM).

  1. The “Read-Only” Baseline: Configure your local environment (or the Service Account Gemini-CLI assumes) with minimal roles like Viewer and Security Reviewer. This allows the agent to run discovery commands without the ability to mutate state.
  2. Defining PAM Entitlements: Set up Just-In-Time (JIT) entitlements in GCP for roles like Storage Admin or Compute Admin. Require manual human approval for these elevations.
  3. The Workflow: When Gemini-CLI discovers a vulnerability, you ask it to propose a fix. You then request a 15-minute elevation via PAM, manually review the AI’s proposed gcloud command, and execute it only when elevated.

Prompt Example:

“I see a public bucket in the output. I don’t have permission to fix it. Propose the exact gcloud command to remove allUsers, and I will request a 15-minute Storage Admin elevation via PAM to execute it.”

2. Phase 1: High-Level Inventory (Resource & Asset Manager)

When dropping into a new GCP environment, you need a bird’s-eye view. Gemini-CLI excels at parsing the deeply nested outputs of GCP’s Resource Manager and Cloud Asset Inventory (CAI).

Start by mapping the hierarchy:

Prompt: “Use gcloud projects list and gcloud resource-manager folders list to map out our environment. Identify any projects that look like abandoned test environments.”
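Under the hood, a prompt like this has the agent filter the JSON that `gcloud projects list --format=json` emits. A minimal sketch of that filtering step, using a synthetic sample (the field names follow the Resource Manager schema, but the project data and keyword list are invented for illustration):

```python
import json

# Synthetic sample shaped like `gcloud projects list --format=json`;
# the projects themselves are invented.
projects_json = """
[
  {"projectId": "prod-payments", "name": "prod-payments", "lifecycleState": "ACTIVE"},
  {"projectId": "test-scratch-2021", "name": "test-scratch-2021", "lifecycleState": "ACTIVE"},
  {"projectId": "tmp-demo", "name": "tmp-demo", "lifecycleState": "ACTIVE"}
]
"""

# Heuristic keywords suggesting a throwaway environment (an assumption, not a standard).
SUSPECT_TOKENS = ("test", "tmp", "scratch", "demo", "sandbox", "poc")

def flag_abandoned_candidates(projects):
    """Return project IDs whose names suggest abandoned test environments."""
    flagged = []
    for p in projects:
        name = (p.get("name") or p.get("projectId", "")).lower()
        if any(tok in name for tok in SUSPECT_TOKENS):
            flagged.append(p["projectId"])
    return flagged

print(flag_abandoned_candidates(json.loads(projects_json)))
```

A name-based heuristic like this only surfaces candidates; confirming abandonment still requires checking billing and activity logs.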

Next, use CAI to hunt for “Shadow IT” or forgotten resources that expand your attack surface. CAI outputs huge JSON blocks that are painful for humans to read but trivial for Gemini to summarize.

Prompt: “Run gcloud asset search-all-resources --scope=projects/MY_PROJECT --format=json to find all compute instances. Summarize which ones have public IPs and have been running for more than a year without a restart.”
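In practice the agent would follow the CAI search with `gcloud compute instances list --format=json` to get per-instance detail. A sketch of the check it performs on that output, against a synthetic sample (field names match the Compute Engine schema; the instances are invented):

```python
import json
from datetime import datetime, timedelta, timezone

# Synthetic sample shaped like `gcloud compute instances list --format=json`;
# only the fields relevant to this check are included.
instances_json = """
[
  {"name": "legacy-worker",
   "lastStartTimestamp": "2023-01-10T08:00:00.000-08:00",
   "networkInterfaces": [{"accessConfigs": [{"natIP": "203.0.113.7"}]}]},
  {"name": "fresh-private",
   "lastStartTimestamp": "2025-06-01T08:00:00.000-08:00",
   "networkInterfaces": [{"accessConfigs": []}]}
]
"""

def stale_public_instances(instances, now=None):
    """Instances with a public IP whose last start was over a year ago."""
    now = now or datetime.now(timezone.utc)
    flagged = []
    for inst in instances:
        has_public_ip = any(
            ac.get("natIP")
            for nic in inst.get("networkInterfaces", [])
            for ac in nic.get("accessConfigs", [])
        )
        started = datetime.fromisoformat(inst["lastStartTimestamp"])
        if has_public_ip and now - started > timedelta(days=365):
            flagged.append(inst["name"])
    return flagged

print(stale_public_instances(json.loads(instances_json)))
```

An instance that has run for a year without a restart has also gone a year without a new boot image—often a proxy for unpatched software.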

3. Phase 2: Hardening the Data Layer (GCS Buckets)

Cloud Storage misconfigurations are among the most common causes of cloud data breaches. We can instruct Gemini-CLI to actively query for loose GCS permissions.

Hunting for “Public” Bindings:

Prompt: “Audit all GCS buckets in this project. Run the necessary commands to list their IAM policies. Flag any buckets that have bindings allowing allUsers or allAuthenticatedUsers. For each flagged bucket, tell me if the bucket name contains sensitive keywords like ‘backup’, ‘db’, or ‘creds’.”
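The check behind this prompt is a scan of each bucket’s IAM policy (as returned by `gcloud storage buckets get-iam-policy BUCKET --format=json`) for public principals. A sketch against synthetic policies (bucket names and bindings are invented; the keyword list is the one from the prompt, extended as an assumption):

```python
import json

# Synthetic policies keyed by bucket name, shaped like the output of
# `gcloud storage buckets get-iam-policy BUCKET --format=json`.
policies_json = """
{
  "acme-public-site": {"bindings": [
      {"role": "roles/storage.objectViewer", "members": ["allUsers"]}]},
  "acme-db-backup": {"bindings": [
      {"role": "roles/storage.objectViewer", "members": ["allAuthenticatedUsers"]}]},
  "acme-app-assets": {"bindings": [
      {"role": "roles/storage.objectAdmin",
       "members": ["serviceAccount:app@acme.iam.gserviceaccount.com"]}]}
}
"""

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}
SENSITIVE_KEYWORDS = ("backup", "db", "creds", "secret", "dump")

def audit_public_buckets(policies):
    """Map each publicly readable bucket to whether its name looks sensitive."""
    report = {}
    for bucket, policy in policies.items():
        public = any(
            m in PUBLIC_MEMBERS
            for b in policy.get("bindings", [])
            for m in b.get("members", [])
        )
        if public:
            report[bucket] = any(k in bucket.lower() for k in SENSITIVE_KEYWORDS)
    return report

print(audit_public_buckets(json.loads(policies_json)))
```

Note that `allAuthenticatedUsers` is nearly as dangerous as `allUsers`: it means any Google account on the internet, not any account in your organization.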

Checking Uniform Bucket-Level Access (UBLA): Legacy fine-grained ACLs are a security nightmare because they hide access grants outside of standard IAM policies.

Prompt: “List all buckets where ‘Uniform Bucket-Level Access’ is disabled. Explain the security risks of leaving this disabled for these specific buckets, and provide the command to enforce UBLA.”

Querying for “Loose” Permissions:

Prompt: “Find any buckets where roles/storage.objectAdmin is granted to entire Google Workspace domains (e.g., domain:example.com) rather than specific, least-privileged service accounts.”
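The underlying filter is small: walk the policy bindings and pull out any `domain:`-scoped member holding a write-capable role. A sketch over synthetic bindings (shaped like the `bindings` array from `gcloud storage buckets get-iam-policy`):

```python
import json

# Synthetic bindings in the shape of a bucket IAM policy's "bindings" array.
bindings = json.loads("""
[
  {"role": "roles/storage.objectAdmin", "members": ["domain:example.com"]},
  {"role": "roles/storage.objectAdmin",
   "members": ["serviceAccount:etl@proj.iam.gserviceaccount.com"]},
  {"role": "roles/storage.objectViewer", "members": ["domain:example.com"]}
]
""")

def domain_wide_grants(bindings, role="roles/storage.objectAdmin"):
    """Return domain-level members holding the given (write-capable) role."""
    return [
        m
        for b in bindings
        if b.get("role") == role
        for m in b.get("members", [])
        if m.startswith("domain:")
    ]

print(domain_wide_grants(bindings))
```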

4. Phase 3: Identity & Access (IAM) Discovery

GCP’s hierarchy-based IAM model is granular, but permission creep still happens. Use Gemini-CLI to cast a broad net and then drill down into specifics.

The “Broad Net” Query:

Prompt: “List all users or service accounts in this project with primitive ‘Owner’ or ‘Editor’ roles. Cross-reference this with recent activity logs to see if they actually need these broad permissions.”

Service Account Hygiene:

Prompt: “Find all Service Account keys created more than 90 days ago. Check Cloud Logging to see if these keys have been used in the last 30 days. If not, generate the gcloud commands to disable (not delete) them.”
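The key-age half of this check is mechanical: compare each key’s `validAfterTime` (its creation time, per the `gcloud iam service-accounts keys list --format=json` schema) against a cutoff. A sketch over a synthetic key list:

```python
import json
from datetime import datetime, timedelta, timezone

# Synthetic sample shaped like
# `gcloud iam service-accounts keys list --iam-account=SA_EMAIL --format=json`.
keys_json = """
[
  {"name": "projects/p/serviceAccounts/sa/keys/aaa",
   "validAfterTime": "2023-02-01T00:00:00Z"},
  {"name": "projects/p/serviceAccounts/sa/keys/bbb",
   "validAfterTime": "2025-10-01T00:00:00Z"}
]
"""

def keys_older_than(keys, days, now=None):
    """Return resource names of keys created more than `days` ago."""
    now = now or datetime.now(timezone.utc)
    old = []
    for k in keys:
        created = datetime.fromisoformat(k["validAfterTime"].replace("Z", "+00:00"))
        if now - created > timedelta(days=days):
            old.append(k["name"])
    return old

print(keys_older_than(json.loads(keys_json), days=90))
```

For each stale key that also shows no recent usage in Cloud Logging, the reversible fix is `gcloud iam service-accounts keys disable KEY_ID --iam-account=SA_EMAIL`—disabling, unlike deleting, can be undone if something breaks.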

5. Phase 4: Network & Perimeter Security

Finally, assess the network perimeter. Firewall rules and exposed endpoints are prime targets.

Firewall Hygiene:

Prompt: “Use gcloud compute firewall-rules list. Identify any rules that allow ingress from 0.0.0.0/0 on sensitive management ports like 22 (SSH), 3389 (RDP), or 5432 (Postgres). Suggest replacement rules using Identity-Aware Proxy (IAP) instead.”
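The filter Gemini-CLI applies here is straightforward to sketch: flag INGRESS rules whose `sourceRanges` include `0.0.0.0/0` and whose allowed ports touch a sensitive port. A sketch over a synthetic sample shaped like `gcloud compute firewall-rules list --format=json` (note that an `allowed` entry with no `ports` key means all ports for that protocol):

```python
import json

# Synthetic sample shaped like `gcloud compute firewall-rules list --format=json`.
rules_json = """
[
  {"name": "allow-ssh-anywhere", "direction": "INGRESS",
   "sourceRanges": ["0.0.0.0/0"],
   "allowed": [{"IPProtocol": "tcp", "ports": ["22"]}]},
  {"name": "allow-internal-db", "direction": "INGRESS",
   "sourceRanges": ["10.0.0.0/8"],
   "allowed": [{"IPProtocol": "tcp", "ports": ["5432"]}]},
  {"name": "allow-all-open", "direction": "INGRESS",
   "sourceRanges": ["0.0.0.0/0"],
   "allowed": [{"IPProtocol": "tcp"}]}
]
"""

SENSITIVE_PORTS = {"22", "3389", "5432"}

def risky_ingress_rules(rules):
    """INGRESS rules open to 0.0.0.0/0 on a sensitive port (or all ports)."""
    risky = []
    for r in rules:
        if r.get("direction") != "INGRESS":
            continue
        if "0.0.0.0/0" not in r.get("sourceRanges", []):
            continue
        for allow in r.get("allowed", []):
            ports = allow.get("ports")  # missing => all ports for the protocol
            if ports is None or SENSITIVE_PORTS & set(ports):
                risky.append(r["name"])
                break
    return risky

print(risky_ingress_rules(json.loads(rules_json)))
```

This sketch ignores port ranges like "1-1024"; a production check would expand those too.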

Public-Facing Services:

Prompt: “Audit all Cloud Run services. Which ones allow unauthenticated invocations? Are any of these internal-facing APIs that should be locked down behind a load balancer with IAP?”

6. Conclusion: Human-in-the-Loop Remediation

The true power of using Gemini-CLI for security auditing isn’t just in finding the problems—it’s in the speed of generating the fix.

Once the agent highlights an issue, you use the “Dry Run” Protocol. Have the AI output the exact gcloud or Terraform code to remediate the vulnerability. Review the code, use GCP PAM to temporarily elevate your privileges, and apply the fix.
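The “Dry Run” Protocol can itself be enforced in code rather than by discipline alone. A minimal sketch of a human-in-the-loop gate (the bucket name and confirmation flow are illustrative; `execute` defaults to off so every call is a dry run):

```python
import shlex
import subprocess

# Refuse anything the AI proposes that is not a gcloud invocation.
ALLOWED_PREFIXES = ("gcloud ",)

def review_and_run(proposed, approved=False, execute=False):
    """Print the AI-proposed command; run it only after explicit human approval."""
    if not proposed.startswith(ALLOWED_PREFIXES):
        return "rejected: not a gcloud command"
    print(f"PROPOSED: {proposed}")
    if not approved:
        return "dry-run: awaiting human approval"
    if not execute:
        return "approved (execution disabled in this sketch)"
    # Only reached when a human both approved and enabled execution,
    # ideally inside a short-lived PAM elevation window.
    subprocess.run(shlex.split(proposed), check=True)
    return "executed"

# A fix Gemini-CLI might propose for a public bucket (bucket name is hypothetical):
cmd = ("gcloud storage buckets remove-iam-policy-binding gs://acme-db-backup "
       "--member=allUsers --role=roles/storage.objectViewer")
print(review_and_run(cmd))                 # dry run: shows the command only
print(review_and_run(cmd, approved=True))  # approved, but execution still gated
```

Keeping the execution flag off by default means the safe path is the lazy path: nothing mutates state unless a human deliberately flips two switches, ideally inside a 15-minute PAM window.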

AI doesn’t replace security expertise, but it dramatically accelerates the “search and destroy” mission against infrastructure technical debt. By pairing the analytical speed of Gemini with the safety guardrails of PAM, you can keep your GCP environments lean, secure, and tightly controlled.