
groundcover Expands AI Observability to Support Agentic Workflows in Google Cloud

AI-native companies can now deploy the first observability platform designed for LLM scale directly to their Google Cloud environments

SAN FRANCISCO--(BUSINESS WIRE)--groundcover, the leading bring-your-own-cloud (BYOC) observability platform, today announced a major expansion of its AI Observability capability, adding native support for agentic AI systems fully compatible with Google Vertex AI. The update is automatically available to all groundcover customers at no additional cost and allows users to trace every LLM interaction. With this release, engineering and platform teams can extend observability to production environments as quickly as language model services are incorporated into modern applications.

As organizations rapidly integrate LLMs into production systems, they’re encountering a new kind of visibility gap. Traditional observability tools were designed for deterministic software, not systems where dynamic prompts drive outputs. As a result, teams often struggle to understand how AI-powered features behave in real-world environments, including what inputs are driving outcomes, how responses vary, and how usage impacts cost. This lack of visibility makes it difficult to ensure reliability, optimize performance, and confidently scale AI-driven applications. Addressing this challenge requires a fundamentally different approach to observability, one that captures the full context of LLM interactions and traces how outputs are generated across increasingly complex, multi-step systems.

Since launching LLM Observability in August 2025, groundcover has been running in production AI environments across its customer base, capturing LLM interactions automatically via its patented eBPF sensor with no instrumentation required and all data remaining inside the customer’s cloud. This release extends that foundation to address what production deployments revealed as the next unsolved problem: visibility into multi-step agentic systems.

“Our customers made it clear that their LLM calls have been invisible to the teams that manage the observability of their production systems,” said Orr Benjamin, VP of Product at groundcover. “They’ve been searching for a way to systematically understand their LLM calls by prompts, responses, and cost. They deployed groundcover for its traditional observability features, and we built AI Observability as a direct response to their demands for scale and mission-critical workload monitoring.”

What’s new

  • Agent trace visibility: groundcover now surfaces complete agent execution traces — every model call, every tool invocation with its arguments and results, and the reasoning path connecting them. Configurable focus levels let engineers work at the right altitude, from provider-level aggregates down to individual span detail.
  • Accurate cost attribution including prompt caching: Token costs are tracked at the span level and account for the pricing complexities of modern LLM APIs, correctly distinguishing between regular input tokens, cache creation tokens, and cache read tokens. Teams can see what individual agent runs and sessions actually cost.
  • Google Vertex AI support: groundcover's automatic capture now extends to teams building on Google Cloud's managed AI infrastructure, with all observability data remaining inside the customer's own environment, and zero instrumentation.
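To illustrate the cost-attribution idea above, the following is a minimal sketch of per-span token pricing that distinguishes regular input, cache creation, cache read, and output tokens. The field names and per-million-token prices are hypothetical examples for illustration only, not groundcover's actual schema or any provider's real price list.

```python
# Hypothetical per-million-token prices (USD) for one model.
# These numbers are illustrative assumptions, not real provider pricing.
PRICES = {
    "input": 3.00,           # regular input tokens
    "cache_creation": 3.75,  # tokens written to the prompt cache
    "cache_read": 0.30,      # tokens served from the prompt cache
    "output": 15.00,         # generated tokens
}

def span_cost(usage: dict, prices: dict = PRICES) -> float:
    """Cost of a single span, given token counts per token class."""
    return sum(
        usage.get(kind, 0) * rate / 1_000_000
        for kind, rate in prices.items()
    )

def run_cost(spans: list) -> float:
    """Cost of a whole agent run: the sum of its spans."""
    return sum(span_cost(s) for s in spans)

# Example: a span that read 50k tokens from cache costs far less
# than one that sent the same 50k tokens as regular input.
cached = span_cost({"input": 10_000, "cache_creation": 2_000,
                    "cache_read": 50_000, "output": 1_000})
uncached = span_cost({"input": 60_000, "output": 1_000})
```

The point of tracking the classes separately is visible in the example: under these assumed prices, serving most of a prompt from cache makes the span several times cheaper than resending it as regular input.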

AI Observability is now generally available and automatically deployed to all customers, timed with Google Cloud Next, April 22-24 at Mandalay Bay in Las Vegas, NV. With this release, groundcover is also fully compatible with Google Vertex AI on Google Cloud. Schedule a meeting with our team or stop by booth #5301.

“We have years of experience helping customers with meaningful transformations and modernizations on Google Cloud, and this release from groundcover is particularly exciting,” said Guilhem Tesseyre, CTO and co-founder of Zencore. “Customers can start with the AI Observability data automatically gathered by the groundcover eBPF sensor, and the OTel-native aspect of the platform means any strategic changes they need to their observability are simple to design.”

With more than 200 customers deploying the groundcover platform in their own clouds, the BYOC leader's strong traction positions it to accelerate adoption of the AI Observability capabilities in this release, and signals a rapid shift toward observability designed for the scale of AI and agentic systems.

About groundcover

groundcover is a cloud native observability platform powered by eBPF. It runs inside the customer’s cloud and provides complete visibility into applications, infrastructure, networks and AI systems without operational overhead. The platform offers unlimited data coverage at a fraction of the cost of legacy observability tools. Learn more at https://www.groundcover.com.

Contacts

Media Contact
Chris Churilo, VP of Marketing
chris.churilo@groundcover.com

groundcover


