FAQs

Learn more about Prove AI’s capabilities for managing AI models and agents, its underlying technology and compatibility.

GENERAL

What is Prove AI?

Prove AI is an open access tool that helps AI engineers improve the quality of GenAI solutions through case management, guided troubleshooting and automation (reinforcement learning). Our dashboard is purpose-built for GenAI and improves the quality of responses, which leads to faster resolution times for issues.

What does Prove AI v0.1 do?

Prove AI version 0.1 is an observability control plane for GenAI systems. Specifically, it allows you to:

  • Collect GenAI-native metrics and traces (see the sketch after this list)
  • Organize them in a way that matches how LLM systems actually behave
  • Keep all of your data OTel-compliant and self-hosted
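
For illustration, here is a minimal sketch of emitting an OTel-compliant trace for one LLM call with the OpenTelemetry Python SDK. This is generic OTel code, not Prove AI's own SDK; the attribute names follow the experimental OTel GenAI semantic conventions, and the model name is a placeholder.

    from opentelemetry import trace
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

    # Export to the console for demonstration; in practice you would export
    # to your self-hosted collector.
    provider = TracerProvider()
    provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
    trace.set_tracer_provider(provider)
    tracer = trace.get_tracer("genai.pipeline")

    # One span per LLM call, annotated with GenAI-native attributes
    # (names follow the experimental OTel GenAI semantic conventions).
    with tracer.start_as_current_span("llm.completion") as span:
        span.set_attribute("gen_ai.request.model", "llama3")  # placeholder model
        span.set_attribute("gen_ai.usage.input_tokens", 512)
        span.set_attribute("gen_ai.usage.output_tokens", 128)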

What value does Prove AI deliver to AI engineers?

Prove AI provides the foundation for faster and more transparent GenAI issue resolution, freeing up time for actual development. Among the features Prove AI currently delivers:

  • Faster time to production for the new metrics that engineering teams need to track to prove ROI on their investment in AI models and agents;
  • Managed setup of OpenTelemetry (OTel) and Prometheus, resulting in lower ongoing labor costs for managing the hosts, software upgrades and interoperability;
  • Support for 80+ (and counting) GenAI performance metrics, including end-to-end request latency, time to first token, queue vs. inference time, token throughput and cache behavior (a recording sketch follows).
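
As an example of what recording one of these metrics can look like, here is a minimal sketch that records time to first token with the OpenTelemetry metrics API. The histogram name follows the experimental OTel GenAI semantic conventions; Prove AI's own metric catalog may name things differently.

    import time

    from opentelemetry import metrics
    from opentelemetry.sdk.metrics import MeterProvider
    from opentelemetry.sdk.metrics.export import (
        ConsoleMetricExporter,
        PeriodicExportingMetricReader,
    )

    reader = PeriodicExportingMetricReader(ConsoleMetricExporter())
    metrics.set_meter_provider(MeterProvider(metric_readers=[reader]))
    meter = metrics.get_meter("genai.latency")

    # Histogram name taken from the experimental OTel GenAI semantic conventions.
    ttft = meter.create_histogram("gen_ai.server.time_to_first_token", unit="s")

    start = time.monotonic()
    # ... issue the LLM request and wait for the first streamed token ...
    first_token_at = time.monotonic()
    ttft.record(first_token_at - start, {"gen_ai.request.model": "llama3"})  # placeholder model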

TECHNOLOGY, HOSTING & COMPATIBILITY

Which technologies is Prove AI built on?

Prove AI is built on top of the open source OpenTelemetry and Prometheus projects.

Is Prove AI open source?

No, Prove AI is proprietary software, but its event collection, storage and audit logs are built on open source software.

Does Prove AI host any data from my GenAI solutions?

No – with Prove AI, you always self-host your data. Prove AI will never be able to read or access your AI data assets. Specifically, Prove AI collects events from your existing GenAI pipeline; you only need to establish a quality metric and implement a few data collection points to get a case/session view of issues, an end-to-end trace and a way to act on them (a minimal sketch follows).
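
As a rough illustration of what "a quality metric plus a few data collection points" can mean in practice, the sketch below tags each pipeline run with a session ID and a quality score as span attributes. The session.id attribute comes from OTel's semantic conventions; app.quality.score, grade() and run_pipeline() are hypothetical stand-ins for your own metric and pipeline.

    from opentelemetry import trace

    tracer = trace.get_tracer("app.cases")  # provider setup as in the earlier sketch

    def run_pipeline(question: str) -> str:
        return "stub answer"  # stand-in for your existing GenAI pipeline

    def grade(answer: str) -> float:
        return 1.0 if answer else 0.0  # hypothetical quality metric; use your own evaluator

    def answer_question(session_id: str, question: str) -> str:
        with tracer.start_as_current_span("rag.answer") as span:
            span.set_attribute("session.id", session_id)            # groups events into a case/session view
            answer = run_pipeline(question)
            span.set_attribute("app.quality.score", grade(answer))  # hypothetical attribute name
            return answer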

Which large language model runners is Prove AI compatible with?

Prove AI v0.1 is compatible with both vLLM and Ollama. Support for additional model runners will be added in upcoming releases.
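
For reference, vLLM's OpenAI-compatible server exposes Prometheus-format metrics at its /metrics endpoint, which is one way runner-level data can be collected. The sketch below reads that endpoint directly; the host and port are deployment-specific assumptions, and this is generic tooling, not Prove AI's own collection path.

    import requests
    from prometheus_client.parser import text_string_to_metric_families

    # The host/port are deployment-specific assumptions.
    body = requests.get("http://localhost:8000/metrics", timeout=5).text

    # vLLM prefixes its metric names with "vllm".
    for family in text_string_to_metric_families(body):
        if family.name.startswith("vllm"):
            for sample in family.samples:
                print(sample.name, sample.labels, sample.value)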

How about tooling dependencies?

Prove AI was built with flexibility in mind – there are no specific tooling dependencies. You can use your preferred GenAI tooling without restriction.

Can I run Prove AI on my own infrastructure?

Yes. Prove AI v0.1 runs as an Azure managed image or an Amazon Machine Image (AMI), on your choice of instance.
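
For example, on AWS you might launch the image with boto3. The AMI ID and instance type below are placeholders; substitute the Prove AI AMI you are given and an instance sized for your workload.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")  # your region

    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder: use the Prove AI AMI ID
        InstanceType="m6i.xlarge",        # placeholder: size for your workload
        MinCount=1,
        MaxCount=1,
    )
    print(resp["Instances"][0]["InstanceId"])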

What if I don’t already have infrastructure in place? Can Prove AI help me set it up?

Prove AI can deploy a small number of the most popular free, open source packages for evaluations, vector data and pipeline orchestration. However, you will have to create your own OpenAI, Anthropic or local (Llama) instances.

Which ticketing systems do you integrate with?

GitHub and Jira.

Does Prove AI run in pilot / demo environments?

No – Prove AI is built specifically for production environments. Our software assumes that you have a running GenAI model on Bedrock, Azure, OpenAI, Anthropic, or local hardware.

Can I customize or access where the telemetry data is stored?

Yes. Prove AI uses Prometheus for data storage, and it can be configured to push stored telemetry events to a self-managed Prometheus instance or to send a copy of all event data to an externally managed instance.
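
Once events land in your self-managed Prometheus instance, you can read them back with Prometheus's standard HTTP query API. A minimal sketch, assuming a hypothetical hostname and metric name:

    import requests

    # Hostname and metric name are hypothetical; substitute your own.
    resp = requests.get(
        "http://prometheus.internal:9090/api/v1/query",
        params={"query": "gen_ai_requests_total"},
        timeout=5,
    )
    for result in resp.json()["data"]["result"]:
        print(result["metric"], result["value"])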

If I already have an observability and telemetry platform, can Prove AI work on top of it?

Yes. Prove AI is built on OpenTelemetry (OTel). If you already have an OTel instance you wish to use as your telemetry framework, Prove AI can attach to your instance and use it in place of the one bundled with the product. For observability products built on OTel, we can collect information from their event stream or collect select events using the provided API/SDK examples.
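
Attaching a Python service to an existing collector is standard OTel configuration. A minimal sketch, assuming a hypothetical collector endpoint and the OTLP gRPC exporter:

    from opentelemetry import trace
    from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor

    # The endpoint is a deployment-specific assumption; point it at your collector.
    exporter = OTLPSpanExporter(endpoint="http://otel-collector.internal:4317", insecure=True)
    provider = TracerProvider()
    provider.add_span_processor(BatchSpanProcessor(exporter))
    trace.set_tracer_provider(provider)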

Can I choose where to write the audit log?

Yes. We provide audit logging to your choice of SQL or ledger databases. Additionally, we support audit logging to the Hedera decentralized ledger.
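
To make the SQL option concrete, here is a minimal sketch of an append-only audit table using Python's built-in sqlite3. The table shape is an illustrative assumption, not Prove AI's actual schema.

    import json
    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect("audit.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS audit_log ("
        " ts TEXT NOT NULL, actor TEXT NOT NULL,"
        " action TEXT NOT NULL, detail TEXT)"
    )
    conn.execute(
        "INSERT INTO audit_log VALUES (?, ?, ?, ?)",
        (
            datetime.now(timezone.utc).isoformat(),
            "pipeline",                     # who acted (illustrative)
            "case.closed",                  # what happened (illustrative)
            json.dumps({"case_id": "42"}),  # structured detail
        ),
    )
    conn.commit()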

How does this help me with agentic AI?

You can collect events from multiple LLMs for a single issue/use case, and you can begin capturing intermediate states and inter-agent communication immediately using the provided code examples.
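
The key mechanism is trace-context propagation: each agent injects the current trace context into its outgoing messages so the next agent's spans join the same end-to-end trace. A minimal sketch with the OTel propagation API; the agent names and message shape are hypothetical.

    from opentelemetry import trace
    from opentelemetry.propagate import extract, inject

    tracer = trace.get_tracer("agents")  # provider setup as in the earlier sketches

    def agent_a_send() -> dict:
        with tracer.start_as_current_span("agent_a.plan"):
            headers: dict = {}
            inject(headers)  # serialize the current trace context into the message
            return {"task": "summarize", "headers": headers}

    def agent_b_receive(message: dict) -> None:
        ctx = extract(message["headers"])  # rejoin the sender's trace
        with tracer.start_as_current_span("agent_b.execute", context=ctx):
            pass  # the second agent's work, recorded in the same end-to-end trace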

Is this only for RAG GenAI or does it also help for fine-tuned or non-textual models?

Prove AI helps with any type of AI application. It observes and helps with the deployment of RAG, fine-tuned and non-textual models.

Does this only help with GenAI or does it also help with standard AI/ML models too?

Prove AI helps with any type of AI application, including standard AI/ML models.

BUILDING WITH PROVE AI

How can I get in touch?

Reach us here to schedule a technical walkthrough.