Deeployed Peer

Dashboard Analytics

Understand your code review velocity, AI adoption, and team productivity through detailed dashboard metrics.

The Deeployed Peer Dashboard provides a multi-layered view of your development process, combining AI performance metrics with general team productivity insights.

Real-time Suggestions

Track how developers interact with Peer's real-time assistance. Every suggestion provided by Peer moves through a specific lifecycle based on reviewer and author actions; a code sketch of these states follows the list below.

  • Pending: The reviewer has appended the suggestion to their review but it hasn't been committed yet.
  • Committed: Success! The author has accepted and committed the suggestion directly into the codebase.
  • Deleted: A suggestion is marked as deleted if the reviewer ignores it (does not append).
  • Rejected: The author has explicitly dismissed the suggestion, or the PR was closed without the suggestion being committed.
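
To make the lifecycle concrete, here is a minimal Python sketch. The four state names come from the list above; the `SuggestionState` enum, `TRANSITIONS` table, and `can_transition` helper are illustrative assumptions rather than Peer's internal implementation.

```python
from enum import Enum

class SuggestionState(Enum):
    """Lifecycle states for a Peer suggestion, as described above."""
    PENDING = "pending"      # appended to a review, not yet committed
    COMMITTED = "committed"  # accepted by the author and committed
    DELETED = "deleted"      # ignored by the reviewer (never appended)
    REJECTED = "rejected"    # dismissed, or PR closed without committing

# Hypothetical transition table: a pending suggestion resolves to committed
# or rejected; deleted suggestions were never appended, so all three
# outcomes are terminal.
TRANSITIONS = {
    SuggestionState.PENDING: {SuggestionState.COMMITTED, SuggestionState.REJECTED},
    SuggestionState.COMMITTED: set(),
    SuggestionState.REJECTED: set(),
    SuggestionState.DELETED: set(),
}

def can_transition(current: SuggestionState, target: SuggestionState) -> bool:
    """Return True if a suggestion may move from `current` to `target`."""
    return target in TRANSITIONS[current]
```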

PR & Approval Analytics

Monitor how often your team uses Peer's core automation features to streamline their workflow; the rates below are sketched in code after the list.

  • PR Descriptions: Tracks the number of AI-generated summaries that were used versus those that were dismissed.
  • Reviews Generated: The total volume of automated technical reviews performed by Peer across your organization.
  • Conditional Approval: Displays the success rate of automated merge conditions, helping you identify frequently failing quality gates.
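
As a rough illustration of how these figures reduce to rates, consider the sketch below; `usage_rate` and `approval_success_rate` are hypothetical helper names, not part of any Peer API.

```python
def usage_rate(used: int, dismissed: int) -> float:
    """Share of AI-generated PR descriptions that were kept."""
    total = used + dismissed
    return used / total if total else 0.0

def approval_success_rate(passed: int, failed: int) -> float:
    """Share of conditional-approval runs whose merge conditions passed;
    a low value points at a frequently failing quality gate."""
    total = passed + failed
    return passed / total if total else 0.0

# Example: 42 descriptions used, 8 dismissed -> an 84% usage rate.
print(f"{usage_rate(42, 8):.0%}")  # 84%
```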

Token Analytics

Token analytics help you understand the infrastructure cost and scale of your AI operations; a cost-estimation sketch follows the list.

  • Input Tokens: The volume of code context sent to the AI for analysis.
  • Output Tokens: The volume of suggestions, summaries, and comments generated by the AI.
  • Usage by PR Length: Correlates token consumption with the size of your code changes, allowing for better budget forecasting.
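
For budget forecasting, token counts can be turned into cost estimates along these lines. The per-token prices and the `PrTokenUsage` structure below are placeholder assumptions, not a Peer export format; substitute your provider's actual rates.

```python
from dataclasses import dataclass

# Placeholder prices per 1,000 tokens; use your provider's real rates.
INPUT_PRICE_PER_1K = 0.003
OUTPUT_PRICE_PER_1K = 0.015

@dataclass
class PrTokenUsage:
    pr_lines_changed: int  # size of the code change
    input_tokens: int      # code context sent to the AI
    output_tokens: int     # suggestions, summaries, and comments returned

    def cost(self) -> float:
        """Estimated spend for analyzing this PR."""
        return (self.input_tokens / 1000 * INPUT_PRICE_PER_1K
                + self.output_tokens / 1000 * OUTPUT_PRICE_PER_1K)

    def tokens_per_line(self) -> float:
        """Consumption normalized by PR size, for usage-by-PR-length views."""
        total = self.input_tokens + self.output_tokens
        return total / self.pr_lines_changed if self.pr_lines_changed else 0.0
```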

General Insights (Non-Tool Analytics)

Located on the Insights page, these metrics track your team's overall code review health, regardless of whether Peer was used. This helps you identify systemic bottlenecks; a sketch of these calculations follows the list.

  • Velocity: Tracking PRs Opened vs. Merged to monitor review throughput.
  • Response Time: The average time it takes for a human or bot to provide the first piece of feedback on a change.
  • Peer Collaboration: The average number of developers reviewing each PR and the distribution of comments across PRs.
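
A minimal sketch, assuming you have raw PR timestamps, of how the velocity and response-time figures could be computed; `merge_ratio` and `avg_first_response` are illustrative helpers, and the dashboard's own aggregation may differ.

```python
from datetime import datetime, timedelta
from statistics import mean

def merge_ratio(prs_opened: int, prs_merged: int) -> float:
    """PRs merged per PR opened over a period: a simple velocity signal."""
    return prs_merged / prs_opened if prs_opened else 0.0

def avg_first_response(prs: list[tuple[datetime, datetime]]) -> timedelta:
    """Mean time from PR opened to first feedback (human or bot).

    Each tuple holds (opened_at, first_feedback_at) for one PR.
    """
    waits = [(feedback - opened).total_seconds() for opened, feedback in prs]
    return timedelta(seconds=mean(waits))
```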

By comparing Peer usage with these general insights, you can measure the ROI of AI assistance on your team's average turnaround time and code quality.
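
For instance, one simple ROI signal is the gap in mean turnaround between Peer-assisted and unassisted PRs. The `turnaround_delta` helper below is an illustrative sketch, not a built-in dashboard metric.

```python
from statistics import mean

def turnaround_delta(assisted_hours: list[float],
                     unassisted_hours: list[float]) -> float:
    """Difference in mean PR turnaround (hours) with vs. without Peer.

    A negative result means Peer-assisted PRs closed faster on average.
    """
    return mean(assisted_hours) - mean(unassisted_hours)

# Example: assisted PRs average 10h, unassisted 16h -> -6.0 (6h faster).
print(turnaround_delta([8, 12, 10], [14, 18, 16]))
```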

Note: All dashboard data can be filtered by Organization, Repository, or Developer to get granular insights into specific team performance.
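
Conceptually, that filtering behaves like the sketch below; `MetricRecord` and `filter_records` are hypothetical names used only for illustration.

```python
from dataclasses import dataclass

@dataclass
class MetricRecord:
    organization: str
    repository: str
    developer: str
    value: float

def filter_records(records: list[MetricRecord], *,
                   organization: str | None = None,
                   repository: str | None = None,
                   developer: str | None = None) -> list[MetricRecord]:
    """Keep only records that match every filter actually supplied."""
    return [r for r in records
            if (organization is None or r.organization == organization)
            and (repository is None or r.repository == repository)
            and (developer is None or r.developer == developer)]
```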