Cursor Adoption

To integrate Cursor and start seeing the following metrics, see here.

Hivel's Cursor Adoption feature offers detailed insights into how well Cursor is being utilized within a team or across the whole organization, and how useful it is proving to be.


Overview

User Coverage Graph

  • This graph shows the trend of active and inactive users over different time periods (weekly, monthly, quarterly).

  • It provides a high-level overview of Cursor adoption across teams and helps identify periods of higher or lower engagement.

Suggestions Acceptance Rate

  • This metric shows how useful users find Cursor’s suggestions. A higher acceptance rate indicates that users are finding the suggestions useful enough to directly accept them.
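
Conceptually, the acceptance rate is just accepted suggestions divided by suggestions shown. The short Python sketch below is a rough illustration only; the function and field names are assumptions, not Hivel's actual calculation or data model.

    # Hypothetical illustration of a suggestions acceptance rate.
    def acceptance_rate(accepted: int, shown: int) -> float:
        """Accepted suggestions as a percentage of suggestions shown."""
        if shown == 0:
            return 0.0
        return 100.0 * accepted / shown

    # Example: 420 of 600 suggestions accepted -> 70.0
    print(acceptance_rate(accepted=420, shown=600))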

AI vs Manual Code

  • This graph helps you understand the volume of AI-generated code versus manually written code in your codebase.

  • Even if the suggestions acceptance rate is high, this graph reveals whether users are integrating Cursor deeply into their workflow or just using it occasionally.

  • A lower percentage of AI-generated code indicates that users might be using Cursor as an afterthought rather than as a central part of their workflow.
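
As a rough sketch of the underlying arithmetic (the line counts and names below are illustrative, not Hivel's actual schema), the AI-generated share is simply AI-written lines over total lines:

    # Hypothetical illustration: share of AI-generated code in a codebase.
    def ai_share(ai_lines: int, manual_lines: int) -> float:
        """Percentage of lines that were AI-generated."""
        total = ai_lines + manual_lines
        return 100.0 * ai_lines / total if total else 0.0

    # Example: 1,200 AI-generated lines out of 4,000 total -> 30.0
    print(ai_share(ai_lines=1_200, manual_lines=2_800))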

AI Models and Extensions

  • These charts show:

    • AI Models: Usage across different AI models in Cursor.

    • Extensions: AI-generated code broken down by programming language.

  • Note: Both charts exclude tab completions, focusing only on core usage.

A few high-level usage metrics

Total Paid Seats:

  • This represents the number of licensed users in the current subscription cycle.

  • It is not tied to specific dates or teams; it simply reflects the total number of users on the subscription.

Active Users:

  • Active users are those who have engaged with Cursor during the selected time frame. Activities include logging in, accepting suggestions, writing code, or making requests.

Inactive Users:

  • Inactive users are those who have not performed any actions within the selected time frame.

The Active + Inactive count may differ from the Total Paid Seats because it is based on the selected timeframe and teams, which may not always match the total license count.
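
A minimal sketch of this partition, assuming each user record carries a last-activity timestamp (the data shape is illustrative, not Hivel's API):

    # Hypothetical sketch: splitting licensed users into active and inactive
    # for a selected time frame.
    from datetime import date

    users = [
        {"name": "alice", "last_activity": date(2024, 6, 3)},
        {"name": "bob", "last_activity": None},  # no Cursor activity recorded
    ]

    start, end = date(2024, 6, 1), date(2024, 6, 30)

    active = [u for u in users
              if u["last_activity"] and start <= u["last_activity"] <= end]
    inactive = [u for u in users if u not in active]

    print(len(active), len(inactive))  # 1 active, 1 inactive in this window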

Total Bugbot Calls:

  • This counts the number of times Bugbot (Cursor's code review tool) has been used during the selected time frame.


AI Usage Analysis

The AI Usage Analysis tab breaks down individual user activity, showing:

  • The total lines of code each user has written.

  • The percentage of code that was AI-generated versus manually written.

Cohorts for AI Usage

  • High Usage (>=70%): These are the top users who rely heavily on AI to assist in coding.

  • Medium Usage (30-69%): These users make moderate use of AI in their coding.

  • Low Usage (<30%): These users rely on AI minimally.

This segmentation is helpful for identifying your AI "trailblazers" (those with high usage) who can share best practices and tips with others who have low usage. By fostering collaboration between cohorts, your organization can maximize the value gained from Cursor.
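
The thresholds above translate directly into a simple bucketing rule. Here is a minimal sketch in Python (the per-user percentages are illustrative, not real data):

    # Bucketing users into the AI-usage cohorts described above,
    # based on each user's share of AI-generated code.
    def cohort(ai_pct: float) -> str:
        if ai_pct >= 70:
            return "High Usage"
        if ai_pct >= 30:
            return "Medium Usage"
        return "Low Usage"

    usage = {"alice": 82.5, "bob": 44.0, "carol": 12.0}
    for user, pct in usage.items():
        print(f"{user}: {cohort(pct)}")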


By tracking these metrics and visualizations, you can gain valuable insights into how teams are using Cursor, who is benefiting most from its features, and where there may be opportunities to encourage greater adoption or improve workflow integration.
