
How do you measure Developer Advocacy?

Measuring our work is important. Let’s face it: not every company gets the role of Developer Relations. I’m not going to belabor the point, but in lean times demonstrating your value to the organization has merit. At the same time, measuring what matters gives you room to experiment, improve, and scale your efforts.

As I've written, I ask Developer Advocates to focus on four areas. Each of the four areas - product feedback, content, community engagement, and events - can be measured. And, just as there are best practices for the tactics themselves, there are some guiding principles and best practices for how we measure each tactic.

Whenever I look to define metrics, I like to start by asking myself what success looks like in each of the four core areas:

  • Capture and represent customer feedback.

    • Developer Relations feedback is accepted and considered valid by the product team.

    • Developer Relations feedback is prioritized by the product team.

    • Developer Relations feedback is acted upon by the product team.

  • Build great content.

    • Developer Relations content reaches lots of new customers.

    • New customers touched by the content activate the product at a higher rate than the control.

    • New customers touched by the content consume more of or spend more money on the product than the control.

    • Existing customers return to our docs.

  • Connect with community leaders.

    • Grow first-time and repeat attendees to virtual meetups.

    • Community leaders welcome developer relations contributions to their open source projects.

    • Community leaders provide feedback on products or services.

  • Attend and speak at events.

    • For physical events, you can measure booth traffic and lead acquisition, as well as activity based on attendance at your talks.

    • For virtual events, apply classic TV metrics such as number of viewers, time spent watching, most popular section, and so on.

One of the things that attracted me to Developer Relations 20 years ago was that it let me combine the creative aspects of my personality (I was a drama major and a screenwriter when I was younger) with my more mathematical and algorithmic side (I was also a Computer Science major). You are what you choose to measure, as the old axiom goes. And one of the things I would not want to do is subtract the creative aspects of the job. With that in mind, we want to minimize process, maximize insight, and retain creativity.

Measuring feedback

With customer feedback, if your Developer Relations team uses the same task management service as the engineering team (they should!), then you can run queries on their submissions and determine if those submissions meet the criteria for success you’ve defined:

  • Number of bug reports submitted and accepted
  • Number of customer feature requests considered
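If your tracker can export issues as structured data, these counts are easy to script. Here is a minimal sketch, assuming a hypothetical export format in which each issue records its reporter's team, type, and status (your tracker's field names will differ):

```python
# Sketch: count DevRel-submitted bugs and feature requests, and how many
# were accepted. Field names ("reporter_team", "type", "status") are
# hypothetical placeholders for whatever your tracker exports.
from collections import Counter

def feedback_metrics(issues, team="devrel"):
    """Tally submissions and acceptances per issue type for one team."""
    counts = Counter()
    for issue in issues:
        if issue.get("reporter_team") != team:
            continue
        kind = issue.get("type")      # "bug" or "feature-request"
        status = issue.get("status")  # e.g. "accepted", "rejected", "done"
        counts[(kind, "submitted")] += 1
        if status in {"accepted", "triaged", "in-progress", "done"}:
            counts[(kind, "accepted")] += 1
    return counts

issues = [
    {"reporter_team": "devrel", "type": "bug", "status": "accepted"},
    {"reporter_team": "devrel", "type": "feature-request", "status": "rejected"},
    {"reporter_team": "eng", "type": "bug", "status": "done"},
]
print(feedback_metrics(issues))
```

The acceptance ratio per type is then a proxy for feedback quality: a high submission count with a low acceptance count is a signal to dig into why.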

Ultimately, what you're looking for is two things: whether the quality of feedback contributed to the product team is high, and whether the Developer Advocate has sufficient rapport with members of the product team to ensure their feedback is taken seriously.

Measuring content

I measure content and event impact in very similar ways -- and my approach requires knowledge of one basic web measurement technique: URL query parameters.

What you're trying to do is append any link to your site (domains you control/own) with a tracking code, so that you see how many people arrived at your site as a result of a given activity, determine which channel is most effective at driving growth and adoption, and invest your time and resources in the highest performing tactics.

It's important to note that this strategy does not associate visitors with any personally identifiable information - it merely allows you to see anonymous individuals' traffic and engagement.

In modern times, the best marketing people are quasi-data scientists who run queries to identify which routes to product signup and usage are the best. When Developer Relations does this, it opens up a whole world of nerdy growth hacking, given the loop between content production and measuring efficacy is tightly contained within one person or team in the organization.

You can use any tool or parameter structure for your tracking codes. The key is to have flexibility, so that you can see instances of tactics, not just the tactic itself. In this example, we’ll use the Google UTM format (via Google Campaign URL Builder). The important thing is to adopt a standard, consistent schema, so that you can run queries later to determine the efficacy of general tactics, specific pieces of content, publications, or events, and even specific team members.

As a refresher, Google UTM links consist of three primary fields, and while they’re most frequently used by marketing people, we can easily adapt them for Developer Relations’ purposes.

For each of the three primary fields, I’ve listed a schema I suggest for Developer Relations:

  • utm_source - the specific content that’s referring traffic (where you’re using the link).

    • These are specific, like the actual name of an event or session (“event-session-year”), piece of content ("title-of-blog-post-authorname"), publication ("partnername-blog"), podcast (“name-of-podcast”), organic social media (“companyname”), paid promotion on social media (“companyname-paid”)
  • utm_medium - the type of content/distribution channel that’s referring traffic (what you’re publishing or sharing the link via)

    • blog

    • email

    • event

    • website

    • youtube

    • twitter

    • linkedin

    • reddit

  • utm_campaign - the larger campaign or initiative.

    • This includes things like new releases (“1-1-release”), major projects (“amazing-feature-launch”), events (“aws-reinvent-2023”), etc.
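Once you settle on a schema like this, a small helper keeps the links consistent. Here's a sketch in Python; the base URL and parameter values are illustrative placeholders:

```python
# Sketch: build a UTM-tagged link following the source/medium/campaign
# schema above. The URL and values below are illustrative only.
from urllib.parse import urlencode, urlparse

def utm_link(base_url, source, medium, campaign):
    """Append utm_source/utm_medium/utm_campaign to a link."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Use "&" if the URL already carries a query string, "?" otherwise.
    sep = "&" if urlparse(base_url).query else "?"
    return f"{base_url}{sep}{params}"

link = utm_link(
    "https://docs.example.com/graphql",
    source="graphql-article-authorname",
    medium="blog",
    campaign="amazing-feature-launch",
)
print(link)
# https://docs.example.com/graphql?utm_source=graphql-article-authorname&utm_medium=blog&utm_campaign=amazing-feature-launch
```

Generating links through one shared helper (rather than by hand) is also the cheapest way to enforce the consistent schema the queries depend on.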

Let’s talk for a moment about how content flows to your site. Let’s say you write an article, “Using GraphQL with PostgreSQL,” and post it to the DEV community. Within the article, you link to your product documentation. You would append the following UTM parameters:

  • utm_source: graphql-article-authorname
  • utm_medium: blog
  • utm_campaign: dev-community

These parameters allow you to filter visitors based on what led them to your site, and, depending on your website and product analytics tooling, actions they took on your properties. That way, you can look at your signed up users and see which inbound tracking codes may have led them to your site (in, say, the last 90 days). It’s impossible to identify causality in a world where a customer may interact with you in multiple ways before deciding to sign up (so-called “multi-touch attribution”). But you can identify correlation and take note of which tactics are most commonly present in your most recent customer sign-ups (or any customer cohort you define).
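The correlation analysis boils down to a simple tally. Here's a sketch, assuming a hypothetical record shape in which each sign-up carries the tracking codes seen in its touch history (in practice your analytics tool performs this join for you):

```python
# Sketch: tally which utm_source codes appear among recent sign-ups.
# The record shape and fixed "today" date are hypothetical, chosen so
# the example is self-contained and deterministic.
from collections import Counter
from datetime import date, timedelta

def top_sources(signups, days=90, today=date(2023, 6, 1)):
    """Rank utm_source codes seen in sign-ups from the last N days."""
    cutoff = today - timedelta(days=days)
    tally = Counter()
    for s in signups:
        if s["signup_date"] >= cutoff:
            tally.update(s["utm_sources_seen"])  # all codes in the touch history
    return tally.most_common()

signups = [
    {"signup_date": date(2023, 5, 20),
     "utm_sources_seen": ["graphql-article-authorname", "aws-reinvent-2023"]},
    {"signup_date": date(2023, 5, 25),
     "utm_sources_seen": ["graphql-article-authorname"]},
    {"signup_date": date(2022, 1, 1),
     "utm_sources_seen": ["old-campaign"]},
]
print(top_sources(signups))
# [('graphql-article-authorname', 2), ('aws-reinvent-2023', 1)]
```

Note that because one sign-up can contribute several codes, this is deliberately a correlation count, not an attribution model.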

One thing you may want to consider is having one or two people on your team manage these UTM parameters. You want to be consistent in how you use them so that you can analyze the aggregate data effectively.

Measuring community

In community engagement, our highest priority is authenticity. The second-highest priority is to be helpful. I’m a very firm believer that engaging with community leaders (and others in the communities of importance to you) cannot be driven by a quid pro quo.

You can’t expect anything in return.

If you show up with an authentic desire to learn and help, you will do your job. If you try to associate the time you spend in the community with sales or other metrics, it can lead you away from an authentic desire to learn and help and instead down a path of self-centered desire to grow your business.

Measuring events

Events are slightly more complicated. Often, the events you attend or speak at are hosted by third parties (if they're virtual) or take place in the physical world.

If you're hosting a booth or other form of physical presence at the event, you'll probably measure booth traffic, leads captured, or something similar.

If you're giving a talk, you need to get creative. In my post "What do Developer Advocates do?", I talked about turning an event talk into multiple pieces of content: blog posts, white papers, videos, live streams, whatever it takes. Reuse the effort you put into creating your talk.

Then, during your talk, reference your related content. For example, "I wrote a blog post on this topic if you'd like to read more."

I like to use a link shortener (such as bit.ly, or a branded one like Bl.ink) coupled with the aforementioned UTM parameters so that I can drive traffic to my website along with the correct attribution. In this case, utm_source would be title-of-talk-speakername, utm_medium would be event, and utm_campaign would be conference-name-2023.

The measurement won't be precise. Many people in the audience may be on their phones when they first check out your shortlink, and that won't necessarily follow them if they check it out later when they're at their desk. But you'll at least be able to see how many people were plugged into your talk and sufficiently interested to visit your content.

Measuring AI visibility

The way developers discover products has fundamentally changed. According to the Stack Overflow 2025 Developer Survey, 85% of developers regularly use AI tools for coding and development. They're not just using AI to write code - they're using tools like ChatGPT, Perplexity, and Claude to research products, debug problems, and discover solutions.

For the first time in two decades, overall Google traffic is declining across industries while usage of LLMs is exploding. This means your Developer Advocacy efforts need to track a new dimension: AI visibility.

Here are the key metrics to track:

  • AI Share of Voice: The percentage of relevant AI answers that mention your brand. Top-performing brands capture 15%+ share across their core query sets.
  • Citation rate: The share of AI-generated answers that link to your domain as a source. This is the AI equivalent of backlinks.
  • Time-to-citation: Days from publishing content to first AI citation. Quality, structured content gets cited faster.
  • Entity accuracy: How accurately AI systems represent your brand, product, and key features. Incorrect information in AI responses can be damaging.

Several tools now exist to track these metrics automatically:

  • Otterly.AI: Tracks brand mentions and citations across Google AI Overviews, ChatGPT, Perplexity, Gemini, and Copilot. Used by 15,000+ marketing professionals.
  • Profound: Enterprise-focused analytics across ChatGPT, Perplexity, and Google AI Overviews. Built for Fortune 1000 companies with SOC 2 compliance.
  • HubSpot AEO Grader: Free tool that reveals whether your company appears in AI results for GPT-4o, Perplexity, and Gemini, plus competitive comparison.
  • Semrush Enterprise AIO: Provides share of voice, brand sentiment, and mention tracking across AI platforms.
  • Peec AI: Scores visibility, analyzes sentiment, and shows which sources AI platforms use when mentioning you.

Manual tracking works too: run a set of 20-30 queries relevant to your product space through ChatGPT, Perplexity, and Claude weekly. Track how often you appear and in what context.
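That manual tally is easy to script once you log the answers you sample. Here's a sketch computing share of voice and citation rate from logged answers; the field names and the brand/domain are hypothetical placeholders:

```python
# Sketch: compute AI Share of Voice and citation rate from a sample of
# logged AI answers. Record fields ("text", "cited_urls") and the brand
# "AcmeDB" are hypothetical.
def ai_visibility(answers, brand, domain):
    """Return the fraction of answers mentioning the brand and citing the domain."""
    mentions = sum(1 for a in answers if brand.lower() in a["text"].lower())
    citations = sum(1 for a in answers if domain in a.get("cited_urls", []))
    n = len(answers)
    return {
        "share_of_voice": mentions / n,  # fraction of answers mentioning the brand
        "citation_rate": citations / n,  # fraction of answers linking your domain
    }

answers = [
    {"text": "Try AcmeDB for this workload.", "cited_urls": ["docs.acmedb.example"]},
    {"text": "Popular options include Postgres.", "cited_urls": []},
    {"text": "AcmeDB supports GraphQL natively.", "cited_urls": []},
    {"text": "Use a managed database.", "cited_urls": []},
]
print(ai_visibility(answers, brand="AcmeDB", domain="docs.acmedb.example"))
# {'share_of_voice': 0.5, 'citation_rate': 0.25}
```

Run the same query set each week and track the trend; the week-over-week movement matters more than any single snapshot.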

To improve these metrics, your content strategy needs to evolve:

  • Structure content for machine readability. Use clear headings, bullet points, FAQ sections, and comparison tables. AI models thrive on well-organized information.
  • Implement schema markup. JSON-LD structured data (Article, FAQPage, HowTo schemas) helps AI systems understand and cite your content correctly.
  • Provide direct answers. Front-load key information. AI engines favor content that provides immediate, clear answers rather than burying information deep in long articles.
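To illustrate the schema markup point, here is a minimal FAQPage JSON-LD block, generated with Python so the syntax stays valid. The question and answer text are placeholders; the output belongs inside a `<script type="application/ld+json">` tag on your page:

```python
# Sketch: emit an FAQPage JSON-LD block (schema.org vocabulary).
# Question/answer text below is placeholder content.
import json

faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Does the product support GraphQL?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Yes. See the GraphQL quickstart in our documentation.",
            },
        }
    ],
}
print(json.dumps(faq_schema, indent=2))
```

The same pattern applies to Article and HowTo schemas; the point is that machine-readable structure travels with the page.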

The good news is that content cited by AI systems tends to rank highly in traditional Google search as well. Optimizing for AI visibility strengthens your SEO.

Pulling it all together

Once you and your team append tracking codes to all your links inbound to your product or service, you can begin to run queries using your favorite analytics tool. Common questions I like to ask are:

  • How many new sign-ups is Developer Relations driving overall? What activities are driving the majority?
  • Are Twitch live-coding streams more effective at driving traffic or net new visitors than webinars?
  • Which third-party sites are most effective at driving new users?
  • Which types of content are most effective at driving new users?
  • Is billg better at events or content?
  • What is our AI Share of Voice compared to competitors?
  • Which content pieces are being cited most frequently by AI tools?

You’ll definitely want to create a dashboard so that everyone on your team can quickly slice and dice this data to get insights on what is working. One of my core management principles is to eliminate all forms of information asymmetry in an organization. To that end, I’m also a big believer that all dashboards should be shared broadly in an organization.

Measuring Developer Advocates

So far, we've spoken about measuring the output of Developer Advocacy itself. But is this a useful proxy for measuring the efficacy of a Developer Advocate as well?

It depends.

I believe metrics give you part of the picture. But beyond metrics, the biggest thing I look for in Developer Advocates is whether or not they're listening to customers in a way that shapes their work. Are they publishing content and looking to see how it's received? Are they adjusting their approach constantly to identify what works or are they blindly following a preordained path?

The measurement tools I've described here are guidelines more than they are a checklist. Use these tools to help you formulate a go-to-market plan that is highly responsive to customer needs.

Summary

For the bulk of my 20+ years in Developer Relations and Developer Marketing, measuring content, events, and community activities has been extraordinarily difficult. Developer Relations has always seemed like an ambiguous role, and I’ve certainly been part of organizations for which developers were the third- or fourth-most important constituency. Even though I’d strongly advise Developer Relations professionals to select organizations that prioritize developers for the long term when job-hunting, it’s likely that no matter how much importance your organization places on developers, you’ll need to justify your existence, so to speak.

A concerted effort around measuring what matters will enable you to focus your efforts on building and growing a community around your product and demonstrate quantifiable impact.

I believe that the topics I’ve discussed here are applicable to all teams, regardless of size or organizational maturity. I’d love to learn more about how your Developer Relations teams are measuring what matters and demonstrating impact in your organization.

Prashant Sridharan

Developer marketing expert with 30+ years of experience at Sun Microsystems, Microsoft, AWS, Meta, Twitter, and Supabase. Author of Picks and Shovels, the Amazon #1 bestseller on developer marketing.
