
Zach Bouzan-Kaloustian · 4 min read

Emergence of PLG

For the past two years I worked at a high-growth, bootstrapped e-commerce company. How big? Wish I were allowed to say, but let's just say it was substantial. It seems that during this time, with my head in the proverbial e-commerce sand, an emerging term came to the forefront of SaaS: Product-Led Growth, or PLG.

What is PLG?

Admittedly, I didn't understand it at first. What was this elusive PLG, and why was the market abuzz? After reading a few articles, I came to interpret PLG as describing a self-service company that has product-market fit and is thus able to onboard many customers in a frictionless way, without a sales-assisted motion. Then it hit me: this is what DigitalOcean is, and was, all about. A simple, frictionless experience for developers to buy cloud servers. I also think that any e-commerce company with an obsessive focus on top-of-funnel optimization and site conversion is doing PLG as well. So between my last two roles, I have about 8 years of experience with PLG 🙂

My Expertise: Building teams at a PLG Company

This led me to realize that my expertise is in building post-sales* teams at PLG companies, across Operations, Success, Support, and Trust & Safety. To me, post-sales work can be thought about in 3 ways:

  1. Proactive: Generating an outbound contact for the purpose of selling more, driving retention, or furthering a relationship (Customer Success).
  2. Reactive: Handling the friction or issues in the experience that cause an issue/ticket/report (Support, Trust & Safety).
  3. Efficiency-Driven: Developing a more efficient process or method to add business value (Operations).

TLDR for Executives & Operators

Inherently, none of these activities are unique to PLG. The distinction comes in the cross-functional relationships and processes that you build. For example, take a user-submitted bug report sent to Support, verified, and then fixed by an engineering team. This process sounds simple, almost IFTTT-driven, but it is actually very complex and vital at a PLG company. By my rough estimate, I've been at companies that have onboarded and served >1 million customers, so I've had the chance to build, tear down, and rebuild loads of these processes. Here are some non-obvious takeaways from those experiences:

  1. The best companies create processes with a short time to acknowledge an issue, and that quick reaction will be positively noticed by your customers. I care less about the time to fix an issue because it's less related to cross-functional processes and can be a signal of other issues (hiring, eng prioritization, etc.).
  2. Keep a scorecard of your victories. Everyone keeps a backlog of work to do, but I've seen fewer teams keep a list of wins, AKA product improvements that were driven by customer feedback. Review these with your teams, and celebrate the teams who ship for your customers; it will go a long way toward building relationships.
  3. Teach your post-sales teams how to say "no". Any type of post-sales work requires empathy, and a lot of support teams acutely feel a customer's pain, which makes them effective in their role. The best support teams have empathy AND a strong sense of what the business needs, so they're able to say "no" to requests that don't align and offer an alternative. A report from a VIP customer and a report from a novice are not created equal. Telling the difference is a learnable, teachable skill, and it's incumbent upon the leader to create that clarity around the quality of customer feedback in the organization.
  4. It's everyone's role to care about the customer's experience. If you're interviewing at a company that puts all the responsibility for customer happiness on post-sales, you should run for the hills. Instead, find a company where every employee acts in the best interest of the customer, and make sure to get examples. It needs to be part of a company's DNA.

So that’s about it. Some learning, a personal realization, and some lessons. What do you think I missed, got right, or got wrong?

*Post-Sales: For the vast majority of customers, we actually did no selling; our interactions were overwhelmingly with existing customers, and fewer than 5% were pre-sales. It's also interesting to note that sales did not work (at all) for DigitalOcean while I was there, and at Wild Alaskan the majority of pre-sales questions could be handled via self-service and product improvements.

Zach Bouzan-Kaloustian · 6 min read

An overview of Customer Effort Score (CES) as told through an interview with a Head of Global Support

Post Summary: A CES Overview

This is an anonymized interview I had with the Head of a Global Customer Support team. He was known as an expert in Customer Effort Score (CES), so I reached out to him in Support Driven, a fast-growing Slack group where support professionals can connect with one another. I wanted to better understand CES, how to implement it, and generally make another connection in the support world.

I got more out of our discussion than I was expecting. He blew me away with how thoroughly he explained the concepts and how thoughtfully his team researched and implemented CES. This post is my way of giving back to the Support Driven community.

In this interview we learn more about his decision to use CES, what CES is, and an idea of how to implement it. Without further ado, here’s the interview.

Interview

Tell us about your team:

We're a team of ~50 people, working 24/7, handling ~10k tickets per month! We're spread between [California, South America, and Eastern Europe]. Our team is extremely knowledgeable about our product, and the questions we receive are very in-depth. We support [sic] software and answer all conceivable questions about how to use the product, best practices, etc. A typical customer response is 3–5 paragraphs, written free-form and customized for each inquiry. Our team really knows the product inside and out!

To keep this all running, we support our agents with 3 tier-1 support managers, a tier-2 manager, a documentation writer, and an extensive training team of 3 people. The training team onboards 2–5 people per month, and it takes 2–2.5 months for a new hire to become fully proficient!

What type of performance do you measure?

We measure the typical things, such as Time to First Response. We also have targets to answer 90% of chats within 30 seconds and 90% of calls within 60 seconds. In the past we measured CSAT, which we've since replaced with Customer Effort Score (CES).

What inspired you to implement CES?

Last year I read The Effortless Experience: Conquering the New Battleground for Customer Loyalty (presentation), and chapter 6 completely changed how I thought about measuring the customer experience; it inspired me! It provides different ways to phrase questions and gather customer feedback. It's very scientific yet easy to understand. The whole book is great!

The book claims that CSAT provides zero insight into the truly meaningful indicators for a business. This made an impact on me because we were facing the same challenge as many other Zendesk customers: the Zendesk CSAT benchmark results show that everyone scores ~95%. It was an inaccurate signal without real meaning for us. In 2015 we did achieve a 96.5% CSAT rating! ← Great work!

Zendesk's Built-in CSAT Email

The core element of CES is understanding a customer's likelihood to repurchase, and we wanted a way to tie a customer's experience back to sales, which is our business's lifeblood. This led us to start investigating how to implement CES.

Tell us what CES is.

Customer Effort Score is measured with a single question that asks whether "the company made it easy for me to handle my issue" and lets the customer answer on a scale from "strongly disagree" to "strongly agree".

CES Survey

What was the implementation process like?

Towards the end of 2015 we started researching 3rd-party tools and found that no one really had an out-of-the-box customer effort survey integrated with Zendesk Triggers. We chose to implement a customer email via Survey Gizmo.

First CES Survey Version from January 2016

Upon implementation, we immediately saw a substantial decrease in our response rate. With Zendesk's built-in survey, we had an 18% response rate when asking customers for CSAT. The reply rate dropped to 8%, because no one wants the promise of a "2 minute survey" that will actually take 10 minutes. Fortunately, we had engineering support, and in January we started to build our own tool, which would embed the question directly in the email.

Updated CES Email

We custom-built a service that would listen for the Zendesk solved ticket trigger. The service sits between Zendesk and Marketo, which we use to email customers. When a customer selects their response [in the email] it saves the feedback and they land on a page where they can provide comments.
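
To make that flow concrete, here's a minimal sketch of what a listener service like this could look like, assuming a small Flask app that receives the Zendesk trigger's webhook and hands off to the email platform. The endpoint path, payload keys, and the send_survey_email helper are illustrative assumptions, not the team's actual implementation.

```python
# Minimal sketch of a middle service between Zendesk and the email
# platform. A Zendesk trigger is configured to POST a small JSON payload
# whenever a ticket is solved; this service then queues the CES survey
# email. Endpoint path, payload keys, and send_survey_email are
# illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

def send_survey_email(ticket_id: str, requester_email: str) -> None:
    """Hand the survey off to the email platform (Marketo, in the interview).

    Stubbed here; a real version would call the platform's API.
    """
    print(f"Queueing CES survey for ticket {ticket_id} -> {requester_email}")

@app.route("/zendesk/ticket-solved", methods=["POST"])
def ticket_solved():
    # The keys below are whatever placeholders the Zendesk trigger was
    # configured to send; adjust them to match your trigger's JSON body.
    payload = request.get_json(force=True)
    send_survey_email(payload["ticket_id"], payload["requester_email"])
    return jsonify({"status": "queued"}), 200

if __name__ == "__main__":
    app.run(port=8080)
```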

Our response rate shot up to 22%, even higher than what we'd seen with CSAT!

Example of how to capture comments in a custom-built tool

How do you calculate CES?

When we capture a customer's response, we push the rating and comments into Zendesk via their API. We get a lot of comments, and they're added directly to the original ticket as internal notes. We perform data analysis using GoodData. If a customer answers Agree or Strongly Agree, we consider it a positive interaction; everything else is a thumbs down.
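
A rough sketch of that scoring and write-back logic, in Python: Agree and Strongly Agree count as positive, and the comment lands on the original ticket as an internal (private) note via Zendesk's ticket update endpoint. The credentials and helper names are placeholders, not the team's actual code.

```python
# Sketch of CES scoring plus the Zendesk write-back described above.
# Agree / Strongly Agree = positive interaction; everything else is a
# thumbs down. Comments are attached to the original ticket as notes.
import requests

POSITIVE = {"Agree", "Strongly Agree"}

def ces_positive_rate(responses: list[str]) -> float:
    """Percentage of survey responses counted as positive interactions."""
    if not responses:
        return 0.0
    return 100 * sum(r in POSITIVE for r in responses) / len(responses)

def add_internal_note(subdomain, email, api_token, ticket_id, rating, comment):
    """Attach the CES response to the original ticket as a private note."""
    # Zendesk's ticket update endpoint; public=False makes the comment
    # an internal note rather than a customer-visible reply.
    url = f"https://{subdomain}.zendesk.com/api/v2/tickets/{ticket_id}.json"
    body = {"ticket": {"comment": {
        "body": f"CES response: {rating}\n\n{comment}",
        "public": False,
    }}}
    resp = requests.put(url, json=body, auth=(f"{email}/token", api_token))
    resp.raise_for_status()

print(ces_positive_rate(["Strongly Agree", "Agree", "Neutral", "Disagree"]))  # 50.0
```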

We look at the percentage we receive in each category:

Sample CES Dashboard

Ideally, more customers would respond with "Agree" than "Strongly Agree", because Strongly Agree indicates that the question wasn't complicated and could have been solved effectively with self-service. A high percentage of Strongly Agree might be the difference between hiring a 20-person team vs. a 30-person team. The book says the ideal graph shape is a bell curve.

When a customer emails us, our agents are responsible for tagging the ticket with its main topic; it's a mandatory field in order to solve a ticket. When we do analysis we tie CES to the ticket, agent(s), and product feature to identify where the holes are.
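
Their reporting runs in GoodData, but the same slicing is easy to illustrate with pandas, assuming a flat export of survey responses joined to their tickets. The data and column names here are made up for the example.

```python
# Illustrative slice of CES by ticket topic and agent (the team's real
# analysis ran in GoodData; this data and these columns are invented).
import pandas as pd

df = pd.DataFrame({
    "topic":  ["billing", "billing", "api", "api", "api"],
    "agent":  ["ana", "ben", "ana", "ana", "ben"],
    "rating": ["Agree", "Disagree", "Strongly Agree", "Agree", "Neutral"],
})
df["positive"] = df["rating"].isin(["Agree", "Strongly Agree"])

# Positive rate by topic shows where the product "holes" are; by agent,
# where coaching might help.
print(df.groupby("topic")["positive"].mean().mul(100).round(1))
print(df.groupby("agent")["positive"].mean().mul(100).round(1))
```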

What insight has CES provided?

From an insights standpoint, we haven't drawn out anything super insightful yet, because we already know [the existing] challenging items for customers. The biggest difference from CSAT is that CES is a stronger, much more distinct signal: with CES, the same topic will score around 50% positive vs. 80% with CSAT.

Sample CES Dashboard in GoodData

Rapid fire, what else should we know about CES?

  • We want to redo [the email] with the answers laid out on a horizontal scale, not a vertical one.

  • Make sure your answer links are clickable in the email.

  • Think about the time between solving the ticket and sending the email; we wait 24 hours. One partner from New Zealand found that 10 hours was ideal. You'll want to wait long enough to know that the customer won't follow up with another question.

  • You may uncover organizational challenges with low scores. We have our lowest satisfaction score when we pass tickets to [sic], because they don't follow up immediately.

If you have comments, questions, or further insight please reach out.

Zach Bouzan-Kaloustian · One min read

The CX Handbook Blog is full of additional resources and thoughts about CX. Please reach out if you want to contribute, have questions, or want to suggest a topic.