
How SailPoint Uses Jellyfish To Foster Data-Driven Conversations

In August 2023, consulting firm McKinsey published a framework meant to break through the developer productivity “black box.” Some components of this framework were based on well-regarded methodologies like DORA and SPACE, and others proposed new ideas, such as a “Developer Velocity Index benchmark.”

The developer world was incensed:

  • Dan North, the originator of Behaviour-Driven Development, said the “cornerstone of the paper is unfortunately simply misguided.”
  • Dave Farley, co-author of Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation, said, “Apart from the use of DORA metrics in this model, the rest is pretty much astrology.”
  • Kent Beck, one of the original signatories to the Agile Manifesto, said, “The McKinsey framework will most likely do far more harm than good to organizations – and to the engineering culture at companies.”

The fervor stems from a fear that executives will use software productivity measurements in ways that hurt developer morale and, ultimately, productivity itself.

What often gets lost in the discourse is that the problem isn’t the data or the metrics themselves; it’s how leaders use them and the tools they choose. With the right tooling and methodologies, data can be empowering, and developers can get on board.

Taylor Wingfield, Product Operations Manager at SailPoint, joined us to talk about her experience using Jellyfish, how she’s helped developers and business leaders have more nuanced conversations around productivity and developer work, and how she’s navigated the tensions these conversations sometimes inspire. 

1. Establish Consistent Metrics

Metrics are meant to provide clarity, but the wrong ones, inconsistently applied, can create more chaos than order. 

Before implementing Jellyfish, Wingfield struggled to track productivity: inconsistencies between teams made manual tracking necessary. From a high-level view, Wingfield found that the teams appeared to work similarly, “but the details of how each team was working were pretty different.”

First, the team set about standardizing Jira. Next, the team worked on implementing Jellyfish and examining deliverables over time. “As we look at what we’ve planned for the quarter and look at those deliverables in Jellyfish, we can see how many folks we have allocated toward those initiatives.” 
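
To make the allocation idea concrete, here is a minimal sketch of the kind of check this enables, written against the public Jira REST API rather than Jellyfish itself (Jellyfish surfaces this out of the box). The instance URL, credentials, and epic keys below are all placeholders.

```python
# A rough sketch (not Jellyfish's implementation) of a per-initiative
# allocation check run directly against Jira. Assumes a Jira Cloud
# instance using classic "Epic Link" fields.
import requests
from requests.auth import HTTPBasicAuth

JIRA_URL = "https://example.atlassian.net"            # placeholder
AUTH = HTTPBasicAuth("me@example.com", "api-token")   # placeholder token

# Hypothetical epics standing in for the quarter's planned deliverables.
PLANNED_EPICS = ["PLAT-101", "PLAT-102", "PLAT-103"]

def people_allocated(epic_key):
    """Return the set of assignees with open issues under an epic."""
    resp = requests.get(
        f"{JIRA_URL}/rest/api/2/search",
        params={
            "jql": f'"Epic Link" = {epic_key} AND statusCategory != Done',
            "fields": "assignee",
            "maxResults": 100,
        },
        auth=AUTH,
    )
    resp.raise_for_status()
    return {
        issue["fields"]["assignee"]["displayName"]
        for issue in resp.json()["issues"]
        if issue["fields"]["assignee"]  # skip unassigned tickets
    }

for epic in PLANNED_EPICS:
    print(f"{epic}: {len(people_allocated(epic))} people allocated")
```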

Wingfield is careful to explain that this isn’t an exercise in surveillance. By standardizing work measurements, program managers can look for signs that people are overloaded with unplanned work. While these reviews can vary by level and program, SailPoint program managers typically review data weekly or bi-weekly as they track program progress and status.

From this perspective, program managers can find, Wingfield says, “signs that people might be getting pulled off into hidden work or feeling overwhelmed because we didn’t allocate enough effort there originally.”
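
One rough way to surface that signal from raw Jira data is to measure how much of each person’s recent work falls outside any planned Epic. The sketch below reuses the placeholder Jira setup from above; the quarter start date and usernames are hypothetical.

```python
# A rough "hidden work" signal: the share of a person's issues this
# quarter that have no epic behind them. Placeholder Jira setup again.
import requests
from requests.auth import HTTPBasicAuth

JIRA_URL = "https://example.atlassian.net"            # placeholder
AUTH = HTTPBasicAuth("me@example.com", "api-token")   # placeholder token
QUARTER_START = "2025-01-01"                          # placeholder cutoff

def count_issues(jql):
    """Return the total number of issues matching a JQL query."""
    resp = requests.get(
        f"{JIRA_URL}/rest/api/2/search",
        params={"jql": jql, "maxResults": 0},  # totals only, no issue bodies
        auth=AUTH,
    )
    resp.raise_for_status()
    return resp.json()["total"]

def unplanned_share(person):
    """Fraction of a person's issues this quarter with no Epic Link."""
    base = f'assignee = "{person}" AND created >= "{QUARTER_START}"'
    total = count_issues(base)
    unplanned = count_issues(base + ' AND "Epic Link" is EMPTY')
    return unplanned / total if total else 0.0

for person in ["alice", "bob"]:  # hypothetical team members
    print(f"{person}: {unplanned_share(person):.0%} of work is unplanned")
```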

Over time, these metrics become even more helpful as teams refine their processes. Wingfield cites Epics as an example. Early in a quarter, a team is expected to add plenty of new tickets to an Epic. But at some point, “if we’re at the last week of the quarter and we’re still seeing that amount of new stuff get added, then we might need to refine how we’re working in our Epics.”
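
That end-of-quarter check boils down to a simple question of Jira data: which issues landed in an Epic after a given cutoff? A minimal sketch, again using the placeholder setup above:

```python
# A rough sketch of the end-of-quarter scope check: issues created under
# an epic after a cutoff date. Placeholder Jira setup and keys as before.
import requests
from requests.auth import HTTPBasicAuth

JIRA_URL = "https://example.atlassian.net"            # placeholder
AUTH = HTTPBasicAuth("me@example.com", "api-token")   # placeholder token

def added_since(epic_key, cutoff):
    """Return keys of issues created under an epic on or after the cutoff."""
    resp = requests.get(
        f"{JIRA_URL}/rest/api/2/search",
        params={
            "jql": f'"Epic Link" = {epic_key} AND created >= "{cutoff}"',
            "fields": "summary",
            "maxResults": 100,
        },
        auth=AUTH,
    )
    resp.raise_for_status()
    return [issue["key"] for issue in resp.json()["issues"]]

# Tickets still landing in the epic during the quarter's final week are a
# prompt to revisit how the epic was scoped.
print(added_since("PLAT-101", "2025-03-24"))
```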

With this data, change is easier to see and advocate for. “It’s a lot easier to figure out solutions when you’re looking at data rather than people’s feelings. It’s a lot easier to do that when we can actually look at data and say, oh, this was a new requirement that we added, and we knew that it’s going to take a toll on our final delivery date, but we’re okay with that because of X, Y, and Z versus, well, it’s just taking longer than expected.”

2. Build a Feedback Loop Between Engineering and Leadership

By adding data to the mix, Wingfield found that conversations between engineers and business leaders were increasingly productive. The key was building and maintaining a feedback loop that kept everyone engaged in ongoing discussions. 

The more often you’re sharing data with your leadership team, Wingfield explains, the more you can facilitate questions around improving efficiencies. Wingfield notes that, from the engineering side, these conversations are often initiated by director-level leadership and above.

Wingfield explains that each data point is evidence of something, not proof of anything. For example, Wingfield might initiate a conversation with a manager about an engineer not performing at 100%. This isn’t a disciplinary conversation; it’s an investigatory one. 

“Sometimes the answer is just that they didn’t realize they got pulled into incidents, escalation tickets, or hidden work.” They might have even been doing work that somebody didn’t bring up in standup because it wasn’t related to the work that was going on.

Wingfield relates how these teams feel: “If we’re going to be looking at this, we want to make sure that we are getting it right, and that we have a good feedback loop for when we feel like something is not right.”

3. Improve Data Over Time

Even good data can go stale, and once-valuable methods of tracking metrics can fall out of step with evolving processes. Engineering leaders should improve their data over time to facilitate better productivity conversations between engineers and business leaders.

Wingfield and SailPoint, for example, found low-hanging fruit early. However, more improvements became possible after cleaning up their data and standardizing their metrics. 

The initial engagement and alignment could have faded over time if once-representative metrics had drifted away from how teams actually worked. The excitement around good first steps could have turned to cynicism if second steps and further iterations didn’t follow.

To avoid this fate, the SailPoint team listened to the feedback loops they established. Two major questions emerged:

  • How do we make our engineering data more accurate and more representative of our actual work?
  • How can we facilitate those improvements and efficiencies going forward?

“Those two questions really worked together, in harmony, for our business, making us better and continuously sharing data with our leadership team,” Wingfield says.

In practice, the SailPoint team collected and analyzed feedback from teams around these questions, then put together proposals and sought consensus on where the company could update how it collects and categorizes data. Inconsistencies pointed to areas where SailPoint might need to adjust its processes.

By asking these questions, answering them, and improving how they answer them over time, Wingfield built and maintained stronger alignment across teams. 

The adoption of Jellyfish further cemented this alignment because the data enriching these conversations was much better than any data before. “The data accuracy in Jellyfish was much more real-time, much more representative of the work the teams were actually doing and the efforts they were putting into different areas. We could look at particular tickets or particular deliverables and break things down and compare how we were measuring that work previously.”

Asking the Right Questions

When approached with the right mindset and methodologies, data can empower engineering teams by demonstrating the value of their work and making it easier to communicate with non-technical stakeholders. 

With Jellyfish, companies like SailPoint are equipped to ask the right questions at the right moments to catch a project before it goes off track. 

Watch the entire on-demand webinar to hear more about how Wingfield and SailPoint use Jellyfish to track engineering work in a human-centered and efficient way.