Jellyfish Reveals AI’s Real Impact on Engineering Teams with Industry’s Largest Ongoing Benchmark Study

The intelligence platform for AI-driven engineering also joins Massachusetts AI Coalition, partners with Augment Code, and publishes joint research on AI coding tools with Harvard Economics

BOSTON, March 17, 2026 — Jellyfish, the leading Software Engineering Intelligence and AI Impact Platform, announced Jellyfish AI Engineering Trends, the industry’s most comprehensive quantitative analysis of AI transformation in software engineering. Based on data from more than 700 companies, 200,000 engineers, and 20 million pull requests (PRs), the benchmark site sheds light on the true effect AI tools are having on software development teams and gives engineering leaders the opportunity to bring best practices to their organizations.

More than half of the companies in Jellyfish’s regularly updated study use AI coding tools consistently, and 64% generate a majority of their code with AI assistance. The incentive is clear: over the last three months, companies in the top quartile of AI adoption have seen 2x the PR throughput of low adopters. Notably, autonomous agent activity (PRs generated entirely by AI agents) is still low overall, but is growing exponentially.

“AI coding tools are now the default option for engineering teams, and the productivity gains are real,” said Nicholas Arcolano, Ph.D., head of research at Jellyfish. “Enterprises can use our metrics as an objective baseline to benchmark their organization’s AI adoption and impact. The data shows a clear link between deep AI tool integration and measurable improvements in delivery throughput and engineering outcomes. The most aggressive adopters are pulling away from the pack.”

An AI Maturity Test for Engineering Leaders

While the AI Engineering Trends page allows anyone to assess their maturity against industry standards, Jellyfish customers gain a distinct operational advantage. Jellyfish AI Impact capabilities automatically parse and visualize your organization’s unique data, delivering custom dashboards that map the exact correlation between your team’s AI adoption and engineering productivity.

“Jellyfish helped us uncover deeper insights into AI usage, adoption, and impact across our engineering teams,” said Julia Gan, Sr. Director, Technical Program Management and Engineering Chief of Staff at Box. “We now have a clearer understanding of velocity and capacity, and can see how different AI tools affect different teams. This visibility allows us to benchmark our practices against industry standards and demonstrate to key stakeholders how Box is leveraging AI responsibly and productively.”

Jellyfish also recently partnered with Augment Code — which works across the IDE, CLI, and code review — to bring AI telemetry to joint customers like TaskRabbit, so they can see exactly how Augment drives real productivity gains and business impact across their engineering teams.

“Jellyfish has allowed us to prove AI’s strategic value to the business,” said Tom Osowski, Engineering Manager, Partnerships at TaskRabbit. “After integrating Augment, our Jellyfish data showed a 50% decrease in issue cycle time and a 2x increase in both deployment rates and Epics resolved per month. Having that level of visibility transformed AI from an experimental tool into a proven engine for business growth. Our teams are shipping code faster and delivering twice the value to our customers in half the time.”

The platform’s AI momentum has been bolstered by:

  • An analysis in partnership with OpenAI: Jellyfish and OpenAI published a study that explored the real-world impact of AI on software development, including coding assistant and code review agent adoption by tool, new data on the growth of AI-generated code, and the impact of AI on PR throughput and cycle time.
  • Joining the Massachusetts AI Coalition: Jellyfish joined a private-sector initiative to unite the founders, operators, students, investors, supporters, and builders that will define the next era of AI in Massachusetts. Ryan Kuchova and Luke Stevens of Jellyfish hosted a technical event for the Coalition that explored the world of agentic coding one year after the industry-wide adoption of Cursor.
  • Joint research with Harvard Economics: Researchers at Harvard Economics analyzed Jellyfish data covering 100,000 software engineers across 500 companies and found that AI is making coding faster, with no apparent decline in code quality.
  • Additions to the leadership team: Jellyfish appointed Chris Ward, who helped scale Vercel from $6 million to nearly $300 million in annual revenue, as Chief Revenue Officer to help accelerate the platform’s hypergrowth as software companies seek to maximize the return on their AI investments.
  • Improvements to the Jellyfish MCP: The latest updates to the Jellyfish MCP deliver a better experience with AI-powered engineering insights: data now outputs in TOON format instead of JSON, the codebase has been rebuilt in Node.js, and Docker is now supported.

To see Jellyfish AI Impact in action, schedule a demo.

About Jellyfish

Jellyfish is the leading Software Engineering Intelligence Platform, helping 500+ companies, including DraftKings, Keller Williams, and Blue Yonder, leverage AI to transform how they build software. By turning fragmented data into context-rich guidance, Jellyfish enables better decision-making across planning, developer experience, and delivery so R&D teams can deliver stronger business outcomes.