AI Benchmarks
January 28, 2026 @ 1:00pm EST
1 Hour
Speakers
Nicholas Arcolano
Head of Research, Jellyfish
Nik Albarran
Product Researcher, Jellyfish
AI is transforming software engineering, but the results most teams see don't always match the hype. Some organizations are doubling productivity, while others struggle to see meaningful gains even with high AI adoption.
So what’s actually working in the real world, and why does the impact vary so widely?
In this webinar, Jellyfish's Head of Research, Nicholas Arcolano, and Product Researcher Nik Albarran share data-backed insights from one of the largest real-world studies of AI usage in software engineering. Drawing on an analysis of 20 million pull requests from 200,000 developers across 1,000 companies, they break down what's really happening as teams adopt AI coding tools and agents.
You’ll learn:
- What good AI adoption looks like in practice and how to measure it
- The productivity gains engineering leaders should realistically expect
- How AI is changing PR throughput, cycle time, and developer workflows
- Why some teams see outsized gains while others stall, and the role architecture plays
- What to do when AI adoption isn’t delivering the results you were promised
If you’re responsible for AI outcomes, not experiments, this webinar will give you the clarity you need to move forward with confidence in 2026. Register now.