Date
Thursday, May 7, 2026 at 5 p.m. ET
Call participants
- Chief Executive Officer and Cofounder — Shlomi Ben Haim
- Chief Financial Officer — Ed Grabscheid
- Head of Investor Relations — Jeffrey Schreiner
Takeaways
- Total revenue -- $154 million with 26% year-over-year growth, attributed to cloud strength, security product demand, and Enterprise Plus expansion.
- Cloud revenue -- $78.9 million, up 50% year over year, now 51% of total revenue versus 43% a year ago, reflecting accelerated cloud adoption.
- On-prem revenue -- $75.1 million, up 8% year over year, as hybrid and self-managed deployments remain preferred by certain customer segments.
- Enterprise Plus mix -- 58% of total revenues, compared to 55% previously, growing 33% year over year from increased adoption and higher customer commitments.
- Large customer metrics -- 80 customers spent over $1 million annually (48% year-over-year growth), while 1,225 customers spent over $100,000 annually (17% year-over-year growth).
- Gross margin -- 83.8%, up from 82.5% a year ago, supported by cloud cost optimization and product mix.
- Operating profit -- $32.9 million, with an operating margin of 21.4% versus 17.4% in the year-ago quarter.
- Free cash flow -- $37.3 million, a 24.2% margin, up from $28.1 million and a 23% margin a year earlier.
- Net dollar retention -- 120%, an increase of four percentage points year over year and up one point sequentially, demonstrating strong customer expansion and continued product uptake.
- Gross retention -- 97%, as reported for the period, indicating high customer loyalty and mission-critical product status.
- RPO (remaining performance obligations) -- $574.9 million, representing 36% year-over-year growth, excluding upside from usage above minimum commitments.
- Cash and short-term investments -- $741.2 million at quarter-end, up from $704.4 million at the end of 2025.
- Share repurchase program -- Up to $300 million authorized in ordinary share repurchases, announced in late February.
- Updated cloud growth guidance -- Full-year baseline cloud growth raised to 33%-35%, from previous guidance of 30%-32%, in response to ongoing adoption and usage trends.
- Net dollar retention guidance -- Floor raised to 118% for 2026, reflecting confidence in recurring expansion and higher contract values.
- Q2 guidance -- Revenue expected between $154 million and $156 million, with non-GAAP operating profit of $28 million to $30 million, and non-GAAP diluted EPS of $0.23 to $0.25 on about 126 million shares.
- Full-year guidance -- Revenue projected at $628 million to $632 million, or 18.5% year-over-year growth at the midpoint; non-GAAP operating income of $112 million to $116 million; non-GAAP diluted EPS of $0.93 to $0.97 on about 128 million shares.
- AI and security momentum -- Noted customer adoption of AI-era features and security tools, including wide interest in Curation, Xray, and Advanced Security, as customers address complex threat vectors and increasingly prioritize governance.
- Platform innovation -- New launches included JFrog MCP Registry (the enterprise-grade MCP server registry), Skills Registry, and enhanced integration with NVIDIA’s AICube Blueprint, targeting AI-generated and multi-agent binary management.
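The 120% net dollar retention figure above can be illustrated with a quick calculation. JFrog does not disclose its exact formula, so this is an illustrative sketch using a common trailing-cohort definition; the cohort figures are made up for the example.

```python
def net_dollar_retention(cohort_start_arr, cohort_current_arr):
    """Illustrative net dollar retention: current annualized revenue of the
    year-ago customer cohort (expansion net of contraction and churn),
    divided by that same cohort's annualized revenue a year ago."""
    return cohort_current_arr / cohort_start_arr

# A hypothetical cohort with $100.0M of ARR a year ago that grew to
# $120.0M today implies 120% net dollar retention.
ndr = net_dollar_retention(100.0, 120.0)
print(f"{ndr:.0%}")  # 120%
```

A ratio above 100% means the existing customer base grew on its own, before counting any new logos, which is why the company frames its 118% guidance figure as a floor.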
Summary
JFrog (FROG +5.96%) management reported that consumption-based cloud revenues were driven by accelerated AI adoption, with customers routinely spending above contractual minimums, and expressed confidence in converting this usage into larger annual cloud commitments over time. The company highlighted a fundamental industry shift, as both traditional and AI-native customers increased binary production and sought a unified system of record for managing, securing, and governing software artifacts. JFrog stated that demand for security-centric products surged due to ongoing and increasingly frequent software supply chain attacks, with Curation users explicitly described as protected during recent industry incidents.
- Management disclosed a historic milestone, as cloud revenue surpassed 50% of total revenue for the first time, marking a strategic transition toward a cloud-first delivery model.
- Responding to customer needs, JFrog expanded platform support for AI workflows with the introduction of registries specifically aimed at managing MCP servers and reusable agent skills, establishing new product categories demanded by enterprise clients.
- Despite cloud usage outperformance, guidance methodology remained conservative: forward-looking estimates include only contractual annual commitments, not excess usage.
- Executives noted that the first quarter "exceeded the top end of our guidance range on every metric," attributing the guidance update to observable and durable usage trends rather than a one-time spike.
- Management confirmed that high-value customer cohorts, both AI-native and traditional, contributed to growth, with noted multi-year commitments increasing both RPO and confidence in sustained Enterprise Plus adoption.
- JFrog reiterated its differentiated position by integrating security, governance, and universality for all binaries—including new AI artifacts—across cloud and on-prem solutions, underpinning its competitive moat.
- Leadership said, "We introduced the JFrog MCP Registry, the first enterprise-grade registry for MCP servers, extending our platform to support the growing AI ecosystem," underscoring the company's proactive role in addressing emerging customer requirements.
Industry glossary
- MCP (Model Context Protocol) server: In JFrog's context, a deployable binary component enabling AI agent orchestration and interaction, managed through a centralized registry.
- JFrog Curation: A firewall-like gatekeeper that enforces organizational policy at the entry point for open source packages, preventing untrusted code from entering the software supply chain.
- JFrog Xray: A continuous security tool for binary artifacts within Artifactory, providing dependency analysis, secret detection, and vulnerability identification post-ingestion.
- Skills Registry: A newly released centralized repository for managing and governing reusable AI agent capabilities, facilitating secure and scalable adoption of agentic workflows.
- Enterprise Plus subscription: JFrog's top-tier plan offering expanded platform features, integrations, and security, designed for large-scale or highly regulated enterprise users.
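The gatekeeping idea behind JFrog Curation described above can be sketched as a toy policy check. This is illustrative only: Curation is configured through the JFrog platform, and the package names, policy rules, and thresholds below are hypothetical, not JFrog's actual API.

```python
# Toy gatekeeper mimicking a curation-style policy applied before an
# open source package is allowed into an internal registry.
BLOCKED_PACKAGES = {("left-pad", "0.0.9")}   # hypothetical deny-list entry
MIN_MAINTAINER_AGE_DAYS = 30                 # hypothetical policy threshold

def curate(name, version, maintainer_age_days):
    """Return True if the package passes policy and may enter the registry."""
    if (name, version) in BLOCKED_PACKAGES:
        # Explicitly banned package/version pair.
        return False
    if maintainer_age_days < MIN_MAINTAINER_AGE_DAYS:
        # Freshly created maintainer accounts are a common supply chain
        # attack vector, so a policy might quarantine their uploads.
        return False
    return True

assert curate("requests", "2.32.0", maintainer_age_days=4000)
assert not curate("left-pad", "0.0.9", maintainer_age_days=4000)
```

The point of the design is that policy is enforced once, at the entry gate, so everything stored downstream in the registry can be treated as already vetted.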
Full Conference Call Transcript
Operator: Ladies and gentlemen, thank you for joining us, and welcome to the JFrog Ltd. First Quarter 2026 Financial Results Earnings Call. After today's prepared remarks, we will host a question and answer session. If you have dialed in to today's call, please press 9 to raise your hand and 6 to unmute. I will now hand the conference over to Jeffrey Schreiner, Head of Investor Relations. Jeffrey, please go ahead.
Jeffrey Schreiner: Thank you, Nicole. Good afternoon, and thank you for joining us as we review JFrog Ltd.'s first quarter 2026 financial results, which were announced following the market close today via press release. Leading the call today will be JFrog Ltd.'s CEO and cofounder, Shlomi Ben Haim, and Ed Grabscheid, JFrog Ltd.'s CFO. During this call, we may make statements related to our business that are forward looking under federal securities laws and are made pursuant to the safe harbor provisions of the Private Securities Litigation Reform Act of 1995, including statements related to our future financial performance and including our outlook for the second quarter and full year of 2026.
The words anticipate, believe, continue, estimate, expect, intend, will, and similar expressions are intended to identify forward-looking statements or similar indications of future expectations. You are cautioned not to place undue reliance on these forward-looking statements, which reflect our views only as of today and not as of any subsequent date. Please keep in mind that we are not obligating ourselves to revise or publicly release the results of any revision to these forward-looking statements in light of new information or future events. These statements are subject to a variety of risks and uncertainties that could cause actual results to differ materially from expectations.
For a discussion of material risk and other important factors that could affect our actual results, please refer to our Form 10-K for the year ended 12/31/2025, which is available on the Investor Relations section of our website, and the earnings press release issued earlier today. Additional information will be made available in our Form 10-Q for the quarter ended 03/31/2026, and other filings and reports that we may file from time to time with the SEC. Additionally, non-GAAP financial measures will be discussed on this conference call. These non-GAAP financial measures, which are used as a measure of JFrog Ltd.'s performance, should be considered in addition to, not as a substitute for, or in isolation from GAAP measures.
Please refer to the tables in our earnings release for a reconciliation of those measures to their most directly comparable GAAP financial measures. A replay of this call will be available on the JFrog Ltd. Investor Relations website for a limited time. With that, I would like to turn the call over to JFrog Ltd.'s CEO, Shlomi Ben Haim. Shlomi?
Shlomi Ben Haim: Thank you, Jeff. Good afternoon, and thank you all for joining the call. We entered 2026 strong. Our first quarter performance reflects both the clarity of our strategy and the discipline in execution. Our continuous focus on powering the world's software for JFrog Artifactory as the system of record for trusted binaries, software packages, and AI artifacts is resonating deeply with market demand. We are seeing growing adoption among the world's leading organizations, and AI labs are choosing JFrog Ltd. as they transform to adopt modern software supply chain practices.
Across industries, geographies, and deployment environments, whether cloud or on-prem, our customers are partnering with JFrog Ltd. as their foundational platform while they navigate a complex transition of adding AI technologies and tools to their software supply chain. They tell us they are prioritizing AI adoption while simultaneously maintaining legacy pipelines and open source packages, all as they demand stronger security, governance, and fast release cycles. We are working closely with our customers, the broader developer community, and AI-native companies to support them through this period of change. Our Q1 results reflect this momentum with AI redefining the software supply chain and powering our continued expansion.
In the first quarter, JFrog Ltd. delivered total revenue of $154 million, representing 26% year-over-year growth. Cloud revenue grew 50% year over year, underscoring the accelerating shift toward our cloud-first platform. This performance was driven by continuous strength across our core growth vectors: increasing consumption of our cloud services, rising demand for our software supply chain security solutions, higher ASP on new customer acquisitions, and robust expansion within our existing customer base. We also saw continued momentum at the high end of our customer portfolio. The number of customers with annual spend exceeding $1 million grew to 80, up from 54 a year ago, representing 48% year-over-year growth.
Customers spending more than $100,000 annually increased to 1,225 compared to 1,051 in the prior year, representing 17% year-over-year growth. These results reflect our alignment with the evolving needs of modern enterprises. Developers—and increasingly AI agents—are producing software at scale and speed. These surging binaries, fueled by AI, are driving the need for a single, trusted system of record to manage, secure, and govern these assets across the entire supply chain. On today's call, I will walk you through the quarter in detail, and Ed will follow with our updated outlook and additional financial insights. Now, I will highlight the key drivers behind our performance this quarter.
First, continued cloud growth driven by increasing consumption and rising demand for a true system of record as a service, delivering scale and universality. Second, the sustained momentum in our Security business as customers prioritize end-to-end protection and governance amid rising software supply chain attacks. And finally, I will highlight our ongoing innovation that leads to solid adoption of our platform and Enterprise Plus subscription growth. Let me start with our cloud business. As mentioned earlier, cloud revenue in Q1 grew 50% year over year, an exceptional result that reflects not one single driver, but a broader trend we have been observing over the past several quarters.
As AI makes human-to-technology interaction nearly costless, and software itself becomes increasingly commoditized, binaries become king. Organizations are actively encouraging developers to utilize AI coding agents as well as explore agentic capabilities, causing software output to accelerate, resulting in more compiled code—a true AI-fueled tsunami of binaries. Observing our customers' consumption trends, we noticed that this growth is not tied to one package type or a specific AI-native workload. It is not a spike in usage or a onetime increase in open source caching. It is the result of a fundamental shift in how software is being generated, delivered, and consumed across the software supply chain.
We are seeing an acceleration in the volume of compiled software flowing through the JFrog Ltd. platform. This trend, which began taking shape in 2025, is driven by two major forces. First, developers are being supercharged by AI coding agents. Simply put, the world is creating more software packages. In this AI mass-adoption reality, we see organizations willing to budget overruns until they gain better clarity on long-term usage requirements and prior to increases in annual commitment. Second, as AI drives more software creation, it is also accelerating the flow of all open source components. Open source consumption by developers and AI agents is rising across nearly every software package we support.
And as the ultimate Switzerland of binaries, JFrog Ltd. sits at the center of this growth. Whether through on-demand increased usage momentum or annual commitments, we believe JFrog Cloud is positioned to benefit from these trends. Now, to the continued momentum we are seeing in security. As we mentioned in previous calls, modern software supply chain security is moving beyond traditional DevSecOps and fragmented scanners. AI coding agents are increasingly securing, scanning, and even fixing code rapidly at scale. And while still evolving, we see agents replacing human skills in code protection. We believe a trusted software supply chain requires a single authoritative system of record for all binaries and AI artifacts.
Building on this foundation, we deliver protection and governance beyond traditional scanning—analyzing, tracking, and proactively blocking risks at the point of entry or before distribution to production. As AI adoption accelerates and binaries scale, the threat landscape is becoming more complex. Software supply chain attacks are rising, increasingly targeting open source creators and package maintainers. This dynamic drives the growing demand for a trusted control layer and stronger DevGovOps practices. In Q1, we again demonstrated that customers subscribed to JFrog Curation were effectively protected from recent software supply chain attacks. Curation serves as a critical control point at the gate, enforcing policies that ensure only trusted packages enter the system, keeping Artifactory clean.
Once artifacts are stored, JFrog Xray and JFrog Advanced Security continuously secure and govern the binary flow, providing ongoing visibility and protection. In addition, as advanced AI models like OpenAI's GPT, Cyber, and Anthropic's Claude become increasingly embedded in development workflows, we believe modern software supply chain security and governance are defined by four core pillars. First, a centralized system of record: a single source of truth across multi-agent environments. Second, universal governance: consistent visibility and enforcement across all types of artifacts, whether consumed or generated. Third, predictable and deterministic protection: continuous, policy-driven guardrails that prevent malicious or vulnerable components from progressing. And finally, comprehensive coverage: securing both newly generated assets and the extensive base of existing mission-critical legacy binaries.
Our customers tell us they are accelerating software development and generating more binaries through the JFrog Ltd. platform. As AI adoption expands, JFrog Ltd. provides a unified system of record to secure, govern, and manage AI-generated, open source, and all legacy binaries in one place. Our customer adoption, Q1 results, sales pipeline, and future roadmap innovation are aligned with these observations. Looking ahead, we expect security to remain a key growth driver for JFrog Ltd. This sets the stage for an update on the innovation we introduced at our annual LEAP conference in New York this past March. LEAP is JFrog Ltd.'s series of regional gatherings for top customers, held globally during the first half of every year.
At LEAP New York, we demonstrated GA-ready solutions to concrete customer needs for a trusted infrastructure layer for software supply chain management in the AI era. We introduced the JFrog MCP Registry, the first enterprise-grade registry for MCP servers, extending our platform to support the growing AI ecosystem. As MCP adoption expands, customers need a centralized, trusted way to manage, secure, and govern these new assets, which logically sits in Artifactory, a system of record. MCP is rapidly being adopted alongside agent skills, driven by AI ecosystem demands. In Q1, we expanded our platform for AI-driven development with the introduction of the JFrog Skills Registry, a centralized way to manage and govern reusable AI capabilities.
In collaboration with NVIDIA, we announced the Skills Registry at GTC, enabling the governance and trust layer enterprises need to run agentic workflows securely and at scale. We further announced that JFrog Artifactory will serve as a registry for AI models and agent skills within NVIDIA AICube Blueprint, part of the NVIDIA agent toolkits. The Vice President of Enterprise Partnerships at NVIDIA, Pat Lee, noted, quote, security and governance are key to deploying AI agents in the enterprise. JFrog's agent skills registry for NVIDIA NeMo/Claude supports security and control for deploying long-running agents to help scale enterprise productivity with powerful new AI tools. End quote.
JFrog Ltd. unifies all artifact types—binaries, models, skills, and MCP servers—into a single platform governed by one framework, one set of policies, and complete visibility and traceability in one place. These innovations, combined with a growing ecosystem of strategic partnerships, are driving increased adoption across the enterprise, amplifying the value of our Enterprise Plus subscriptions and accelerating its expansions within organizations. With that, I will hand it over to Ed for a detailed review of our Q1 financials and our updated outlook for Q2 and the full year 2026. Ed? Thank you, Shlomi, and good afternoon, everyone.
Ed Grabscheid: We are pleased by the results of our first quarter, which exceeded the top end of our guidance range on every metric. It was a strong start to the year, highlighting our consistent strategic execution and ongoing operational discipline. During the first quarter, total revenues equaled $154 million, up 26% year over year. These results demonstrate the continued execution of our go-to-market strategy, fueled by our cloud revenues, ongoing demand for our security core products, and growth in our Enterprise Plus subscription. Our first quarter cloud revenues grew to $78.9 million, up 50% year over year, now representing 51% of total revenues versus 43% in the prior year.
Our outperformance in the cloud was driven by robust usage across our customer portfolio, which exceeded contractual minimum commitments. We strategically work towards converting this usage into higher annual commitments. During the first quarter, our self-managed or on-prem revenues were $75.1 million, up 8% year over year. We continue to proactively engage our on-prem customers to migrate DevSecOps workloads to our cloud, and explore solutions better aligned with their specific use cases, including hybrid and fit-for-purpose deployments. In Q1, 58% of total revenues were from Enterprise Plus subscriptions, up from 55% in the prior year.
Driven by the ongoing execution of our enterprise go-to-market strategy and broader customer adoption of the JFrog Ltd. platform, revenue contribution from Enterprise Plus subscriptions grew 33% year over year in Q1 2026. Net dollar retention for the four trailing quarters was 120%, representing a year-over-year increase of four percentage points and a one percentage point improvement sequentially. These results highlight the continued adoption of our security core products, increased cloud usage across a broad set of conventional software packages and AI workloads, and conversion of customers with usage over minimum commitments into higher annual contracts.
We continue to demonstrate that our customers view JFrog Ltd. as a mission-critical system of record to their software supply chain, with gross retention that equaled 97% as of the first quarter 2026. Now I will review the income statement in more detail. Gross profit in the quarter was $129 million, representing a gross margin of 83.8%, versus 82.5% in the year-ago period. We remain focused on cloud hosting cost optimization, as we anticipate a larger share of our revenues being generated from the cloud. Given our expected increase in cloud revenue contribution to total revenue, we reiterate our annual gross margins to be in the range of 82% to 83% in 2026.
Operating expenses in the first quarter were $96 million, equaling 62% of revenues. This compares to $79.7 million, or 65% of revenues, in the year-ago period. Our operating profit in Q1 was $32.9 million, or an operating margin of 21.4%, compared to a 17.4% operating margin in 2025. The continued balance between strategic investments and operational efficiency demonstrates our commitment to profitable growth. Cash flow from operations equaled $38.4 million in the first quarter. After taking into consideration CapEx requirements, our free cash flow reached $37.3 million, or a 24.2% margin, compared to $28.1 million, or a 23% margin, in the year-ago period. Now turning to the balance sheet.
We ended the first quarter with $741.2 million in cash and short-term investments, compared to $704.4 million at the end of 2025. Given our strong balance sheet, consistent free cash flow generation, and confidence in our strategy to execute on durable growth opportunities, JFrog Ltd. announced in late February our first ever share repurchase authorization of up to $300 million in ordinary shares. As of 03/31/2026, our RPO totaled $574.9 million, a 36% increase year over year, highlighting the successful execution of our go-to-market strategy as customers continue to make larger, multiyear commitments to our DevSecOps solutions. As a reminder, our RPO excludes any benefit from customers' usage over contractual minimum commitments.
And now, let us turn to our outlook and guidance for the second quarter and full year of 2026. As we enter 2026, we remain encouraged by the strength in our pipeline and emerging AI workload trends driving increased cloud usage. Even as cloud usage trends accelerate, our guidance philosophy will remain unchanged as we continue to de-risk our largest deals due to timing uncertainties and any benefit from cloud usage above contractual commitments. Our outlook reflects growing contributions from our JFrog Ltd. security core products, ongoing adoption of our full platform, and cloud growth driven from higher annual customer commitments. We are raising our estimated full-year 2026 baseline cloud growth to be in the range of 33% to 35%.
Given the anticipated contribution from our security core and increased baseline cloud growth assumptions, we now expect our net dollar retention floor to be 118% for 2026. Turning to operating expenses, we continue to prioritize innovation across our platform. We remain committed to a disciplined spending philosophy and are confident in our ability to manage expenses and drive ongoing efficiency in line with prior execution. For Q2, we anticipate revenues to be in the range of $154 million to $156 million, with non-GAAP operating profit anticipated to be between $28 million and $30 million, and non-GAAP earnings per diluted share of $0.23 to $0.25, assuming a share count of approximately 126 million shares.
For the full year of 2026, we anticipate a revenue range of $628 million to $632 million, representing 18.5% year-over-year growth at the midpoint. Non-GAAP operating income is expected to be between $112 million and $116 million, and non-GAAP diluted earnings per share of $0.93 to $0.97, assuming a share count of approximately 128 million shares. Now I will turn the call back to Shlomi for some closing remarks before we take your questions.
Shlomi Ben Haim: Thank you, Ed. AI is transitioning from experimentation to tangible revenue, and we are seeing stronger momentum across our business. Looking ahead, demand signals for JFrog Ltd. remain strong, including the durable cloud growth driven by AI, which is accelerating usage. New logo ASP is rising, and demand for our security solutions, amid the increasing frequency of software supply chain attacks, is growing. To my fellow frogs around the world, thank you. This quarter, you did not just deliver; you rose above. No matter the circumstances, you kept pushing forward, navigating with resilience, innovating with purpose, and triumphing when it matters most—for our customers. Because of you, we do not just move forward, we leap further.
May the frog be with you. Operator, we are now ready for questions.
Operator: We will now open the call for questions. Please limit yourself to one question. If you would like to ask a question, please raise your hand now. If you have dialed in to today's call, please press 9 to raise your hand and star to unmute. Your first question comes from the line of Sanjit Singh with Morgan Stanley. Your line is open. Please go ahead.
Sanjit Singh: Yes. Thank you for taking the question, and congrats on a fantastic start to 2026. I had two questions for the team. I wanted to start with Ed first. Obviously, cloud growth, great total revenue growth in Q1. When I look at the outperformance versus what the estimates were, it seems like you guys came in about $7 million above on Q1. Q2, you guys came in ahead by a couple million bucks. So roughly $10 million. When you look at the raise to the full year, it is somewhat less than that, so I just wanted to sort of sanity check any sort of revised assumptions about the second half ramp? That was my first question.
And then I had a more strategic one for Shlomi.
Ed Grabscheid: Hi, Sanjit. Thank you very much for the question. It is a good question. We had a very strong quarter in Q1, as you highlighted. The growth in the cloud is 50%. And more importantly, we now see the mix in our cloud above 50%. We delivered 51% the first time. It is a milestone for JFrog Ltd. where we see more revenue coming from our cloud offering than we do from self-hosted. But we also are committed to our guidance philosophy, which is we will only guide on those commitments. So while we saw the strength in Q1, much of that was driven by usage over minimum commitments.
We are deploying our sales organization, of course, to convert that into annual commitments, but until it becomes an annual commitment, it will not be part of our guidance, aligned with our philosophy.
Sanjit Singh: That is very clear. I thank you for that, Ed. And then, Shlomi, the question for you is, it is a really timely one. Some of our own field work on JFrog Ltd. shows a real inflection in demand for the security side of the portfolio. It seems very clear to us, and I think you highlighted that in your script. At the same time, there is more of this longer-term structural debate on security overall and what the model labs will subsume. And there seems to be a take that things like scanning, vulnerability management, vulnerability scanning, posture management, code security could be more of the purview of model labs longer term.
And so to the extent that you guys have some exposure to those parts of security, I would just love to get your latest thoughts on the long-term durability of those pieces of the security product portfolio.
Shlomi Ben Haim: Thank you, Sanjit. Good question. What we see in the market is a kind of flooding of software supply chain attacks, coming mainly around open source maintainers, and the hackers are going after them. JFrog Ltd. is positioned to secure our customers from that quite strongly. We called that in the script when we said that all the JFrog Curation customers were actually protected from those software supply chain attempts to attack. Moving forward, what is the real question? The real question is can you really secure and govern the binaries, the artifact, the outcome of AI? And what JFrog Ltd. provides is not only a place that scans.
Scanners are important, but the system of record of where you secure, manage, store, and govern your artifact is actually more important because, in the world of multi-agents that are all building and scanning and protecting and even fixing software, you still need to host it in a secure place. The second thing is you will have to protect yourself from the open source world that will still exist—the Python, the npm, the Hugging Face, the Docker—which is what JFrog Ltd. is doing at the gate. And the third thing is how you combine security of the new outcomes coming from agents or multi-agents with the legacy that is now being built.
You still need to manage dependencies with the binaries of yesterday that are still hosted, still regulated, and still on the servers in your production. The combination of the expertise that we built around binary security and not source code—because this is a big confusion in the market; coding agents are now securing source code, replacing human beings—together with the moat around Artifactory, the system of record, in a multi-agent world, the open source on top of it, and including the legacy, I think, gives JFrog Ltd. customers the confidence to bet on us. This is also one of the things we called out—new logos are now buying JFrog Ltd. with security, knowing that this is the future.
Operator: Your next question comes from the line of Radi Sultan with UBS. Your line is open. Please go ahead.
Radi Sultan: Awesome. Thank you so much. And I echo my congrats on a really strong start to the year. Maybe just two quick ones. Shlomi, on legacy code modernization, we have been hearing an uptick in JFrog Ltd. getting pulled along in AI legacy code modernization deals. So, Shlomi, if you could just talk through how big of an opportunity is legacy code modernization for JFrog Ltd., and where do you expect to see the biggest potential pull-throughs to your business? And then maybe, sorry, one more quick one for Ed. Could you speak to how impactful your AI-native customers were to the strength in Q1? Just want to get a sense of how broad-based the strength was. Thank you, guys.
Ed Grabscheid: Thank you, Radi. Maybe I will start and Shlomi will take it from there.
Shlomi Ben Haim: When we speak about legacy, we speak about legacy binary code, not source code. Basically, what you currently have in production is what we call legacy. You have to regulate it for the next seven years if you are a bank, or the next 45 years if you are an automaker. This is legacy. These are binaries that were built today or yesterday, and tomorrow, with coding agents, we will still have dependencies that are in your servers in production. This means that those binaries also need to be first-class citizens in the system of record. Otherwise, how can you protect what has been secured to be shipped?
What was made yesterday and approved and governed by the organization still needs to be maintained in the system of record. So it is a very important asset that our customers are still protecting, while coding agents are building the new compiled code, the new binaries, which are also scanned and protected by JFrog Ltd.
Ed Grabscheid: And regarding the question on the AI-native companies, we had a successful Q1 driven not only by a broad set of AI-native customers but also by traditional, non-AI-native customers. You recall last year, we talked about a $1 million land with an AI-native customer that renewed, and we are in continuous conversations with many of the large AI-native companies, and we will share more updates later.
Shlomi Ben Haim: Radi, if I may add to it, serving the AI labs is important, and we take pride in it and we are very honored. But I think that once you become the power grid of these AI labs' software supply chain, you learn much more about how you should serve the rest of the portfolio. And that is the big plus—not a $1 million here and a $1 million there—but mainly what we are building together with them as we power the software supply chain.
Operator: Your next question comes from the line of Michael Cikos with Needham. Your line is open. Please go ahead.
Michael Cikos: Hi, team. Congratulations on the strong start to the year here. And one of the things we have been going through this earnings season, which is still pretty quick on the heels of the SaaSpocalypse—which seems overinflated at this point—but one of the things we are seeing is the budget is there for strategic vendors. And so I am wondering, when you are speaking with customers, is it fair to assume that this evolution of the agentic stack or how AI is playing out is causing customers to rethink the need to modernize their existing architecture? And as a result, JFrog Ltd. is being pulled into that conversation and benefiting with respect to cloud migrations.
Can you talk to what the tempo of conversations you are seeing out there actually is like? And then I just had a quick follow-up for Ed.
Shlomi Ben Haim: Yes. Thank you, Michael. What do we hear from the market? What we hear from our customers is that every application—to your point about SaaS companies—every technology that was built for human interaction with technology is being questioned now. Everything. Every application—even source code. Source code, as we mentioned in the prepared remarks, is something you can now produce at an experimental level, and you can do it a thousand times faster.
Michael Cikos: But what happens—
Shlomi Ben Haim: When the machine language, the binaries, need to be maintained, this is where they start to be a bit more cautious about how they plan the future. For example, in order to enable AI, you need to use MCP servers. This is the interaction between machine and your solutions. MCP servers are yet another binary. This is where, to your point, JFrog Ltd. comes into the question: can JFrog Ltd. become my MCP registry for all the MCP servers? The same thing happened with NVIDIA when they asked us about skills. Skills for agents—yet another binary. Can JFrog Ltd. become the skills registry?
So what we are hearing is: how can I build a stronger, better, scalable, universal system of record to manage all of these binaries? Because in tomorrow's world, what will matter will be the machine language—not source code, not human language—zeros and ones. And this is what JFrog Ltd. has done for the past 17 years.
Michael Cikos: That is great to hear. Thank you, Shlomi. And Ed, for a quick follow-up here, just trying to peel back layers of the onion as far as the strength in cloud that you guys saw. Is there any way to further qualify—I do not know if you could talk to either the size of the cohort that drove the magnitude of that upside or how cloud overconsumption trended through the quarter from a linearity perspective? Can you just put any finer parameters around that strength?
Ed Grabscheid: It was a strong quarter from start to finish, Michael, to be honest with you. It was very broad-based. It was not concentrated in one geography or one industry. I will say that what you saw in terms of the cloud was represented in our increase in the cloud guide. We were very confident with what is happening right now in the cloud, and that is what gave us the ability to raise our guide from 30%–32% to 33%–35%.
Operator: Your next question comes from the line of Miller Jump with Truist. Your line is open. Please go ahead.
Miller Jump: Hey, great. Thank you very much for taking my question. I will echo my congrats on a really strong start. You know, last year you guys were talking about AI experimentation driving consumption beyond commitments. It sounds very different today from your prepared remarks. So can you just talk about the difference you see in the amount of binaries in your system reaching production now versus a year ago? And I would also say it sounds like there are still a number of customers that are maybe waiting to commit bigger. So what are you hearing in terms of their hesitancy? Thanks.
Shlomi Ben Haim: Oh, Miller, this is an awesome, awesome question. Basically, you are saying source code is being produced at a completely different pace, a completely different volume. Everything produces source code now. It is not just human developers; it is all the coding agents together with the human developers. So the big question is, do we see binaries growing at the same pace? You can think about it as digital photography replacing film. Film was expensive. You would take one shot at sunset before you printed it; now assume that you can take 200 of them, and instead of printing one—instead of posting one to your social network—you will now post five. Binaries are the asset you will take to production.
Source code became cheap, and now you can make more binaries that need to be immutable. They need to be tracked. They need to be governed. And because of the change AI brings, you will see this convergence on binaries and on what you can take to production. The same goes for governance. How do you make sure—with the same metaphor—that the picture you posted on your social network does not carry your home address in the background? This is what JFrog Ltd. brings: not only dealing with the volume of new secure pictures, but also governing what goes out.
Operator: Your next question comes from the line of Howard Ma with Guggenheim. Your line is open. Please go ahead.
Howard Ma: Thanks, and congratulations on a strong quarter. I have two questions. My first is I would like to better understand how exactly JFrog Ltd.'s revenue benefits from Curation and Advanced Security. I believe there are a few parts: first, tier upgrades—you have to be on Enterprise X and Plus to qualify to buy those products; then the commitments themselves, which you obviously collect; and then over-usage, I believe driven by increased traffic from attacks. So I just wanted to run that by you—are those elements correct?
Shlomi Ben Haim: Yes. I will start by speaking about JFrog Curation, JFrog Advanced Security, and JFrog Xray, and Ed can speak about the over-usage and what is counted. Listen, everything that comes from open source, pulled by agents—AI agents—or by a human developer, is something that needs to be protected before it steps into Artifactory, your single source of truth. When we built Curation, it was based on customers' requests. They asked us to give them a firewall that would enforce policies on what comes in. That was at a completely different volume when it was humans pulling open source packages.
Now, when requests to pull open source packages from public hubs—whether npm, Docker, Hugging Face, Conda, or PyPI—come a thousand times faster, you know that you are subject to attack. And the attackers are also using coding agents. They have also become more sophisticated. They are also going after the maintainers they know will spread their malicious packages by an order of magnitude. What Curation did very successfully was not only apply this firewall enforcing your policies, but also scale to this level of AI. And this is why our customers not only embrace Curation, but have also increased their demand for it after every attack we have seen since 2025, which I alluded to this quarter with MCP and Python and others.
Regarding JFrog Advanced Security and JFrog Xray, once it is inside your system of record—once it is inside Artifactory—you need to still maintain the security of your software supply chain. You need to look for secrets that were exposed. You need to look for dependency graph security, and this is what JFrog Advanced Security and Xray are doing. And then, when you ship to production, you ship something that you can actually trust.
Ed Grabscheid: And, Howard, regarding the monetization of Curation, the monetization is based on seats. This is a common currency in security, and we monetize based on seats. So regarding an increase in attacks—it certainly drives demand from our customer portfolio and from new customers to take an increased number of seats or to adopt Curation. It does not necessarily drive data consumption. Data consumption is driven by packages coming in and out of the organization or going into production. So Curation itself is not driving the usage over minimum commits.
Operator: Your next question comes from the line of Mark Cash with Raymond James. Your line is open. Please go ahead.
Mark Cash: Great. Thank you. Hi, Shlomi, maybe I wanted to build off a few previous questions and ask about MCP Registry and AI Catalog. Because there are a lot of companies saying they will provide the visibility and security for AI agents. So, where in the customer journey do organizations realize they need JFrog Ltd. governance capabilities? What pain points are they seeing that others cannot solve before coming to you? Thank you.
Shlomi Ben Haim: Yes. Thank you for this question, Mark. What is happening now is that every software provider already provides an MCP server, because we all know that if agents cannot interact with your software, that is the end of your software's usage. MCP servers are binary code. No matter who provides them, it is binary code. So our customers came to us and asked for an MCP Registry.
As they trust npm packages inside Artifactory, or Python inside Artifactory, or Docker containers inside Artifactory, they also want a list of MCP servers that they can put in an MCP Registry—this is what we released this quarter—and then they can tell all of the AI agents or human developers, this is a safe place to take your MCP servers from. The same thing happened with skills, a fast-growing trend when you use coding agents, and there is now something of a movement to CLI tools, a third technology. All of the above are binary code. All of the above are natural expansions of our solution, and therefore they are stored in Artifactory.
Operator: Your next question comes from the line of Jason Celino with KeyBanc Capital Markets. Your line is open. Please go ahead.
Jason Celino: Great, thanks. Good afternoon. You know, the value proposition of Curation is quite compelling, and as you noted, these Curation customers were protected in Q1 from the software supply chain attacks that we saw on the news. It seems like a no-brainer to me and to most investors, but to the customer, what might be the alternative if they do not choose Curation? Or what factors are being considered that might be delaying that customer's decision? And, given you are seeing this tremendous demand, do you have the capacity to meet it? Thank you.
Shlomi Ben Haim: Jason, I think it is clear that for a very long time, JFrog Ltd. has said we are betting on a world of automation, a world where machines have to manage the asset, and therefore we never shifted our focus from managing binaries. Every binary management tool is an alternative. But consider the strong differentiators that JFrog Ltd. brings. The first is universality—JFrog Ltd. is the Switzerland of binaries. JFrog Ltd. not only serves all the binary types, but also all the coding agents, human beings, and other citizens that are using our solution. The second is scale—we have built 17 years of scalability.
We went with the biggest organizations on the planet to scale to their level, and now we are elevating it even more because of AI. So scalability matters. JFrog Ltd. is hybrid, giving you the freedom to run it in the cloud—on every cloud—and on-prem if that is what you prefer, if you are in a highly regulated environment. JFrog Ltd. integrates with all your ecosystem tools when it comes to DevOps, DevSecOps, and DevGovOps, which also gives you freedom of choice and keeps you out of vendor lock-in.
So if a solution came along that provided all of this in a universal way, and also complemented the AI change in a world of machine-language binaries, that would be a threat to JFrog Ltd. I believe we have put moats around what we build best, which is the system of record.
Operator: Your next question comes from the line of Kingsley Crane with Canaccord. Your line is open. Please go ahead.
Kingsley Crane: Hey, congrats on results, and thanks for taking the question. On the Q4 call, you called out that the November npm attacks had driven both immediate Curation revenue as well as building pipeline. Just trying to get a sense of if there is more urgency around procuring Curation or Advanced Security versus some of these larger software decisions that could take multiple quarters? And I guess more specifically, just on Q1, how much did Curation drive the upside in cloud in Q1? Thanks.
Shlomi Ben Haim: Well, listen, Kingsley. Every time that there is some kind of software supply chain attack, we see a rise in the pipeline, and obviously a lot of our customers are concerned, and that is an immediate impact. What happens when it is happening every few weeks? This is what is happening now. It is happening every few weeks. It used to be SolarWinds and then, a year after, Log4j, and then a year after something else. Now you refresh your browser—there is a software supply chain attack. And why? Because source code does not matter anymore. Source code scanning is something that used to be overappreciated. Now people understand what needs to be protected is what is going to production.
And the hackers, the attackers, understand that. They go after the maintainers of the open source packages, and this is what you have to protect yourself from. Now, will there be companies that react based on fear only? That trade-off will always exist, but we see more and more responsibility on the customer side, knowing that the magnitude they are looking at is completely different from what it was yesterday.
Operator: Your next question comes from the line of Shrenik Kothari with Baird. Your line is open. Please go ahead.
Shrenik Kothari: Yes. Thanks a lot for taking my question. So, Shlomi, Ed, you have been careful in the past not to oversell AI as an immediate revenue windfall and, in your own words, described 2025 more as an initial spark, not a bonfire. And, Shlomi, you drew a comparison now with the transition from film to digital. So as this higher-quality AI code from models like Opus and Codex is likely reaching more production—and we are hearing anecdotes about it—that definitely creates more valuable binaries for you to act as a system of record for. Just where are customers today on that journey from AI code slop to production grade?
And what specific indicators are you watching that would tell you that the fire has started or is going to start?
Shlomi Ben Haim: Good afternoon, Shrenik. I think that what we see today is more an experimental kind of mode. Everybody is trying everything. Not so long ago, we would speak about Copilot, and now everybody is speaking about Anthropic and Codex. And I think that a lot is being adopted in every organization, which means that—as we used in this metaphor before—you can take many more pictures. It does not cost you any film. But what we also see is that not even a single customer has a full autonomous process. This is not yet there. It is still a combination of human developers with coding agents.
There is no coding agent that starts from scratch, pushes to production, and maintains the production fully autonomously. So there are still some miles to go before AI completely takes over the developer's role. We are starting to see the collaboration between strong human developers and coding agents. This is why the water is rising on every front.
Operator: Your next question comes from the line of Brad Reback with Stifel. Your line is open. Please go ahead.
Brad Reback: Great, thanks very much. Shlomi, back to your comment during the prepared remarks about customers willing to, I will say, absorb meaningful overages in the cloud. What do you think is the gating factor, or why are they willing to do that and not commit and get a better rate? Thanks.
Shlomi Ben Haim: Yes. Hi, Brad. There is a race of AI adoption in every company now. No matter if you are a small or medium business, or if you are one of the largest banks in the world with 50,000 developers, it is coming from the board. It is coming top down. The board is asking about AI adoption, making sure that you are in the race and not falling behind. So what we see now is that usage in the cloud is also part of this experiment. Now, if you go to the CFO and you say, JFrog Ltd. asked me to commit, the CFO will ask, commit to what? And therefore, they leave the meter on.
They let you use more and even pay more. And now our team's mission is to make sure that we convert this over-usage into commitment to gain a win-win situation with our customers. It will come. It will just take a bit more time because of the predictability that is now missing.
Operator: Your next question comes from the line of Lucky Schreiner with D.A. Davidson. Your line is open. Please go ahead.
Lucky Schreiner: Great. Thanks for taking my question, and congrats from me as well. Very impressive quarter. Maybe a bit of a follow-up there. Previously, you have spoken to some customers preferring to buy JFrog Ltd. on a self-managed basis, given the better visibility and cost controls around that. But I did not get a sense of those trends from the prepared remarks today. So one, is that fair to say? And two, is there maybe any potential reason for a change in those trends? Thanks.
Shlomi Ben Haim: Hey, Lucky. We still see customers asking for the self-hosted, or on-prem, solution. It splits into a few profiles. One is the big AI labs that are building their own data centers. They have enough money. They have enough capital. They do not want to share anything with the public cloud for whatever reason you can imagine, and they will take an on-prem solution and embed it into their software supply chain architecture.
The second group is the highly regulated companies—government institutions, or any organization that needs to be highly regulated—and they will, for sure, do a lot of these tests and experiments in an on-prem environment before they go to the cloud or to FedRAMP. And last, we see companies that are well established and seasoned players in the on-prem environment, so they are not looking to have one or two entities in their world playing in the cloud. They keep extending their own on-prem.
But as you can see in our numbers, it is not only part of our strategy to migrate our business to the cloud—and this quarter, we announced that cloud crossed 50% of total revenue for the first time—it is also a benefit we keep in our pocket: we are the only company that gives you a full hybrid solution with full freedom of choice. No matter who you are, we can give you the freedom to embed or adopt AI in your environment.
Operator: Your next question comes from the line of Jason Ader with William Blair. Your line is open. Please go ahead.
Jason Ader: Yeah, thank you. Good afternoon. I wanted to revert to an earlier question, which was asked about the risk that the LLM guys encroach into the binary layer. And, Shlomi, I was hoping you could talk about some of the announcements that the labs made during the quarter. They started to talk at least a little bit about binaries. It was too technical for me, honestly. So if you could help enlighten us and just talk about what they announced around binaries, and why it is not something that you worry about.
Shlomi Ben Haim: Hi, Jason. Let me start with the last sentence. I am worried about everything. There is nothing that I am not worried about. But I have the confidence that what we are building alongside these companies is complementary to what the world is demanding. What you have heard about is reverse engineering binaries. I guess that you are referring to the OpenAI announcement about Cyber. This is a way to take the binaries themselves and reverse engineer them to see what they were built from. That is not replacing JFrog Ltd.'s solution.
Even if you fast-forward two years from now, when every organization uses OpenAI next to Anthropic, next to Cursor, next to Copilot, next to Gemini, I guess we will all agree that even in that environment, they all build binaries, and you need a governance tool that provides a universal solution to contain them all. The second thing: who will protect the open source? It is not reverse engineering the open source packages; it is reverse engineering what the agent built itself. So when you bring something from npm or something from Docker, how do you make sure that what you brought into the organization passed your firewall?
How do you make sure that this is secure? The third thing is that this race is not just happening between the defenders and the vendors. This race is actually happening between the defenders and the attackers, because the attackers will also use Claude, and they will use Codex in order to build a more sophisticated malicious attack. So how can you make sure that the policies that you put at the gate are securing your system of record? And last, what happens when one authority should take a decision of what is going to production? Will it be one of the coding agents? Will it be two of them? Will it be a human being?
Will it be the company policy? This is the infrastructure we provide. We are not the policymakers; we are the policy enforcer. And we help you make sure that what goes to production came out clean from a non-poisoned reservoir. So while we see this happening, it is mostly focused on what an agent built—replacing human beings, replacing human language—rather than on what ultimately needs to be secured, governed, and trusted. That is how we see it now, and this is what our customers are telling us.
Operator: Your next question comes from the line of Andrew Sherman with TD Cowen. Your line is open. Please go ahead.
Andrew Sherman: Great, thank you. Great, congrats on the cloud numbers. Shlomi, on the security side, we have gotten a lot of questions on how much of your revenue comes from Xray since the labs now have similar products. It would be great if you could clear that up for people. How should we think about the contribution of that versus Advanced Security and Curation, and just the main barrier to entry for the latter, the Curation and such? Thanks.
Shlomi Ben Haim: Yes. Great question. Xray has always been part of our DevOps offering. And why is that? Because we do not think that Xray by itself should stand alone and do software composition analysis. We think that Xray should run over your Artifactory, making sure that your containers, four tiers down, are secure inside Artifactory. Can ten other tools replace that? Yes. But if you get it as part of your package, using Artifactory, built into Artifactory, why do you need another tool? The second thing Xray brings is an understanding of what is coming out of the open source environment, and the ability to break that into pieces and secure it.
Can it be brought from the outside, like a point-solution tool on top of Artifactory? Yes. But what we see is that thousands of customers prefer to take it as part of their DevOps subscription with JFrog Ltd., knowing that it is a built-in solution on top of your system of record.
Operator: Your final question comes from the line of Koji Ikeda with Bank of America. Your line is open. Please go ahead.
Koji Ikeda: Yeah, hey, guys. Thanks so much for squeezing me in. When I look at cloud, the net new revenue added this quarter, I think, is the most ever in a quarter, let alone a first quarter. And so that absolutely implies customers are spending above commitment levels like never before. So why is it? Or maybe the better question is, how long do customers typically take before they come to JFrog Ltd. and start renegotiating their contracts for higher commitment levels, which presumably come with better volume discounts? Thank you.
Shlomi Ben Haim: Yes. Regarding the why, I will make it simple. We have said it before: more code means more binaries means more JFrog Ltd. And JFrog Ltd. is well known for being the binary people. Regarding how long it takes, we are actually not waiting, Koji. This is part of our practices—our enterprise sales practices changed something like two years ago. We go to those customers with a better offer, with a better plan, if they commit. The question is how long it will take for this experimentation to mature into a commitment discussion.
And this is something that I am sure we will keep following, and we will provide you with more clarity on where the cloud goes. But if you look at the confidence in our guidance, we raised the cloud guide for the year—although we see that a lot of the strength comes from usage over commitment—because we know that this is not a spike. We have seen this growth in the cloud for a few quarters already.
Operator: This concludes the question and answer session. I will now turn the call back to Shlomi for closing remarks.
Shlomi Ben Haim: Everyone, thank you for your questions. Thank you for your trust. May the frog be with you, and may we have a great year.
Operator: This concludes today's call. Thank you for attending. You may now disconnect.
