

DATE

May 14, 2026 at 5 p.m. ET

CALL PARTICIPANTS

  • Chief Executive Officer — Dinakar Munagala
  • Chief Financial Officer — Harminder Sehmi

TAKEAWAYS

  • Revenue -- $2.7 million, representing a 172% increase year over year driven primarily by existing and expanded contracts despite a memory supply constraint.
  • Major contract expansion -- The NeoTensr contract expanded to a potential total value of $70 million, with $11 million in fulfillment anticipated in the next quarter.
  • Strategic partnerships -- New agreements include Winmate, expected to deliver approximately $15 million in first-year business, and Datacomm, which will contribute in later quarters through a Nokia (HEL:NOKIA) partnership.
  • AI services launch -- The company announced its first application service, face recognition, launching as part of Blaize AI Services; further services such as intelligent document processing are under development.
  • Revenue guidance -- Full year 2026 revenue guidance remains at $130 million, with management reaffirming a meaningfully back-half weighted revenue profile.
  • Gross margin -- Gross margin was 58%, up from 11% in the previous quarter; the mix shift toward higher-margin software and hardware, as well as the deferral of an HBM-intensive order, drove this increase.
  • Margin outlook -- Blended gross margins for the next two quarters are expected to be compressed due to a higher mix of third-party hardware, but management projects margins to exceed 30% in the fourth quarter.
  • Operating expense -- Total operating expenses, including $8.9 million in stock-based compensation, were $25 million, down $14.7 million year over year.
  • Net loss -- Net loss for the quarter was $22.7 million, a material improvement from a $147.8 million net loss in the same period last year, which was affected by noncash and one-time items.
  • Adjusted EBITDA loss -- Adjusted EBITDA loss was $13.9 million, $1.5 million better year over year, but $1.9 million higher quarter over quarter.
  • Cash position -- Cash on hand was $33.3 million as of March 31, 2026, prior to the May 6 close of a $35 million equity offering supported by institutional investors.
  • Supply chain mitigation -- Blaize expects its forthcoming partner-branded servers, which do not require HBM, to reduce supply chain risks, with shipments beginning in the second half of the year.
  • AI services revenue contribution -- CFO Sehmi said that "probably about 15% to 20%" of 2026 revenue guidance is attributable to AI services, taking "a combination of the hardware and some of the software."
  • HBM dependency reduction -- CFO Sehmi stated, "maybe 20% or so would be the HBM sort of intensive stuff," with the majority of revenue expected from DDR and LPDDR memory-based solutions.
  • Equity raise use -- Proceeds from the $35 million equity raise will fund commercial deal commitments, AI services development, rack-scale hybrid platform advancement, and next-gen platform development.


RISKS

  • Gross margin compression is expected in the next two quarters due to a higher portion of third-party hardware in the revenue mix.
  • Industry-wide high-bandwidth memory (HBM) shortages delayed key orders and remain a supply risk, though partially mitigated by product roadmap changes.
  • CFO Sehmi cautioned, "I think the macro sort of environment is still something that we all need to keep an eye on," citing ongoing supply chain unpredictability.

SUMMARY

Blaize Holdings (BZAI) reported a sharp year-over-year revenue increase and expanded its flagship NeoTensr contract to $70 million in potential value, despite fiscal Q1 revenue (period ended March 31, 2026) being held back by a global memory shortage. The company launched Blaize AI Services, signaled the forthcoming rollout of face recognition applications, and confirmed that high-margin, recurring revenue streams from these offerings will become increasingly material by year-end. A $35 million equity raise strengthened liquidity and extended the company's cash runway through mid-2027, supporting aggressive investment in new partner contracts and product expansion.

  • Blaize's hybrid AI architecture is positioned to capitalize on a market shift towards sovereign, edge-based inference infrastructure.
  • Strategic engagement with Nokia (HEL:NOKIA) and Datacomm is building a pipeline for multisite, multiphase deployments in Asia Pacific, with European and North American opportunities under active development.
  • Management confirmed that recurring revenue from application-level services will ramp in the fourth quarter as AI services are incorporated into customer use cases and contracts are executed.
  • Uptake of Blaize's lower power, DDR-based solutions may accelerate the transition away from HBM-dependent product lines, reducing supply chain risk and supporting gross margin outlook into 2027.

INDUSTRY GLOSSARY

  • HBM (High-Bandwidth Memory): A type of advanced memory chip used for AI training and inference workloads that enables high data transfer rates, often creating supply chain bottlenecks in AI hardware markets.
  • DDR/LPDDR: Standard forms of dynamic random-access memory (DRAM) used in server and embedded solutions, offering lower cost and energy usage compared to HBM.
  • Hybrid AI: An architecture that combines multiple AI accelerators (e.g., GPU and GSP) within the same infrastructure to optimize for different workload types and application services at the edge or in data centers.
  • Rack-scale: Infrastructure design where compute resources, accelerators, and networking equipment are integrated at the rack level to provide flexible scaling and high-density AI performance.
  • AI services: Application-level software offerings delivered on top of AI infrastructure, generating recurring per-query or subscription revenue streams.

Full Conference Call Transcript

Unknown Executive: Before we begin the prepared remarks, we would like to remind you that earlier today, Blaize Holdings, Inc. issued a press release announcing its first quarter 2026 results. Earnings materials are available on the Investor Relations section of Blaize Holdings, Inc.'s website. Today's earnings call and press release reflect management's views as of today only and include statements related to our 2026 financial guidance, revenue, gross margin, competitive position, anticipated industry trends, market opportunities, products and financing opportunities, all of which constitute forward-looking statements under the federal securities laws. Actual results may differ materially from those contained in or implied by these forward-looking statements due to risks and uncertainties associated with our business.

For a discussion of material risks and other important factors that could impact our actual results, please refer to the company's Form 10-K and Amendment #1 to Form 10-K for the year ended December 31, 2025, and our Form 10-Q for the period ended March 31, 2026, including the Risk Factors section therein and today's press release, both of which can be found on our Investor Relations website. Any forward-looking statements that we make on this call are based on assumptions as of today, and other than as may be required by law, we undertake no obligation to update these statements as a result of new information or future events.

Information discussed on this call concerning Blaize Holdings, Inc. industry, competitive position and the markets in which it operates is based on information from independent industry and research organizations, other third-party sources and management's estimates. These estimates are derived from publicly available information released by independent industry analysts and other third-party sources as well as data from Blaize Holdings, Inc.'s internal research. These estimates are based on reasonable assumptions and computations made upon reviewing such data and Blaize Holdings, Inc.'s experience in and knowledge of such industry and markets. By definition, assumptions are subject to uncertainty and risks, which could cause results to differ materially from those expressed in the estimates.

During this call, we will discuss certain non-GAAP financial measures. These non-GAAP financial measures should be considered as a supplement to and not a substitute for measures prepared in accordance with GAAP. For a reconciliation of non-GAAP financial measures discussed during this call to the most directly comparable GAAP measures, please refer to today's press release. Now I'd like to turn the call over to Dinakar Munagala, CEO of Blaize Holdings, Inc.

Dinakar Munagala: Thank you, Lana, and good afternoon, everyone. We came off a breakout growth year in 2025, and we expect 2026 to continue that trend. Q1 strengthened our commercial foundation through several new contracts and partnerships. First, we expanded our NeoTensr contract, bringing the total potential value to $70 million. We signed a strategic partnership agreement with Winmate, a publicly traded leader in ruggedized computing with the intent to close approximately $15 million in business in the first year. We deepened our joint engagement with Nokia across Asia Pacific. Together, we stood up a joint AI innovation lab advancing hybrid AI rack scale development. The engagement also includes a strategic partnership with Datacomm, one of Southeast Asia's leading cloud service providers.

Finally, we announced Blaize AI Services and will bring our first application service to market. Q1 revenue came in at approximately $2.7 million. This reflects a global memory shortage that limited server availability from one of our trusted suppliers and delayed orders. Customer demand remained intact throughout the quarter. We expect to secure the inventory needed to deliver over $11 million to a single customer in the second quarter of this year and we are reaffirming our full year 2026 revenue guidance of $130 million.

At GITEX AI 2026, in April, one of the largest AI showcases in Asia, we announced Blaize AI Services, which we expect to turn AI infrastructure into production-ready APIs that cloud service providers, data center operators and system integrators can deploy, monetize and resell. Today, we are going to announce the next step in execution, the upcoming launch of our face recognition AI service, the first in a series of application-level services running on the Blaize Hybrid AI platform. Why does this matter? AI services will complement our hardware sales with recurring application layer revenue per query. It's higher margin, it's stickier, and it scales with our partners' growth, not just with their CapEx cycle.

Face recognition is the first proof point, additional high-demand services, including intelligent document processing will follow. We have signed a contract with NeoTensr that is expected to generate up to $50 million in revenue in the first year. This builds on more than $20 million in revenue that we recognized in Q4 of 2025, bringing the total potential value to approximately $70 million. The development uses a co-branded AI server built on Blaize Quad card. Each server handles 200-plus simultaneous camera streams with advanced AI analytics while running LLM and VLM inference on the same infrastructure.

This is what our hybrid AI architecture was built for, real-time perception at the sensor layer, advanced reasoning on the same rack, no round trip to a distant cloud. The rollout is expected to span multiple cities across Asia Pacific in multiple phases. Each phase is expected to drive higher-margin revenue as the AI services layer takes hold. Earlier this month, we entered into a strategic agreement with Winmate. Together, we will integrate Blaize AI into ruggedized systems, drones, handhelds, vehicle-mounted units and embedded devices for mission-critical operations, border security, maritime, essential infrastructure and field health care. Beyond the contracts I just described, we are advancing a series of rack-scale hybrid AI engagements anchored by our joint partnership with Nokia.

This work reaches cloud service providers and infrastructure partners. These opportunities are multisite, multiphase with hundreds to thousands of edge nodes per program. They span smart city, sovereign data center and large-scale ruggedized field use cases. The architecture is hybrid GSP plus GPU at rack-scale, orchestrated by Blaize AI Services stack. The pattern is consistent. Customers want sovereign control of their data. They want efficiency. They want application-level AI services they can resell. Hybrid AI delivers all 3. Stepping back, the AI infrastructure conversation is shifting fast. A year ago, the industry was focused on one thing, massive centralized GPU clusters for training.

Today, the conversation moved decisively towards sovereign language model, inference at the edge, in-country at unit economics that actually work at scale. That shift is what Blaize was built for. Three pillars: number one, sovereign AI infrastructure. Governments and large enterprises across Asia, Middle East and Europe demand compute that stays within their borders under their control. Hybrid rack-scale enables this without hyperscaler economics. Number two, smaller LLM-based AI services. Most enterprise AI workloads do not need a frontier model. They need a tightly tuned domain-specific model on infrastructure they can afford. Our hybrid architecture runs vision and language workloads on the same rack, opening the service revenue our partners can monetize for query. Number three, programmable energy-efficient compute.

This is where the Blaize GSP advantage compounds. Performance per watt, deterministic latency, a software stack that serves vision, LLM and VLM workloads on the same hardware. Hybrid rack-scale is the unit of deployment for the next phase of AI. We are building toward it, and our partners are buying in. On May 6, we closed a $35 million equity offering, supported by a group of large institutional investors. This capital strengthens our balance sheet. The proceeds will support our commercial deal commitments, continued AI services development, rack-scale hybrid platform advancement and next-generation platform development. Blaize is a company executing against one of the most significant opportunities in AI history.

Rack-scale hybrid AI, sovereign infrastructure, the strategic path for recurring AI services revenue and partnerships that put Blaize at the center of the AI inference build-out. Contracts are expanding, partnerships are deepening across an increasingly diverse base of AI use cases. And finally, engagements are advancing in the field. So with that, I'll turn it over to our CFO, Harminder Sehmi.

Harminder Sehmi: Thank you, Dinakar, and good afternoon, everyone. I'm pleased to share our first quarter 2026 results today. First quarter revenue was $2.7 million, up 170% (sic) [172%] year-on-year and in line with the pre-release issued on April 14. As we flagged at that time, this was impacted by an industry-wide shortage of high-bandwidth memory or HBM, the specialized memory chip that is necessary for AI servers primarily used for training or running large language models. That shortage delayed an order to one customer, NeoTensr, that we now expect to fulfill in the second quarter at a value of more than $11 million. This is a timing issue.

Customer demand remains strong and over 70% of the revenue billed to NeoTensr in Q4 of last year has been collected to date. Beyond NeoTensr, revenue in the quarter included delivery of software licenses and servers to our primarily U.S.-based customer drawn from inventory on hand. As noted on earlier calls, our road map for hybrid servers mitigates against these challenges. Our partner-branded servers powered by Blaize cards deliver competitive AI inference performance without requiring HBM. We expect those servers to begin shipping in the second half of this year, and we have already placed forward orders for Blaize chips and cards.

We're exploring ways in which to strategically procure certain memory cards now to meet our projected demand into 2027. We believe this approach helps derisk our projected revenue growth as the data center opportunities begin to crystallize. In parallel, we are developing a comprehensive rack-scale service solution to address data center inference workloads. We will continue to deliver enhancements to the application features on our AI services platform throughout the year. Given the timing of large orders and the early stage of data center expansion, we expect revenue to be back half weighted this year with visibility increasing as opportunities convert. Gross margin was 58% this quarter, up from 11% in the fourth quarter of 2025.

Two factors drove the expansion. First, the mix shifted towards our higher-margin software and Blaize-powered hardware. Second, the HBM-intensive NeoTensr order shifted into the second quarter. As previously indicated, blended gross margins are expected to be compressed by the higher portion of third-party hardware in our revenue mix in the next 2 quarters. As we begin the transition to deliver more inference servers and recognize recurring software revenues, blended gross margins in the fourth quarter of 2026 should exceed 30%. We anticipate further expansion in gross margin in 2027 as our partnership with Nokia opens additional data center opportunities globally.

Net loss for the first quarter was $22.7 million compared to the net loss of $147.8 million for the same period a year ago. Q1 of 2025 included significant noncash items and onetime merger transaction accounting adjustments. Consistent with previous calls, I'd like to spend a few moments breaking these numbers down to provide clarity about the underlying results, including singling out quarter-on-quarter trends where helpful. Total operating expense, including stock-based compensation of $8.9 million was $25 million in this quarter. This was a decrease of $14.7 million year-over-year. Q1 of 2025 included $11 million of stock-based compensation and $12 million in transaction expenses related to the business combination. The cleaner story is in our operating discipline.

Research and development costs of $5.8 million in the first quarter, excluding stock-based compensation, were marginally lower than the prior quarter cost of $5.9 million. Selling, general and administrative expenses, again, excluding stock-based compensation were $10 million in the first quarter of 2026, up $1.6 million sequentially. Adjusted EBITDA loss for the first quarter this year was $13.9 million, $1.5 million better than the loss in the first quarter of 2025 and $1.9 million higher than the fourth quarter of last year. We ended the first quarter with a cash balance of $33.3 million on March 31, 2026.

On May 6, we announced our $35 million equity raise that extends our runway to the middle of 2027 and adds a new base of shareholders. This round drew strong participation from high-quality institutional investors with deep expertise in data center infrastructure investments. This growth capital will enable us to deliver against demand to accelerate customer rollouts, lean into the data center opportunity and invest in our product road map. We maintain close relationships with our key vendors and continually seek to secure favorable payment terms, which is particularly important during this period of supply chain constraints. As our data center opportunities gain momentum, we also intend to explore appropriate project financing partnerships to support deployments at scale.

Finally, our revenue outlook for full year 2026 remains unchanged with the second half meaningfully stronger than the first. Our adjusted EBITDA loss guidance also remains unchanged at between $45 million and $50 million for the year. In closing, our recent equity raise was well subscribed and drew strong participation from marquee investors with exposure to the data center infrastructure ecosystem. Our AI services platform and rack-scale hybrid AI developments are resonating strongly as the market shifts towards inference and real business outcomes from AI. And finally, we have great and growing partnerships in place to support revenue growth. With that, I'll turn it back over to the operator.

Operator: [Operator Instructions] Our first question comes from Kevin Cassidy with Rosenblatt Securities.

Kevin Cassidy: Congratulations on maintaining the $130 million for the year. When we look at that $130 million, how would you expect it to be spread across geographically for you?

Harminder Sehmi: So it's -- the NeoTensr contract, of course, is expected to contribute a significant portion of the $130 million. There are other opportunities in Asia Pac through the Nokia partnership. Datacomm is the one that we announced. That should start to feature towards the end of Q4. And we have other edge opportunities in Europe that are also expected to be part of that $130 million number. So it's spread around Europe, Asia Pac. Dinakar, I don't know if you want to add.

Dinakar Munagala: Yes, the pipeline is quite strong in North America as well. And we are beginning to discuss some commercialization via orders in the U.S. as well as in Africa. As they materialize, we'll, of course, be sure to announce them.

Kevin Cassidy: Okay. Maybe could you also talk about the effect that maybe the war in Iran might have on some of your opportunities there for security?

Dinakar Munagala: We have actually received significant inbounds for our drone detection system use case that we've demonstrated. This is all about perimeter security kind of use cases. And yes, there's an increased momentum in terms of opportunities coming our way. Of course, as these materialize into POs and revenue, we will keep announcing them.

Kevin Cassidy: Okay. And just one more question on the supply chain. So I think in your pre-announcement, you had said that you're expecting product to be shipped in the April quarter first. Did that happen? And is it only the memory that's the long lead times? Or are you having trouble with other products also?

Harminder Sehmi: So these are the HBM-intensive memory sort of servers and NeoTensr is one of the early customers for the business we do there. So it's actually obtaining the server itself. One of the reasons that we explained in Q1, we could have secured supply, but we would have actually had to pay premiums that we weren't prepared to at the time. As we move forward into Q3, Q4 and our hybrid servers become available and particularly the one we're really excited about is the one with NeoTensr, the white labeled one, which has our Quad PCIe card in it, then some of those supply chain problems should diminish somewhat.

But I think the macro sort of environment is still something that we all need to keep an eye on.

Operator: Our next question comes from Richard Shannon with Craig-Hallum Capital Group.

Richard Shannon: I'll ask a very quick tactical question here regarding the outlook here for the second quarter. Harminder, I think you mentioned you're targeting $11 million for one particular customer. Is that the estimate or starting point you would like us to think about? Or could it be somewhat or meaningfully higher than that?

Harminder Sehmi: It will be somewhat higher, but again, it depends on just getting -- maybe in our one-to-ones, Richard, we can talk a little bit more openly about that. But for now, we have good visibility on getting the NeoTensr delivered in addition to 1 or 2 others that we have in mind.

Richard Shannon: Okay. Perfect. Second question, I guess, for probably both of you, but I want to ask about the Blaize AI services. You're talking about the first application being face recognition rolling out here. I'd love to get kind of a few different questions about this. First of all, over what time period do you expect this to be rolled out and ultimately bring first revenue recognition for you? Are there any particular end markets where you expect to be first adopted? And then the last part is, how do we think about kind of the revenue contribution over the life cycle of your equipment relative to that equipment sale? Is there a percentage we should be thinking about?

Just any way to kind of provide a mental model for that, that would be great.

Dinakar Munagala: Sure. I can take the first part and then Harminder can jump in. AI services, certainly, it is exciting to our cloud service provider partners as well as data centers because it allows them to monetize their infrastructure that they've invested in and that's driving all the momentum. So initial application, of course, we have video-based applications that we are working on, which we're actually working with anchor partners as well as the facial recognition. And the initial target is around use cases around smart kitchens, around immigration, those class of use cases where face rec is pretty widely used. Initial anchor customers are in the Asia region.

Also things like citizen safety, elderly care, et cetera, there's some software that we've developed that is actually being well received. In addition to this, document processing is something that we will be next launching, and that's announcing -- it's already under development, and we will be releasing it to early access cloud service providers once it's complete. And this is actually quite helpful because from an economic standpoint, the cloud infrastructure that they invest will be monetized, the recovery, return on investment is much faster because they'll be able to monetize it through these services. I'll let Harminder...

Harminder Sehmi: Yes. Your other question was the time period. We expect from Q4 onwards to start to deliver some of the CapEx. So if you stand back, the AI services comprises of Blaize-powered servers, hybrid servers. So there's a certain amount of CapEx involved, which we recognize straight away. And then there is a recurring revenue element associated with monetizing the APIs. And that, of course, there will be some sort of a contract in place, but the revenue recognition will be monthly as usage takes place. But Q4 is when we start to see some of that featuring in our revenue mix.

I actually expect to see AI services as a whole becoming a significant feature of 2027 revenue mix and more of it being some of this recurring revenue because we have the opportunity to basically trade off some of the upfront margin that we would make on the CapEx sale in place of higher margin of ongoing software revenues.

Dinakar Munagala: And just to add that although we spoke about these 2 or 3 areas, there's quite a strong and compelling road map behind this that we are announcing and showing our early access partners, and it's resonating well with them. This is actually helping us significantly in terms of translating the conversations into actionable, how they place orders and become long-term partners with us.

Richard Shannon: Okay. Great. My last question, I'll jump on the line here is just a follow-up on Nokia. Obviously, a great partner to have here with worldwide reach. It seems like your first big partnership with Datacomm seems to be the kind of the champion of Indonesia here. How do we expect to see or how should we look for success in other places in Southeast Asia through Nokia? How are those developing? What should we expect to see from that during 2026?

Dinakar Munagala: So we started off about 6, 7 months ago with Nokia. And the initial action was to develop a joint pod, rack-scale offering that comprise both Nokia and Blaize hardware as well as AI services software. And we've demonstrated this at GITEX Asia. That was well received. And there's a pretty strong pipeline of customers behind that cloud service providers, infrastructure players, system integrators that we've been working with. And the first conversion is Datacomm, and there are others behind it. So as these contracts start materializing, we'll start announcing them. I don't know if you want to add any more.

Harminder Sehmi: And just the only thing I'd add is the other thing we're really excited about is the rack-scale hybrid server work that's happening right now because as you recall, a couple of quarters ago, we introduced the whole concept of AI services platform. And what we're now starting to see is that concept resonating really well with cloud service providers. Something that Dinakar has been mentioning for a while is the faster we can help these Tier 2 players to reduce their ROI through a combination of Blaize hardware and other partner solutions, then the faster we will see the adoption of real-world outcomes from AI being utilized by customers.

Operator: Our next question comes from Craig Ellis with B. Riley Securities.

Craig Ellis: I wanted to pick up where you left off talking about AI services and just clarify, inside of the expectation for $130 million in revenues this year, what have you incorporated for AI services?

Harminder Sehmi: If I take a combination of the hardware and some of the software, probably about 15% to 20%.

Craig Ellis: Got it. And then another lens into the $130 million, we've got more HBM-dependent configurations and HBM free configurations. If we look at the $130 million on the systems side of the business away from services, how does the expectation split between what's dependent upon HBM and what would be HBM free?

Harminder Sehmi: So if you'd asked me this question maybe 3, 4 months ago, I would have said a large portion of the NeoTensr early contract that we've got would be more HBM intensive. What's actually forcing a faster adoption of -- towards our hybrid solutions is the fact that these servers are now becoming uneconomic for some of the smaller players. So out of the $130 million, maybe 20% or so would be the HBM sort of intensive stuff. But I'd see -- I'd expect to see a migration. Our servers start to come on stream in the second half of this year and at scale.

And the faster we can get that done, the faster we can make sure that our own supply chain is unencumbered, then that transition will happen that much faster.

Dinakar Munagala: Yes. Just to add that quite a bit of momentum around our -- the fact that we were able to demonstrate a real end business case return on investment using DDR technology. I think that's actually resonating well with customers. So majority of the $130 million is based on DDR, LPDDR kind of memories.

Operator: [Operator Instructions] Our next question comes from Scott Searle with ROTH.

Scott Searle: Maybe just a couple of follow-ups on Blaize AI services. I wanted to clarify in terms of the ramping recurring model, is that revenue share? Or is that going to be purely capacity driven? And then also to follow up on a couple of the earlier questions, I think you said, Harminder, about 15% to 20% would be tied to that either in CapEx or otherwise in calendar '26. Is all of that to occur in the fourth quarter? And then what's the early thought process then in 2027? You said it would be significant. Just wondering if you could frame it for us. And then I had a couple of follow-ups.

Harminder Sehmi: Okay, sure. So the recurring revenue is partially revenue share, but we also have developed a very rich library of AI models, which we're already monetizing with some of the sales that we've made so far. So it's going to be a combination of the particular deals that we strike with the partners that we've got in cloud service providers, the cloud service provider partners that we get through rev share, through licensing of some of those libraries that we've developed. And then in -- yes, the 15% to 20%, I expect largely in Q4. It's just a question of when those servers of ours become available at scale.

Scott Searle: Got you. And in the past, you guys have talked about a total qualified opportunity pipeline. I'm wondering if you could give us some indication, in the ballpark, of where that might be. And Dinakar, there were a couple of comments that I found interesting. I think you referenced some opportunities in the United States. I wonder if you could talk about the application and the end market. And I think specifically, you said within Europe, more edge AI applications, and you've mentioned drones a couple of times. I'm wondering how small and scalable the solutions go. Are you going out to the drones themselves in ruggedized applications? Or is it other infrastructure that ends up doing drone detection?

Dinakar Munagala: So it's a combination of both. If you see, we do have this small [indiscernible] form factor that can go into a drone. So we do have a pipeline based on that. We also have the connectivity layer to a command and control center, where our servers reside. And there, you could do actions like drone detection, or any kind of early drone security warning, which is actually quite an interesting use case amid what's happening globally. So I'd say it's a combination of both. To the earlier question about the U.S., the range of opportunities starts with energy-efficient data centers. That's one of the initial driving things, because our servers are inherently lower in power.

And therefore, the OpEx for the end cloud service provider and the data center operator is much lower. At the same time, using our AI services, they can monetize the infrastructure. So that's driving the U.S. business. And I don't know if you want to --

Harminder Sehmi: Yes. So you asked about the pipeline. Look, the pipeline is constantly evolving. For us, it's a sizable number, and we're prioritizing the near-term opportunities, particularly those that leverage our hybrid AI services advantages. What we are transitioning to focus on, and I'll start to talk a lot more about this on the next call, is our contracts and POs, our bookings, our backlog, and revenue. I think these are much more meaningful metrics that enable folks like yourselves and investors to get a sense of how the revenue growth is developing.

Scott Searle: Very helpful. And lastly, if I could, I'll throw out one more. Just the competitive landscape, it's rapidly shifting and evolving out there in terms of edge AI and data center hybridization. I'm wondering who you're seeing on the short list and who you're really competing against besides the large obvious guys?

Dinakar Munagala: So we're actually complementing quite a bit of GPU-based designs. So people look at us as a healthy way to reduce both CapEx and OpEx. That's one. The second piece is that often the discussions are around, hey, I have these enterprises, right? They really care about the use case that they're trying to solve within a certain CapEx and OpEx budget. So really, those are the frameworks that we get in. And then having a programmable solution and the right software and AI services helps us piece together, along with our system integrator partners, solutions for the business.

So it's less about who's a head-on competitor and more about how we deliver a certain business value, and that's what is resonating and leading to wins.

Operator: I'm showing no further questions at this time. I would now like to turn it back to Dinakar Munagala for closing remarks.

Dinakar Munagala: Thank you for your questions. And let me share a few thoughts before we close. The inference market is here now, and Blaize is positioned at the center of it. 2025 was a breakout 20x growth year, and the contracts and partners that we discussed today are extending that trajectory into 2026. NeoTensr drives our Asia Pacific edge data center expansion with $70 million in total value. Nokia anchors our rack-scale and AI services engagements across cloud service providers and infrastructure partners globally, with Datacomm extending our reach across Southeast Asia. Winmate brings Blaize into ruggedized platforms for mission-critical operations and embedded edge infrastructure.

Customers are validating our hybrid AI rack-scale platform and our AI services layer as the right way to address the inference economy. The momentum is real, we're excited, and we expect to continue this trajectory in the coming quarters. Thank you for your time and continued support.

Operator: This concludes today's conference call. Thank you for participating. You may now disconnect.