When Moonshot AI's investors sat down with MiniMax's investors
Five individuals and two companies.


Over the past 18 months, AI has been racing ahead in China. Moonshot AI and MiniMax are two large language model companies often mentioned in the same breath: both fit the aesthetic preferences of dollar-denominated funds, and both have moved faster on fundraising. At WAVES 2024 last week, we invited investors from both companies to join us on stage: Yusen Dai, managing partner at ZhenFund; Yungang Huang, managing partner at Source Code Capital; Rui Han, partner at Gaorong Venture Capital; Yu Chen, partner at Yunqi Partners; and Ling Xia, partner at Mingshi Capital. The first three invested in Moonshot AI; the latter two in MiniMax.
From this conversation, we see the fascinating intersection of technology and investment: the hunger and non-consensus that come with everything still being early. When asked about their "initial decision," each person's answer was completely different — some believed in AGI, some believed in "end-to-end," and some simply "believed in this person."
Though they repeatedly emphasized they weren't here to "stoke rivalry," you could still feel the divergence between the two sides. On general-purpose robotics, an industry tied to the AGI concept, some investors on stage had already deployed capital in multiple companies. But Dai took a clear stance: "The industry is too early, not yet at the point where VC should invest."
The AGI industry is indeed still in its early days. And precisely because everything is still so early, some questions are genuinely difficult to answer for entrepreneurs and investors in the thick of it — after all, whatever they say becomes part of the historical record.
WAVES is a new summit IP launched by 36Kr last year; this was its second edition. The conversation was moderated by Jing Liu, editor-in-chief of "Dark Currents WAVES."
Below is an edited transcript:

"Dark Currents": Welcome everyone to WAVES! Before coming on stage, I promised our five guests that we definitely wouldn't stoke rivalry today. After all, everyone here is either an investor in Moonshot AI or an investor in MiniMax. But we also want world peace, so we've deliberately mixed up your seating to avoid things getting too tense. Let's start with each of you introducing yourselves in your own way.
Yusen Dai: We specialize in angel investing, targeting global Chinese entrepreneurs. We've made extensive angel investments in AI, plus AI applications — we're prepared.
Yu Chen: I'm your typical engineering nerd. I met Junjie Yan (MiniMax's founder) in 2021 and we hit it off immediately — we were both into the same computer science topics and really clicked — so I invested in him. We focus on early-stage tech investing and have backed many AI projects. For projects Yusen invests in, we always apply different benchmarks.
Rui Han: Over the past decade, Gaorong Venture Capital has consistently focused on projects that make people's lives and the world better. In the past year, we've made very significant investments in AI. We believe this is just the beginning, and we hope for more exciting developments ahead.
Ling Xia: For the past ten years, we've only focused on one thing: early-stage investing in technology. We were fortunate to catch the biggest opportunity of the past decade starting in 2014 — intelligent electric vehicles. We were the earliest institutional investor in Li Auto.
We also got into this wave of AI relatively early. In early 2022 and mid-2022, we invested in large language model company MiniMax and humanoid robotics company LimX Dynamics respectively. This morning, Mr. Huang and MiniMax founder Junjie Yan had an excellent conversation. The biggest investment opportunity over the next 8-10 years is AI, and we will continue to make systematic investments and deployments in the AI track.
Yungang Huang: We particularly value AI and everything AI enables in terms of applications — we're looking at both software and hardware. We've invested in Moonshot AI and GalaxyBot, an embodied intelligence company.
"Dark Currents": Our theme is discussing large language models, and we can start by talking about these two companies. Could you each recall your investment process for MiniMax and Moonshot? Because there are indeed voices saying large language models may not be a game for VC — most VCs in Silicon Valley missed OpenAI's earliest stage. When you went through IC, what was the most important thing you said?
Yusen Dai: We've always invested in people, and we're very focused on investing in young people — fitting today's theme. We believe major waves are often driven by young people. For example, OpenAI's founders were 30, 29, and 28 — very young. We've always sought out the most outstanding young talent. We were watching Zhilin Yang back at Tsinghua, where he was the god among gods across several cohorts, and at CMU he became a well-known, first-tier AI researcher internationally with excellent work.
When he was doing his PhD at CMU, he participated in founding Recurrent AI, which we invested in because of Zhilin. ZhenFund was the angel investor in Recurrent AI. When Zhilin started working on large models, we saw it as a major wave favoring young people — the best people had already noticed. So we invested; it was an easy decision.
"Dark Currents": ZhenFund has a theory about investing in people. Does he qualify as a "young genius"?
Yusen Dai: Absolutely. He fits our definition of a young genius because he's young, so he can be the best in a new wave — this is a new wave. If you're doing real estate now, it's hard to be a young genius. We want this genius to have no age qualifier — not the best post-90s entrepreneur, or best post-00s entrepreneur, but simply the best entrepreneur, the person who understands AI the most.
When Bill Gates started Microsoft, he was the best programmer. When Zuckerberg started Facebook, he was the best entrepreneur. We want excellence without age as a modifier. We feel Zhilin fully meets these criteria — he's a world-class AI scientist.
"Dark Currents": The key is the person.
Yu Chen: When we talked with Junjie Yan, he said he wanted to build the Chinese version of OpenAI. He spent a lot of time explaining what large models are and why one large model would be more effective than countless small models combined. I also have a programming background, so we talked very well — a bit of kindred spirits.
When I invest in tech companies, I'm used to gauging technical grasp through conversation. Even when founders are working on cutting-edge technology, what I value most is their fundamental grasp of technical details. When I worked at Google, I saw what the world's top programmers were like, and I compare founders against that benchmark — it makes it easy to filter out the technically strongest entrepreneurs.
Beyond his pursuit of technology, Junjie started thinking about productization and commercialization very early. He's not someone who only focuses on technology — this is what I appreciate about him.
Rui Han: We've been waiting to see whether there's a large language model product that can truly enter and stay in people's lives. We conducted an internal survey about Kimi and found that not only Gaorong's investment and research teams, but also colleagues in IT, HR, PR, and other functions were all using Kimi. Quite a few of them had canceled their GPT subscriptions to switch to Kimi. This was one small piece of our decision-making.
Is VC a suitable player in the large model game? As Yusen mentioned, whether in China or the US, VC funding is a small piece. Allow me to stitch together two phrases to express my view: don't fail to do good because it seems small, and when everyone adds fuel, the flames rise high.
If I had to say one thing at IC, my answer is: China must and will have its own AGI. If this doesn't work out, any valuation is expensive; if it does, none are expensive.
"Dark Currents": The trend itself matters a lot.
Rui Han: Yes.
Ling Xia: Mr. Huang said this morning that in October 2021, three of us from Mingshi met Junjie Yan, and I was the only one who understood what he was talking about. There's a reason behind this. As everyone knows, we invested very well in the intelligent electric vehicle track, which I later took charge of.
If you follow autonomous driving development, you should know that in 2021, the industry had a landmark change: Tesla's transformer-based BEV and the end-to-end data-driven approach to perception. This was relatively unfamiliar to investors in other tracks. But for an investor focused on vehicles and autonomous driving, this was not unfamiliar at all. The question was simply whether migrating an end-to-end data-driven paradigm from autonomous driving to NLP could work.
Coincidentally, a core member of MiniMax's early team had previously worked on autonomous driving at Uber. When I pushed this internally, I never viewed MiniMax as the Chinese version of OpenAI. We saw this as a new wave of technology. The "large" in large models isn't essential; the Chinese version of OpenAI or the Chinese version of any company isn't essential. What's essential is the new paradigm of end-to-end data driving behind it.
My internal logic has been consistent: how we view autonomous driving, how we view NLP, and starting in mid-2022 how we view end-to-end data-driven robotics — the logic behind this is coherent and continuous. We're investing in a new generation of technology paradigm driven by end-to-end data, and this is the consensus I hoped to build internally.
"Dark Currents": Xia was very honest there. Mr. Chen views these companies from an AGI perspective, but Xia is talking about end-to-end — this is the insight drawn from Tesla in the automotive industry.
Ling Xia: Precisely because I'm viewing it from the end-to-end data-driven new paradigm perspective, I pay more attention to the team's engineering capabilities. Junjie has been through 0-1-100 in research, engineering, and commercialization. He's always thinking about how to do things in an engineering way — this is a very prominent trait of his.
Including how to use 1/10 the cost to get 10x the data — this series of thinking patterns is very engineering-oriented. And at this particular moment, LLMs should be moving from research to engineering-driven development. Our perspective matches his.
Yungang Huang: We invested in Moonshot AI during the Spring Festival, but we had been talking for a long time before that. In our conversations, beyond who would be China's AGI and what the vision was, we also had to talk about concrete execution — the large model landscape, how they understood future productization and commercialization. We're relatively pragmatic. What they said at the time gradually materialized over the following six months, including product capabilities. By that point, it was the right moment.
Because large models are an extremely capital-intensive business, the team proved its strategic and execution capabilities, while more investors came in to support. At that point, it was the best approach, so we invested.

"Dark Currents": For ZhenFund, reading people is a critical part of investing. You're all early-stage investors, and each of you has developed your own methodology for evaluating founders. Now the first and second tiers of AI large model companies have established their distinct styles.
I'd like to ask Ling Xia and Yungang Huang — what distinctive traits do you think teams that ultimately achieve outsized success in AI might possess?
Ling Xia: Mingshi has been investing in tech startups for the past decade. The profile of tech founders we like is quite clear and easy to summarize: focused, resilient, and low-key. The founders we've backed all display these three traits very prominently. You could see that same focus, resilience, and low-key nature in Mr. Huang's conversation with Junjie this morning.
Another dimension we've consistently adhered to is our belief that what truly builds a tech company is a technologist who becomes an entrepreneur, not a scientist. We're very clear that we should invest in someone who can grow into a technology-savvy entrepreneur, not simply back a scientist.
Because scientists and entrepreneurs think in fundamentally different ways. At Mingshi, we evaluate through the complete lens of an entrepreneur's profile, not just whether someone happens to be an outstanding scientist in a particular field.
Yungang Huang: I agree with what Xia said. Regardless of era or industry — including large models today — a founder who truly understands product is crucial, and that carries two meanings: deeply understanding user needs, and understanding the path to future commercialization. "Product" encompasses a lot, with technological variables as the foundation, so product sense is particularly important.
Beyond the technical capabilities everyone recognizes in Zhilin, his product sense is excellent. Only with that sensibility can you achieve true commercial success. What really creates distance is whether you have distinctive products that generate proprietary data — that's how you reach ultimate success. At the end of the day, the person needs to be well-rounded, understanding both technology and product.
"Dark Currents": Of the five of you here today, how many can code? Does knowing how to code affect how you approach AI investing?
Yusen Dai: In early-stage investing, we're particularly prone to "blind spots under the lamp" — when you understand a domain, paradigm shifts in that domain can actually lead you to make premature negative or critical judgments.
Our investment areas are quite diverse, so any specific background isn't the most crucial factor here. When I started my company selling cosmetics, I didn't actually need cosmetics myself — it was about understanding business, enterprise, and entrepreneurship itself.
"Dark Currents": Mr. Chen, since you know how to code, do you have "blind spots under the lamp"?
Yu Chen: I don't have blind spots under the lamp — it's about maintaining an open mindset. You may know programming, but you recognize that what you learned in school is completely different from what's happening now. Technology changes very rapidly; computer science essentially sees its entire knowledge structure iterate every four to five years.
I also agree with Yusen on why people prefer to invest in founders around 30 years old — it implicitly reflects that their knowledge structure is relatively newer, enabling them to create cutting-edge things. I don't have blind spots under the lamp. I approach the latest technology with an open mindset, and I continue learning it myself.
Rui Han: Investors need to grasp the big picture and let go of the small stuff — you don't need to know how to cook to invest in restaurants.
Ling Xia: I've had my own blind spots under the lamp. I worked on image recognition in graduate school, so when I started in VC, I missed Face++ precisely because of that blind spot. After ten years of investing, I believe whether you can code is also quite superficial. The value of early-stage investors lies in discovering future opportunities before others in the market.
The recognition and capture of opportunities isn't equivalent to knowing how to code — curiosity and learning ability matter more. It depends on where you're willing to invest your energy, where your understanding comes from. If you understand technology, or know programming, you may naturally find it easier to access the frontier of technological development. But that alone is far from sufficient.
Yungang Huang: Having knowledge doesn't equal having judgment — they're two different things. We still need to diligently acquire knowledge, especially since AI now helps us solve many knowledge problems. What matters more is thinking problems through clearly, and AI will increasingly help us solve them. AI can already handle most coding tasks, so whether you can program isn't important.

"Dark Currents": Timing is particularly important in early-stage investing. It's been a year and a half since ChatGPT launched in November 2022 — several "springs and autumns" in China's AI investment industry, with many players coming and going and rankings shifting constantly. Facing such massive and volatile waves, how do you find your own rhythm?
Yusen Dai: Most of us here invested through the internet and mobile internet eras, which was investing when internet technology had reached a relatively mature stage. AI is still in a relatively early phase, so we often look to history for lessons — tracing back to how angel investing started in Silicon Valley, and finding analogies in VC history over the years. We need patience. The "hundred-group wars," "hundred-model wars," "hundred-C wars" — people are accustomed to flocking toward anything new.
But today's AI, or large models, resembles chip-making in its early days — it requires substantial research and scientific components, not just building on top of open-source code. We need patience. In the year and a half since ChatGPT's release, many new application innovations and scenarios have already emerged.
Many say AI evolution has slowed, but that's because expectations have risen — people expect the explosive proliferation of applications we saw with the internet. We need to recognize this is still relatively early-stage technology. Many investors and entrepreneurs say we're in the "Apple era" and should move fast and aggressively. But it's hard to say whether this is the Apple era or the BlackBerry era. Those of us born after 1985 lived through the BlackBerry era; those born after 1995 don't even know what BlackBerry was. BlackBerry-era technology had limitations — you couldn't build TikTok then; even if you knew it would be huge, the technology couldn't support it.
Technology development goes through phases where technology itself is the bottleneck — you can imagine things but can't build them. Eventually technology matures, and the constraint shifts to ideas — if you can imagine it, you can build it; if you can't imagine it, you're stuck. Right now, we're in the "can imagine but can't build" phase. Our technical founders need to judge technology, to predict how far it can advance in the next year or two given current resources. If you think too far ahead, you can't execute. At this stage, understanding technology — especially having frontline research perspectives — is crucial. We need patience. BlackBerry wasn't about a hundred flowers blooming; there were maybe three flowers worth picking, and choosing which one mattered.
Regarding users and monetization — because we've all lived through the mature mobile internet era — people ask about business models. Since we've all seen how the mobile internet story ended, they ask what AI's endgame is. There's no answer to that now. Anyone who claims to have the answer is a fraud.
I posted on Moments before: Google launched in 1998, and only found its core business model by its 2004 IPO. Facebook went live in 2004, and only introduced feed ads with its 2012 IPO, finding its business model then.
If the two great "money-printing machines" of the internet — Google and Facebook — took six and eight years respectively to find their business models, then demanding that an even earlier-stage technology like large model applications develop clear business models and explore endgames and competitive landscapes in just over a year is premature. What matters is penetration rate. Before technology penetration reaches a certain threshold, we're still in a rapid expansion phase.
Thinking about monetization too early is a mismatch of stages. We should focus on how to get more people using AI products, what products current technology enables, what scenarios have genuine user adoption rather than being concepts without users — especially given the current less-than-optimistic capital markets.
"Dark Currents": Learn from history, respect patterns, maintain patience.
Yusen Dai: Patient capital.
Yu Chen: In this era of information overload, it's easy to get swept up by media and peer momentum. I think the most important thing is maintaining independent thinking — don't let others set your pace, and reason from first principles about what's useful and feasible. ChatGPT went viral in December 2022, but we started engaging with large models in early 2021 and decided to invest then.
Why? Because I believe the world always needs general-purpose intelligent agents. While current large model solutions may not be the ultimate solution, they move us substantially in the right direction. It's somewhat like Goldbach's conjecture — it may not solve the problem, but it brings us close enough. Independent thinking in investing matters enormously.
Rui Han: We've looked at different industries over the years, and each has its own cycle and pace. I understand Liu's question as: how do we adapt as investors when looking at different industries?
If the endgame is very distant, don't pursue precise error — fuzzy correctness is sufficient. We might place key thermometers along the path: for example, at Gaorong, our researchers and back-office teams have all started using Kimi. We see this as the first time AI products have entered daily life and stayed there — and that thermometer matters enormously to us along the path.
If we return to the narrow definition of rhythm — the market doesn't care about your rhythm, and the market won't accommodate your rhythm. Before rhythm becomes a logically self-consistent trap, or a constraint on self-transformation, what you need to do is interrupt it.
"Dark Currents": Vigilantly detect changes.
Rui Han: Don't let your rhythm become your main theme — your rhythm means nothing to the world.

"Dark Currents": Successful VC investing in early stages is fundamentally non-consensus. I'd like to ask: facing today's AI, what views do you hold that differ from the market?
Yusen Dai: Right now I don't know what's consensus and what's non-consensus.
"Dark Currents": Ha, I knew you'd say that. Then share your own views.
Yusen Dai: Don't discuss AI monetization. Also, general-purpose humanoid robots are still too early — so we haven't invested in a single one. It's very worth learning about, but genuinely too early, definitely not at its GPT moment.
"Dark Currents": You mean so early that VCs shouldn't invest at all?
Yusen Dai: Depends on whether you have extraordinary patience. Or whether you're approaching it from a research or investment perspective.
Yu Chen: At this stage, energy matters more than so-called compute. Many major US tech companies have halted data center investments — not because they lack money for GPUs, but because the US no longer has sufficient energy supply. Even if you started building nuclear plants now — as we've seen in the news, Microsoft partnering with Westinghouse on nuclear plants — this isn't something that happens overnight. Solving the energy problem is crucial, even more important than the current compute problem.
Rui Han: Building on Yu Chen's point — I was joking with colleagues the other day: with all these time-travel dramas, if a modern person with abundant knowledge traveled back to ancient times, what would ultimately limit your ability to change the world? Materials science.
To answer Liu's question — non-consensus is partly a matter of perspective, but there's another dimension that's easily overlooked: degree. Today, I think degree can create enormous distance between people's conviction in AGI. "Belief" and "belief" can be vastly different. Is yours conditional or unconditional? And if conditional, what are the conditions? Belief in your words, belief in your heart, or belief in your actions?
If we use climbing Everest as an analogy — are we still buying gear? Or only bought half the right gear? Have we reached Tibet? Base camp? Or are we actually ascending? Of course, the more optimistic view is that this Everest has no summit, and you can keep climbing forever. I think if we discuss granular non-consensus on details today, things change too fast. But I believe the degree of conviction can create real distance between practitioners and investors.
"Dark Currents": What stage is your conviction at?
Rui Han: Like Moonshot AI's English name — Moonshot. It's a moon landing.
Ling Xia: Let me respond to the previous two speakers. In 2022, we invested in LimX Dynamics, a general-purpose robotics company. Our core thesis was that the end-to-end, data-driven paradigm could transfer to this domain. If we treat humanoid robots or general-purpose robots as our desired endgame — analogous to Level 4 autonomous driving — then however long it takes L4 autonomy to go from concept to deployment, that's how long it will take for general-purpose robots to enter millions of households.
The "ChatGPT Moment" for intelligent, generalized manipulation in humanoid robots is at least 2-3 years away. But are there companies, under this end-to-end data-driven paradigm, that can find their equivalent of L2++ autonomous driving — achieving their own data and commercial flywheel first, while their underlying tech stack continues evolving toward the L4 they ultimately want? Doing what Tesla and Li Auto have done over the past few years. I believe such companies exist in the market today, and the timing is right.
Energy means something different for China versus the US. The US grid and utility companies are extremely fragmented — power infrastructure buildout simply can't keep pace with compute cluster deployment. That's their problem. In China, this isn't an issue. We've done the math: today's AI challenge isn't absolute compute capacity, but energy density in limited physical space — not the absolute amount of energy. China's infrastructure is exceptionally strong, with ultra-high-voltage transmission. This isn't our core bottleneck.
To respond to Liu's question — I think there's an important trend closely tied to AI that hasn't been deeply discussed domestically. Because of this generation of AI, the very essence of "chip" has fundamentally changed. In the PC and digital eras, we thought chips meant large-scale integrated circuits. In the smartphone era, we thought chips meant a single SoC.
But in today's AI era, the chip is a system. Chip design should be design for the system. Moore's Law doubles SoC performance every 18 months, but at the system level, improvement comes by an order of magnitude every two years. This is under-discussed in China today.
Yungang Huang: We've looked at a lot of humanoid robots and invested in some. The world, and humanity, really need them, because what AI can do today is still extremely limited. We need robots in the physical world to fold our blankets and wash our clothes; we need physical companionship, not AI virtual companionship. The question is how far away this is.
Why do robots need two legs? Why legged instead of wheeled, when wheels are faster? The most intelligent, most general-purpose agents are sitting right here — we humans. We're the most general, the most powerful. If you need to go fast, you take a car, and robots can take cars too. For short-range movement, we have stairs, where wheeled designs are at a disadvantage. Why did Musk build humanoid robots? The person who thinks most in first principles — how did he choose? The answer is simple.

"Dark Currents": An investor shared a question with me: Alibaba's Qwen as an open-source model is competitive, even better than what startups have built. In this race, where can startups actually compete with large companies, or at least find breathing room?
Yu Chen: You have to compete on differentiation. There are clearly several business models and product forms for large models now. Take API provision — Alibaba Cloud and Volcano Engine have advantages there. They can offer APIs at cost or below, making money from cloud computing instead. For startups, it's obviously hard to compete with the giants on cost and pricing.
Or take productivity tools, whether chat or search — if ByteDance can't charge money, you can't either. You can't be burning compute on one side, paying heavily on another, and ultimately not making it back. Large model startups need to find product and business model differentiation from the giants. Differentiated competition is the crucial direction for startup survival.
Rui Han: It's about the people. When we use a company name or a major firm's name to represent model capability, I don't think that's entirely accurate. We should perhaps pay more attention to the granularity of that small core team — specifically, who built it.
Today's Chinese large model companies are all still in catch-up mode, both giants and startups. From our perspective, this process looks more like a leapfrog race — new entrants can jump ahead by a step. Take a photo at any given second and there's a leader, but a few seconds later the photo might show someone else in first place.
If the answer to a question changes every month in real life, then that answer probably has no meaning over three-to-five-year or longer horizons. What matters more is: through all these leaps, who can consistently stay in the first tier.
We've observed paths to breaking through, not just in AI but other industries: entrepreneurs must first do something extraordinary before they can establish the ordinary. Without the extraordinary, there's no chance to establish the ordinary. Find a point where you have a shot at the extraordinary, then go all in. The people doing things at large companies are specific individuals too, flesh and blood — nothing is unbreakable.

"Dark Currents": Returning to Moonshot and MiniMax — these two companies have reached valuations of $2.5 billion or higher. Are you worried the market is overheating and prematurely overdrawing their room for valuation growth? Can you predict where the ceiling might be for this wave of Chinese large language model companies?
Yusen Dai: If they succeed, they're all cheap. If they fail, $300 million or $3 billion is expensive. With such massive uncertainty, valuation is hard. Capital markets swing between extremes — $3 billion used to be nothing; plenty of companies got there selling coffee and the like. Valuation is relative. What matters is ultimately converting technology into products and landing them.
I remember in mobile internet's early days, Instagram sold for $1 billion — 13 people. Everyone said this company was so early, so small, no revenue, and sold for so much. Turned out to be worth hundreds of billions.
In the early days of a technology revolution, if you can genuinely be among the most leading players — whoever you are — I don't think tens of billions of dollars is particularly high. Now, does China's capital market have bubbles? There's not even foaming agent left — who knows where the bubble is.
Ling Xia: Whether MiniMax or Moonshot, Chinese companies today are to some degree undervalued, not overvalued. If you compare with their European and American peers, given these two companies' technical levels, their valuations should absolutely be at least 2-3x higher.
Conversely, if America's challenge is energy, then Chinese AI's biggest challenge today is capital. If we want to build a 100,000-H100 compute cluster, you need $4-5 billion in capex. Even leasing, you're paying over $1 billion in rent annually. Working backwards from the required annual fundraising to the implied valuation — that's definitely challenging.
Long-term, I still lean toward extreme optimism. Asked about the ceiling — in 2014, someone asked Mingming Huang, after you invested in Li Auto, what kind of company will it become? He said floor in the tens of billions, ceiling at $100 billion. That was heavily challenged at the time.
For China's top AI companies, as Junjie said this morning — if among the world's top five AI companies, number two is Chinese, then I believe over a ten-year horizon, this should be a trillion-dollar company. The first wave of Chinese internet companies reached tens of billions. The second wave like ByteDance, Alibaba basically reached hundreds of billions. Regrettably for various reasons, ByteDance should logically be China's first trillion-dollar company today.
If we look ten or fifteen years out — if America today has the Magnificent 7, I think China will certainly have six or seven trillion-dollar market cap companies by then. No fewer than two of them will be AI companies, and that would mean a great deal for China.
I think AI is more like a productivity revolution — the better analogy is the invention of computers. In 1967, IBM's market cap was $150 billion. That year US GDP was over $800 billion, meaning IBM alone, as the leading computer company, was one-quarter of US GDP. If ten years from now China's GDP is 180 trillion RMB or even 200 trillion, what's one-quarter of that worth?
Why was it worth so much? Because 20-30% of US GDP at the time came from the computer industry. If AI can drive 20-30% incremental GDP for China going forward, such enterprises will absolutely be worth it.
Yungang Huang: Large model investment isn't very VC-friendly — the capital requirements are enormous. Actually neither startups nor investors want such high valuations; it's just that because you need to spend money, valuations get pushed up. In fact each round's valuation doesn't rise that much — you just need the money. I believe something massive will definitely emerge, $100 billion or trillion-dollar.
The most important thing for China is that we shouldn't scatter our investments. Uniting to support the best two or three companies is most effective. I went to Silicon Valley last year, and every time I'm struck by how American large model companies and teams all want to build AGI for all humanity, to be the leader.
Yet every Chinese team, regardless of capability or position, says it wants to build its own AGI. It would be better to unite behind that best team together, instead of wasting money.
"Dark Currents": Borrowing Wang Chuanfu's words: together, that's China's AGI.
Yu Chen: Everyone answered really well. Being able to sit here today itself shows absolute conviction in AGI. Since we have absolute conviction, the number itself doesn't matter.
Rui Han: Although at this stage I believe the qualitative matters more than the quantitative, I sincerely hope all the numbers everyone mentioned come true.
"Dark Currents": Thank you all.


