
AI Is Foundational Infrastructure, Not Merely Software

AI's paradigm shift from "pre-recorded software" to "intelligence generated in real time" means the entire hardware stack must be rebuilt, a shift that will set off the largest infrastructure buildout in human history.

2026-03-10

Key Points

  • The physical constraints of the paradigm shift: Moving from retrieving pre-written instructions to reasoning and generating in real time means every token corresponds to power consumed, heat managed, and chips run in parallel. This is not a software-optimization problem; it is an industrial-systems rebuild.
  • The causal chain of the five-layer cake: Energy → chips → infrastructure → models → applications. Success at each layer pulls demand downward; once applications generate economic value, the pull reaches all the way to the power plant.
  • The double-edged effect of open models: Open models such as DeepSeek-R1 approaching the frontier activate demand across the entire stack. They do not just lower the barrier at the application layer; they directly pull on training, infrastructure, chips, and energy.
  • A real case of the productivity flywheel: Rising demand for radiologists is not a paradox. Once AI absorbs routine work, radiologists focus on judgment and care, so hospitals serve more patients and hire more people. Productivity → capacity → growth.
  • The scale and urgency of the buildout: Hundreds of billions of dollars are already invested; trillions more are still needed. Skilled trades such as electricians, pipefitters, and network technicians are in short supply. This is the largest infrastructure project in human history.

Relevance to Us

  • Closely aligned with Neta's "real-time social intelligence" product direction: every interaction is reasoning generated in real time from the user's context, not template retrieval.
  • The overseas growth strategy can leverage the Chinese team's speed in responding to model commoditization (collapsing costs), converting a compute-cost edge into growth capital.
  • Managing a 20-person team can follow the "radiologist" model: AI strips away routine work so the entire team focuses on strategy, brand, and user insight.

Discussion Prompts

  • As Neta's DAU grows from 100,000 to 1,000,000, will unit inference cost and latency become hard constraints on growth? What is the $/DAU ceiling our business model can bear?
  • Once open models approach the frontier, what is Neta's moat: model capability, understanding of user context, or the ability to optimize inference cost and latency?
  • If we view Neta as a "social intelligence factory," how should we redefine the team's KPIs, shifting from "time saved" to "capacity released" and "growth"?
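The $/DAU question above can be made concrete with a back-of-envelope sketch. Every input here is a hypothetical placeholder for illustration, not an actual Neta figure: the assumed interactions per user per day, tokens per interaction, and per-million-token price.

```python
# Back-of-envelope monthly inference cost per daily active user.
# All numbers are illustrative assumptions, not real product data.

def monthly_cost_per_dau(interactions_per_day: float,
                         tokens_per_interaction: float,
                         usd_per_million_tokens: float,
                         days: int = 30) -> float:
    """Estimated monthly inference spend per DAU, in USD."""
    tokens_per_month = interactions_per_day * tokens_per_interaction * days
    return tokens_per_month / 1_000_000 * usd_per_million_tokens

# Assumed: 20 interactions/day, 1,500 tokens each (prompt + completion),
# $0.50 per million tokens (a plausible commodity-model price point).
cost = monthly_cost_per_dau(20, 1_500, 0.50)
print(f"${cost:.2f} per DAU per month")  # 20 * 1500 * 30 = 900k tokens -> $0.45
```

Comparing this figure against revenue per DAU gives the cost ceiling the prompt asks about; as open models drive the per-token price down, the same budget supports more interactions per user.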




AI is one of the most powerful forces shaping the world today. It is not a clever app or a single model; it is essential infrastructure, like electricity and the internet.

AI runs on real hardware, real energy, and real economics. It takes raw materials and converts them into intelligence at scale. Every company will use it. Every country will build it.

To understand why AI is unfolding this way, it helps to reason from first principles and look at what has fundamentally changed in computing.

From Pre‑Recorded Software to Real‑Time Intelligence

For most of computing history, software was pre‑recorded. Humans described an algorithm. Computers executed it. Data had to be carefully structured, stored into tables, and retrieved through precise queries. SQL became indispensable because it made that world workable.

AI breaks that model.

For the first time, we have a computer that can understand unstructured information. It can see images, read text, hear sound, and understand meaning. It can reason about context and intent. Most importantly, it generates intelligence in real time.

Every response is newly created. Every answer depends on the context you provide. This is not software retrieving stored instructions. This is software reasoning and generating intelligence on demand.

Because intelligence is produced in real time, the entire computing stack beneath it had to be reinvented.

AI as Infrastructure

When you look at AI industrially, it resolves into a five-layer stack.

Energy

At the foundation is energy. Intelligence generated in real time requires power generated in real time. Every token produced is the result of electrons moving, heat being managed, and energy being converted into computation. There is no abstraction layer beneath this. Energy is the first principle of AI infrastructure and the binding constraint on how much intelligence the system can produce.
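The energy-as-binding-constraint claim can be sketched as a toy conversion from a facility's power budget to sustained token throughput. The energy-per-token figure below is an assumed illustrative value, not a measured benchmark, and the model ignores cooling and other overhead.

```python
# Toy model: a power budget caps token output. 1 watt = 1 joule/second,
# so dividing a facility's wattage by the energy cost of one token gives
# an idealized ceiling on tokens generated per second.

def tokens_per_second(power_mw: float, joules_per_token: float) -> float:
    """Idealized sustained token rate a power budget can support."""
    watts = power_mw * 1_000_000  # megawatts -> watts (joules per second)
    return watts / joules_per_token

# Assumed: a 100 MW AI factory and 0.5 J per generated token.
rate = tokens_per_second(100, 0.5)
print(f"{rate:.0f} tokens/s")  # 100e6 J/s / 0.5 J = 2e8 tokens/s
```

Under these assumptions, more intelligence requires either more power or fewer joules per token, which is exactly why the chip layer sits directly on top of energy.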

Chips

Above energy are the chips. These are processors designed to transform energy into computation efficiently at massive scale. AI workloads require enormous parallelism, high-bandwidth memory, and fast interconnects. Progress at the chip layer determines how fast AI can scale and how affordable intelligence becomes.

Infrastructure

Above chips is infrastructure. This includes land, power delivery, cooling, construction, networking, and the systems that orchestrate tens of thousands of processors into one machine. These systems are AI factories. They are not designed to store information. They are designed to manufacture intelligence.

Models

Above infrastructure are the models. AI models understand many kinds of information: language, biology, chemistry, physics, finance, medicine, and the physical world itself. Language models are only one category. Some of the most transformative work is happening in protein AI, chemical AI, physical simulation, robotics, and autonomous systems.

Applications

At the top are applications, where economic value is created. Drug discovery platforms. Industrial robotics. Legal copilots. Self-driving cars. A self-driving car is an AI application embodied in a machine. A humanoid robot is an AI application embodied in a body. Same stack. Different outcomes.

That is the five-layer cake:

Energy → chips → infrastructure → models → applications.

Every successful application pulls on every layer beneath it, all the way down to the power plant that keeps it alive.
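The pull-through described above can be sketched as a loop over the five-layer stack. The pass-through factor is invented purely for illustration; the point is only the direction of causality, with demand entering at the application layer and propagating all the way down to energy.

```python
# Toy model of demand pull-through in the five-layer stack:
# energy -> chips -> infrastructure -> models -> applications.
# Demand enters at the top and flows to every layer beneath it.

LAYERS = ["applications", "models", "infrastructure", "chips", "energy"]

def propagate_demand(app_demand: float, pass_through: float = 1.0) -> dict:
    """Push application-layer demand down through each lower layer."""
    demand = {}
    level = app_demand
    for layer in LAYERS:
        demand[layer] = level
        level *= pass_through  # assumed per-layer pass-through factor
    return demand

# Doubling application demand doubles the load on every layer beneath it,
# including the power plant at the bottom of the stack.
base = propagate_demand(1.0)
doubled = propagate_demand(2.0)
assert all(doubled[layer] == 2 * base[layer] for layer in LAYERS)
```

A pass-through factor of 1.0 models perfect proportionality; a real stack would have layer-specific elasticities, but the qualitative claim (success at the top pulls on everything below) survives any positive factor.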

We have only just begun this buildout. We are a few hundred billion dollars into it. Trillions of dollars of infrastructure still need to be built.

Around the world, we are seeing chip factories, computer assembly plants, and AI factories being constructed at unprecedented scale. This is becoming the largest infrastructure buildout in human history.

The labor required to support this buildout is enormous. AI factories need electricians, plumbers, pipefitters, steelworkers, network technicians, installers, and operators.

These are skilled, well-paid jobs, and they are in short supply. You do not need a PhD in computer science to participate in this transformation.

At the same time, AI is driving productivity across the knowledge economy. Consider radiology. AI now assists with reading scans, but demand for radiologists continues to grow. That is not a paradox.

A radiologist’s purpose is to care for patients. Reading scans is one task along the way. When AI takes on more of the routine work, radiologists can focus on judgment, communication, and care. Hospitals become more productive. They serve more patients. They hire more people.

Productivity creates capacity. Capacity creates growth.

What Changed in the Last Year?

In the past year, AI crossed an important threshold. Models became good enough to be useful at scale. Reasoning improved. Hallucinations dropped. Grounding improved dramatically. For the first time, applications built on AI began generating real economic value.

Applications in drug discovery, logistics, customer service, software development, and manufacturing are already showing strong product-market fit. These applications pull hard on every layer beneath them.

Open-source models play a critical role here. Most of the world’s models are free. Researchers, startups, enterprises, and entire nations rely on open models to participate in advanced AI. When open models reach the frontier, they do not just change software. They activate demand across the entire stack.

DeepSeek-R1 was a powerful example of this. By making a strong reasoning model widely available, it accelerated adoption at the application layer and increased demand for training, infrastructure, chips, and energy beneath it.

What This Means

When you see AI as essential infrastructure, the implications become clear.

AI starts with a transformer LLM. But it’s much more. It is an industrial transformation that reshapes how energy is produced and consumed, how factories are built, how work is organized, and how economies grow.

AI factories are being built because intelligence is now generated in real time. Chips are being redesigned because efficiency determines how fast intelligence can scale. Energy becomes central because it sets the ceiling on how much intelligence can be produced at all. Applications accelerate because the models beneath them have crossed a threshold where they are finally useful at scale.

Every layer reinforces the others.

This is why the buildout is so large. This is why it touches so many industries at once. And this is why it will not be confined to a single country or a single sector. Every company will use AI. Every nation will build it.

We are still early. Much of the infrastructure does not yet exist. Much of the workforce has not yet been trained. Much of the opportunity has not yet been realized.

But the direction is clear.

AI is becoming the foundational infrastructure of the modern world. And the choices we make now, how fast we build, how broadly we participate, and how responsibly we deploy it, will shape what this era becomes.

Link: http://x.com/i/article/2027423520918310912
