Thought Leadership
Mar 13, 2026

VAST FWD 2026: Renen Hallak on Building the New Infrastructure of Intelligence


Authored by

Nicole Hemsoth Prickett, Head of Industry Relations

VAST Data Founder and CEO Renen Hallak during his keynote presentation at VAST Forward 2026.

For the last few years, the conversation around artificial intelligence has been centered on models. Bigger models, better models, and of course, faster GPUs to train those models.

But if you listen carefully to VAST founder and CEO Renen Hallak, the models aren’t the system. They are just the visible surface of something deeper that hasn’t fully emerged yet. The real system, he argues, is the infrastructure layer that coordinates intelligence itself.

Taking center stage in front of a massive crowd gathered in Salt Lake City for the first annual VAST FWD conference, he gave us a glimpse into the future, explaining that the AI era will ultimately be defined not by models but by the operating system that manages the data, memory, identity, and coordination of billions of intelligent agents.

There is a pattern to this transition, he says. Hardware arrives first, followed by the “killer app” moment that shows why the hardware matters. Eventually, though, the complexity becomes too much to manage, and the industry invents a new operating system to organize the chaos. For example, as he tells us, personal computing didn’t really scale until Windows and Macintosh made the PC usable as a platform rather than a device. And mobile didn’t become an ecosystem until iOS and Android turned phones into programmable environments.

AI, Hallak believes, has now reached the same moment. GPUs are the hardware, and ChatGPT, Claude, Grok, and similar services are the killer applications. What is emerging now is the operating system that will turn intelligence into a coherent platform.

For all its innovations, VAST did not begin with that ambition, he reminded the crowd. The company started with a far more mundane problem. In early 2016, neural networks were beginning to show promise, but the infrastructure beneath them had yet to catch up. At that time, DeepMind had just demonstrated early breakthroughs and OpenAI had only recently formed. And as far back as it all seems now, NVIDIA CEO Jensen Huang hadn’t even delivered the first DGX system to Elon Musk.

What Renen Hallak and his cofounders saw in that moment was a looming constraint in infrastructure. Neural networks could be powerful if fed enormous quantities of data at very high speed, but the systems responsible for storing and delivering that data were built around legacy tradeoffs that made that kind of scale impossible.

“What we realized, after speaking to a bunch of other potential customers in other fields, was that this tradeoff between high capacity and high performance was holding back every large-scale analytics project, and if we didn’t solve it at its core, it would become the obstacle that would inhibit AI. And so what we built was a new architecture…we built it from the ground up to eliminate those tiers and break that tradeoff.”

At that time, traditional storage architectures forced a compromise between performance and capacity: fast systems were small, and large systems were slow. That was simply a limitation everyone was expected to accept, and it spawned endless workarounds, with organizations spending enormous effort shuttling data between tiers. For conventional enterprises, the compromise was manageable, in part because it was normal. But for machine learning it was, without hyperbole, catastrophic. The minute training datasets got large enough, the infrastructure fractured into layers of complexity that made efficient compute almost impossible.

So the founding goal at VAST? Break the tradeoff between scale and performance.

The architecture the company built around this mission was based on a disaggregated, shared-everything approach. Compute and storage could scale independently but operate as a unified logical system. Metadata was redesigned to manage massive datasets efficiently. Networking and flash storage were treated as integral components rather than peripheral technology.

The result was an infrastructure platform that eliminated the historical boundary between capacity and speed. At the time, it looked like a storage breakthrough. In hindsight, it looks more like the first layer of something much larger.

In his keynote, he traced how that architecture slowly evolved upward through the computing stack as customers introduced new constraints.

VAST Data founder and CEO Renen Hallak speaks at VAST FWD 2026.

The first stage produced the VAST DataStore, a platform that unified object, block, and file access into a single environment. Instead of forcing organizations to run separate infrastructure for different workloads, the DataStore allowed everything to exist inside one platform. This was initially framed as consolidation but in reality it removed a major barrier between AI workloads and the huge archives of enterprise data that had previously been isolated across incompatible systems.

The next layer emerged from a completely different problem. Media companies like Pixar needed to collaborate across continents on datasets measured in petabytes.

Traditional replication approaches made global workflows fragile and slow. VAST responded with DataSpace, a globally consistent namespace that allowed data to be accessed across datacenters, clouds, and edge locations without losing coherence. Once that capability existed, the system began to resemble something more than storage. Data could now exist everywhere while remaining part of a single logical environment, and when you step back, what began as a storage system starts to look more like a planetary memory layer.

Then another constraint surfaced. Organizations managing billions of files discovered that scanning and indexing their datasets could take hours or days. By the time a system finished traversing a massive filesystem, the underlying data had already changed. The answer was not faster scanning but a new kind of database embedded directly into the platform. That database allowed structured information, metadata, vector embeddings, and application data to coexist within the same infrastructure layer. Instead of maintaining separate databases, object stores, and file systems, organizations could manage all of that information inside a single platform capable of querying it in real time.
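The shift described above, from periodically scanning a filesystem to querying an always-current index, can be illustrated with a toy sketch. This is not VAST’s implementation or API, just the general idea using Python’s standard `sqlite3` module: every write updates a metadata index in place, so queries answer from the index instead of traversing billions of files.

```python
import sqlite3

# Toy illustration: rather than walking a filesystem to find files,
# keep metadata in a queryable index that is updated on every write.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE files (path TEXT PRIMARY KEY, size INTEGER, mtime REAL, label TEXT)"
)

def on_write(path, size, mtime, label):
    # Each write updates the index in place, so queries never go stale.
    db.execute(
        "INSERT OR REPLACE INTO files VALUES (?, ?, ?, ?)",
        (path, size, mtime, label),
    )

on_write("/data/a.parquet", 1_000_000, 1700000000.0, "training")
on_write("/data/b.parquet", 2_000_000, 1700000100.0, "eval")

# A metadata query replaces an hours-long scan: the answer comes
# from the index at the moment of the query, not a stale traversal.
rows = db.execute("SELECT path FROM files WHERE label = 'training'").fetchall()
print(rows)  # [('/data/a.parquet',)]
```

The same pattern generalizes to vector embeddings and application data living alongside file metadata: one indexed layer, queryable in real time.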

At that point Hallak realized the system had crossed an important threshold. VAST was no longer building storage; it was building a data platform. But the transformation did not stop there.

The next step came from a genomics company that wanted its pipelines to become data driven. Instead of applications constantly querying storage systems, the infrastructure itself should trigger computation whenever new information appeared. That request produced VAST DataEngine, a runtime environment capable of orchestrating functions, triggers, streams, and filters directly inside the platform. Data no longer waited for applications to ask questions. It began driving computation itself.
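The inversion described above, where data arriving triggers computation rather than applications polling for data, is a familiar event-driven pattern. The sketch below uses entirely hypothetical names (`on_new_data`, `write`) to illustrate the idea, not VAST DataEngine’s actual interface: functions register against a data prefix, and every write fires the matching functions.

```python
from collections import defaultdict

# Hypothetical sketch of data-driven pipelines: functions register
# against a path prefix, and each new write triggers them.
triggers = defaultdict(list)
results = []

def on_new_data(prefix):
    # Decorator that registers a function to run whenever
    # data lands under `prefix`.
    def register(fn):
        triggers[prefix].append(fn)
        return fn
    return register

def write(path, payload):
    # The platform, not the application, drives computation: each
    # write fires every function registered for a matching prefix.
    for prefix, fns in triggers.items():
        if path.startswith(prefix):
            for fn in fns:
                results.append(fn(path, payload))

@on_new_data("/genomes/")
def align_reads(path, payload):
    # Stand-in for a genomics pipeline stage kicked off by new data.
    return f"aligned {path} ({len(payload)} bytes)"

write("/genomes/sample1.fastq", b"ACGT" * 4)
print(results)  # ['aligned /genomes/sample1.fastq (16 bytes)']
```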

When you step back and look at the architecture through that lens, the outline of something familiar appears. Storage becomes a core service. A database layer organizes structured information. An execution engine coordinates computation. These are precisely the ingredients that historically form the kernel of an operating system. Hallak did not set out to build one. But by solving successive infrastructure problems, the platform gradually acquired the characteristics of one.

With the past and present view in place, he broadened the scope to frame the future of AI as a fork in the road.

One possibility is that current models remain sophisticated prediction engines. They will produce better text, images, and software, but the overall architecture of computing will remain recognizable.

But the other possibility is far more disruptive. If predicting the next word captures something fundamental about reasoning, then the coming decade could produce systems capable of performing complex cognitive work. Agents will write software, design experiments, coordinate projects, and generate scientific hypotheses at speeds humans cannot match.

In that world the core challenge is no longer intelligence itself. It is coordination. Billions of agents interacting with data, models, and infrastructure will generate an explosion of activity that must be observed, audited, and controlled. Those agents will communicate with one another, create new models, access sensitive information, and potentially act on behalf of humans in the real world. Someone has to track their actions, enforce policy, manage identity, and maintain the memory of what they have learned.

That responsibility, he argues, belongs to the AI operating system.

Unlike traditional operating systems, this system cannot exist on a single machine. It must operate across datacenters, clouds, edge devices, and robotic systems simultaneously. It must manage both structured and unstructured data while orchestrating training pipelines, inference workloads, and reasoning systems. It must track the memory and communication of billions of agents while enforcing security and identity policies. In other words, it must coordinate intelligence at planetary scale.

Hallak borrowed a metaphor from Nvidia CEO Jensen Huang to describe the broader stack. At the bottom lies energy. Above that sit chips and hardware. Then come cloud platforms, AI models, and applications. Each layer of that stack is being rebuilt for the AI era. The role VAST intends to play sits squarely in the middle. Software infrastructure. The layer responsible for coordinating the flow of data and computation between hardware and intelligent systems.

Seen through that lens, the pieces VAST has built over the past decade begin to look less like individual products and more like components of an emerging operating system. DataStore manages unstructured information. The database layer organizes structured data. DataEngine orchestrates computation. Together they form the kernel of a system designed to coordinate AI workloads at global scale.

The next phase involves expanding that kernel outward.

He described several capabilities now being integrated into the platform, beginning with global key-value caching to reduce inference latency and improve GPU efficiency. He also outlined infrastructure that allows agents to fine-tune models autonomously, security systems that enforce identity and policy across agent interactions, and confidential computing environments that allow organizations to monetize proprietary models without exposing the underlying weights.
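The efficiency win behind key-value caching for inference can be shown with a toy sketch. This is a generic illustration of prefix caching, not VAST’s mechanism: requests that share a prompt prefix reuse the state computed for that prefix instead of recomputing it, so the expensive work is paid once.

```python
# Toy sketch of prefix (key-value) caching for inference: state
# computed for a shared prompt prefix is reused across requests.
cache = {}
compute_calls = 0

def expensive_prefill(prefix):
    # Stands in for running the model over the prompt prefix
    # to build its key-value state on the GPU.
    global compute_calls
    compute_calls += 1
    return f"kv({prefix})"

def serve(prompt):
    # Split the prompt into a cacheable shared prefix and a
    # per-request suffix (the "|" separator is illustrative).
    prefix, _, suffix = prompt.partition("|")
    if prefix not in cache:          # cache miss: pay prefill cost once
        cache[prefix] = expensive_prefill(prefix)
    return cache[prefix], suffix     # cache hit: shared work is skipped

serve("system-prompt|question A")
serve("system-prompt|question B")
print(compute_calls)  # 1 -> the shared prefix was computed only once
```

Making such a cache global, shared across nodes and datacenters rather than pinned to one GPU, is the harder systems problem the keynote alludes to.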

Each feature solves a practical engineering challenge, but together they start to resemble the infrastructure needed to manage a full-fledged economy of autonomous agents.

He ended the keynote by stepping briefly away from architecture. AI, he said, may eventually produce extraordinary abundance. Software, knowledge, and scientific discovery could become dramatically cheaper to generate. But abundance does not automatically produce wisdom. Technology can accelerate discovery, but it cannot determine what humanity should value.

“Extreme personalization means that our medicine will be personalized and designed especially for us. Our entertainment, every movie, will be generated once and viewed once and then thrown away. Extreme abundance means that everything will be disposable. Agentic scientists will be able to test more theories in a second than the entirety of the human race was able to generate in a generation.”

Looking across a room filled with engineers, researchers, and architects, he framed the coming decade as a shared responsibility. The infrastructure being built today may shape the behavior of the first generation of machines capable of reasoning at scale. The story of VAST, as he told it, is not about storage. It is about infrastructure evolving quietly until it becomes the invisible operating system beneath a new form of intelligence.

And if his thesis proves correct, the most important system in the AI era may not be the models we interact with. It will be the one keeping track of everything they know.
