Welcome to my personal breakdown of the opening of 3DEXPERIENCE World 2026 in Houston, Texas. If you couldn’t be there in person with thousands of designers, engineers, and makers, don’t worry: I followed the sessions closely and pulled together what really matters.
This year already feels different. The event broke records as the most-watched stream in the history of the community, and honestly, it’s not surprising. From the first minutes, the atmosphere was intense. One topic clearly dominated everything else: Artificial Intelligence, not as hype, but as something very real for engineering.
Manish KUMAR, CEO of SOLIDWORKS, opened with a message that felt refreshingly honest. He didn’t ignore the fears around AI: job security, relevance, replacement. Instead, he addressed them directly. His position was simple and clear: AI is the engine, but engineers are still the ones driving.
AI is excellent at handling repetitive, pattern-based work. But engineering value doesn’t disappear because of that—it shifts. To explain where we are today, Manish used one of the best analogies I’ve heard so far.
He compared AI to the discovery of fire. Early humans didn’t discover fire to build civilisation. They used it for warmth and protection. Only later did it lead to smelting, furnaces, and the industrial world. The same thing happened with the steam engine: it started as a pump for flooded mines long before it powered factories and trains.
According to Manish, we’re currently in the “spark phase” of AI. We have the fire (text summaries, image generation, code debugging), but we haven’t yet built the furnace or cooked the meal. That next wave is coming, and engineers will decide what AI actually powers.
This analogy really works. It takes AI out of the “panic zone” and puts it back where it belongs: as a tool. That said, I do think this spark is spreading fast. AI is moving quicker than many traditional engineering cycles, and we need to stay alert, not afraid, just realistic.
One of the most powerful lines of the keynote was this: “A brain needs a body.”
AI might be smart, but without physical systems, it’s useless. Manish illustrated this with a mechanical dog. AI can learn how to walk, but it still needs joints that move smoothly, structures that carry loads, and systems that manage heat.
AI doesn’t break the laws of physics. Engineers still have to design machines that survive real life: slipping floors, unexpected impacts, thermal limits, material fatigue. Even the data centres powering AI rely on massive mechanical, hydraulic, and thermal systems.
The physical world remains the industrial backbone of AI.
This was a much-needed reality check. You can have the smartest algorithm in the world, but if an actuator fails or a system overheats, that intelligence goes nowhere. I love this renewed focus on physical engineering. Software may evolve quickly, but making things work in the real, messy world is still where engineers truly shine.
Next on stage was Pascal Daloz, CEO of Dassault Systèmes, who explained how AI is being embedded into what they call 3D Universes. His message was clear: this isn’t AI as a gimmick or a black box; it’s AI inside engineering.
To make this real, they introduced three Virtual Companions:
Aura manages context and knowledge: requirements, changes, and project flow
Marie focuses on science: materials, chemistry, formulations, and compliance
Leo handles engineering reasoning: structures, mechanics, motion, and manufacturing
These aren’t generic chatbots. They’re built on decades of industrial knowledge. When asked about materials for an e-foil wing, Marie answered from a material science perspective, while Leo evaluated mechanical trade-offs like fatigue and drag.
Specialised AI personas make a lot of sense. One general AI can only go so deep. Having Marie for science and Leo for mechanics feels far more aligned with how real engineers think. My only concern is usability. If switching between companions feels heavy, adoption could suffer. But if they truly collaborate behind the scenes, this could be a big productivity win.
Manish then took a bold step: a live demo of Leo.
The standout moment was “Drawing to Sketch.” He selected a PDF drawing, and Leo instantly filtered out title blocks, ignored isometric views, read every dimension, and converted the drawing into a fully parametric 3D model—in seconds.
Leo also demonstrated surrogate modelling, using historical data and design-of-experiments methods to deliver near-instant finite element results without waiting for traditional solvers.
Turning PDFs into parametric models is the dream of anyone who’s dealt with legacy data. That alone is huge. Surrogate modelling is powerful too but it needs caution. If AI is estimating based on past data, engineers must still validate final designs properly. Speed is great, but trust must be earned.
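To make the surrogate idea concrete: a surrogate replaces an expensive solver with a cheap model fitted to a handful of design-of-experiments runs, then answers new queries near-instantly. This is a minimal illustrative sketch, not Dassault’s actual implementation; the `expensive_fe_solve` function is a made-up analytic stand-in for a real finite element run.

```python
import numpy as np

def expensive_fe_solve(load_kN: float) -> float:
    """Stand-in for a full FE run: max deflection (mm) of a beam
    under a point load. In practice each call would take minutes."""
    return 0.8 * load_kN + 0.002 * load_kN**2

# Design of experiments: a small set of solver runs across the load range.
doe_loads = np.linspace(5.0, 50.0, 8)
doe_results = np.array([expensive_fe_solve(p) for p in doe_loads])

# Surrogate: a quadratic least-squares fit to the DOE data.
coeffs = np.polyfit(doe_loads, doe_results, deg=2)
surrogate = np.poly1d(coeffs)

# Near-instant prediction at an unsampled load, checked against the solver.
query = 27.3
pred = surrogate(query)
truth = expensive_fe_solve(query)
print(f"surrogate: {pred:.3f} mm, solver: {truth:.3f} mm")
```

The point of the sketch is the caveat from the keynote discussion: the surrogate is only as good as the data it was fitted on, so predictions outside the sampled design space still need proper validation with the real solver.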
The discussion then expanded with Morgan Zimmerman, CEO of the 3DEXPERIENCE brand. When asked what the brand stands for, his answer was simple but strong: helping users virtualise knowledge and know-how so they can compete in the generative economy.
Manish illustrated this with a personal story about building a table during COVID. Without woodworking knowledge, things like biscuit joints or load calculations, his first table was massively over-engineered. The second project went much smoother because the knowledge was already “virtualised” in his head.
That’s the goal of the platform: capture decades of company knowledge and use AI to feed that expertise back to designers at the right moment.
This idea of virtualising engineering heritage is powerful. So much knowledge disappears when experienced engineers retire. If the platform can truly capture the “why” behind designs, it becomes incredibly valuable. Also, the light humour between Manish and Morgan helped cut through the corporate language, it felt refreshingly human.
3DSWYM was also reintroduced—not as a simple communication tool, but as a 3D-aware personal notebook. It brings tasks, discussions, and reviews together, all connected directly to 3D data.
Because it’s AI-native, it feeds context to the virtual companions. Designs can be shared in one click and accessed from anywhere, even a phone.
Engineers think in 3D, so our tools should too. This could finally close the gap between management tools and design tools. My only hope is that it helps focus work—not just adds more notifications.
One of the most practical announcements: you can now buy SOLIDWORKS online directly.
The offering is simplified:
SOLIDWORKS X for browser-based design
SOLIDWORKS Design for the desktop experience
After purchase, everything starts from 3ds.com (and soon solidworks.com). No lost links, no email confusion. The new Simplified Compass shows only essential apps, clearly explained—no more “747 cockpit” feeling.
And updates? Down from 45 minutes to 90 seconds, with the option to stay on a version for up to two years.
This is a massive win. The platform has long been powerful but intimidating. Simplifying access, visuals, and updates shows real listening. A 90-second update alone deserves applause.
ENOVIA’s Content Explorer might be the quiet hero of the keynote. It brings a Windows-Explorer-style experience to the platform—fast navigation, no heavy indexing, and smart views for recent and favourite content.
Parts can live in multiple folders without duplication, just like photos on a phone. There’s even a dedicated filter for SOLIDWORKS users.
This is the bridge many users have been waiting for. Folder-based thinkers finally get a cloud system that feels natural. It’s modern data management without forcing people to abandon familiar workflows.
3DEXPERIENCE World 2026 made one thing very clear: AI is accelerating fast, but engineers are still firmly in control. With tools like Leo and Marie, and a platform that finally feels intuitive, the future of design looks both advanced and practical.

That said, some caution is warranted. Last year, SOLIDWORKS generated significant excitement among CAD professionals with its AI-driven capabilities, yet, as many anticipated, those features did not mature into technologies accessible to end users. This year, we watched similar demonstrations with equal interest. The key question remains: are these showcased capabilities the result of extensive internal testing, or will end users be able to evaluate them shortly after the event?