The next releases of SOLIDWORKS are not just about adding new features. They represent a clear shift in how engineers will design, analyze, and make decisions in the coming years.
If you’ve ever attended a general session at 3DEXPERIENCE World, you know the feeling. Half the room needs more coffee, and the other half is thinking, “Wait… did he really just say the computer can do that?” Day Two of the 2026 conference in Houston felt exactly like that, but on another level.
Manish Kumar, CEO of SOLIDWORKS, took the stage and clearly shifted the conversation. This wasn’t about adding another AI button to click. It was about moving toward Virtual Companions—AI that works with you, not just for you.
As someone who has spent years navigating feature trees and menus, the idea of a digital design partner is exciting… and honestly, a little intimidating. Let’s break down what Manish shared, from creative inspiration with Aura to serious engineering work with Leo.
The main message of Manish’s presentation was clear: AI is no longer a separate feature. It’s becoming embedded intelligence.
He introduced two AI personas:
Aura – focused on exploration and ideas (“what if”)
Leo – focused on execution and expertise (“how to”)
Aura lives inside SOLIDWORKS and connects you to your own knowledge. When starting a new project—Manish used a KiTbot as an example—you can ask Aura for inspiration. She searches your private files, supplier data, and internal resources without sending anything to the public web. Your IP stays protected.
To move from ideas to real design, Leo takes over. Using a Battlebot example, Manish showed how Leo can generate the assembly structure for you. Instead of manually creating every part and sub-assembly, Leo gives you a structured starting point so you don’t have to begin from scratch.
I think Aura makes a lot of sense for safe idea exploration, especially since it works only with your own data. Leo’s assembly structure feature is promising and can save time at the start of a project, as long as it stays flexible and aligns with how engineers actually work.
One of the most relatable moments was the story of Priyanka and her son Kabir. A simple real-life problem—carrying a cup—led to the Image-to-Mesh feature. You can now take a photo of a real object and convert it into a 3D mesh directly inside SOLIDWORKS. The goal isn’t perfect geometry, but fast context.
We’ve all received a STEP file from a supplier—no features, no history, just a solid block.
Manish showed how Leo can analyze these files and convert them into real parametric features. That means you can edit hole sizes, lengths, and patterns instead of rebuilding everything from zero.
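To make the idea concrete, here is a toy sketch of what feature recognition on “dumb” imported geometry boils down to: scan the faces of a solid and turn internal cylindrical faces into editable hole parameters. Real feature recognition works on full B-rep topology; this flat data layout and these face records are purely hypothetical.

```python
# Toy feature recognition: recover editable hole features from a
# featureless imported solid. The face-record format is illustrative.

def recognize_holes(faces):
    """Return editable hole features for every internal cylindrical face."""
    holes = []
    for f in faces:
        if f["type"] == "cylinder" and f["internal"]:
            holes.append({"diameter": 2 * f["radius"], "depth": f["length"]})
    return holes

imported = [
    {"type": "plane",    "internal": False},
    {"type": "cylinder", "internal": True,  "radius": 3.0,  "length": 12.0},
    {"type": "cylinder", "internal": False, "radius": 20.0, "length": 40.0},
]

print(recognize_holes(imported))  # one editable hole: diameter 6.0, depth 12.0
```

Once the geometry is expressed as parameters like these, changing a hole size becomes an edit instead of a rebuild, which is exactly the promise of the demo.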
Broken references, failed features, missing mates—we all deal with them.
Manish showed that Leo can now analyze these problems, explain why they happened, and suggest how to fix them. The long-term goal is automatic repair, so you can stay focused on designing.
Leo also tackles assembly performance. He can scan an entire assembly, find what’s slowing it down—fasteners, high-detail imports, complex springs—and recommend fixes like simplifying geometry or suppressing details.
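The triage Leo performs can be imagined as something like the following sketch: rank components by graphics triangle count and flag the heaviest ones as simplification candidates. The component names, counts, and scoring rule are illustrative assumptions, not details from the demo.

```python
# Hypothetical assembly-performance triage: find the components that
# contribute the most display geometry and suggest simplifying them.

def heaviest_components(components, top_n=3):
    """Return the top_n components by triangle count, heaviest first."""
    return sorted(components, key=lambda c: c["triangles"], reverse=True)[:top_n]

assembly = [
    {"name": "imported_gearbox.step", "triangles": 1_200_000},
    {"name": "helical_spring",        "triangles": 450_000},
    {"name": "m6_fastener (x240)",    "triangles": 380_000},
    {"name": "base_plate",            "triangles": 12_000},
]

for comp in heaviest_components(assembly):
    print(f"{comp['name']}: {comp['triangles']:,} triangles -> consider simplifying")
```

In practice the interesting part is the recommendation that follows the ranking (suppress, lightweight, or simplify), but the diagnosis itself is this kind of sorting problem.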
Instead of digging through a BOM, you can now talk to your model.
You can ask:
“Which parts are made of aluminum?”
“Change all steel parts to titanium.”
This also applies to Generative Drawings, where you can tell Leo your standards, sheet sizes, and requirements before the drawing is created.
Very powerful—but also a little scary. Changing materials across an entire assembly with one sentence is great… as long as undo is instant.
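Under the hood, those two sentences resolve to fairly simple operations on the bill of materials: a filter and a bulk update. The sketch below shows the idea with a hypothetical BOM structure and part names; the actual data model in SOLIDWORKS is of course richer.

```python
# Toy BOM: what "talking to your model" might translate to internally.

bom = [
    {"part": "bracket", "material": "aluminum"},
    {"part": "axle",    "material": "steel"},
    {"part": "housing", "material": "aluminum"},
    {"part": "gear",    "material": "steel"},
]

def parts_of(bom, material):
    """'Which parts are made of <material>?'"""
    return [p["part"] for p in bom if p["material"] == material]

def change_material(bom, old, new):
    """'Change all <old> parts to <new>.'"""
    for p in bom:
        if p["material"] == old:
            p["material"] = new

print(parts_of(bom, "aluminum"))
change_material(bom, "steel", "titanium")
print(parts_of(bom, "titanium"))
```

The hard part the AI adds is not the filter itself but mapping a free-form sentence onto the right filter, which is why an instant, reliable undo matters so much.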
The Virtual Companions don’t stop at design. Aura is being integrated into DELMIAWorks and ENOVIA, bringing ERP and MES information directly into SOLIDWORKS.
This means designers can ask simple questions like:
“What’s the status of this order?”
“When does this part ship?”
All without leaving the CAD environment. Design, manufacturing, and operations start to feel connected instead of separated.
ERP integration is a big win. Design and manufacturing often feel like two different worlds. Having access to cost, order status, and delivery information while designing helps engineers make better and more realistic decisions.

Manish also demonstrated guided simulation using Leo. In one example, Leo walked a user through a drop test for a medical device.
You don’t need deep simulation knowledge. Leo helps define loads, constraints, and setup, while making sure the physics stay correct. The goal is to make simulation more accessible without turning it into a black box.
Making simulation easier is powerful, but it also comes with risk. If users don’t understand the physics behind the results, they may trust them too easily. I hope Leo explains why something passes or fails—not just shows a green checkmark.
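Part of “keeping the physics correct” is just getting the inputs right. For a drop test, the key input is the impact velocity, which for a free fall follows from v = sqrt(2·g·h). The 1 m drop height below is an assumed example, not a figure from the demo.

```python
# Basic drop-test input: impact velocity for a free fall from height h.
import math

def impact_velocity(height_m, g=9.81):
    """Impact velocity (m/s) for a free fall from height_m meters."""
    return math.sqrt(2 * g * height_m)

v = impact_velocity(1.0)
print(f"Impact velocity from a 1 m drop: {v:.2f} m/s")  # ~4.43 m/s
```

A guided tool that shows this step, rather than hiding it, is exactly what keeps the workflow from becoming a black box.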
One of the most talked-about moments of the session was the live demo based on the Model Mania 2025 challenge. Manish shared that a friend had teased him, saying the model was too simple. His response was to show what modeling looks like when you stop using a mouse altogether.
He demonstrated Text-to-CAD and Speech-to-CAD by literally talking to the software. Using Leo, he asked for a model with specific dimensions, told the system to select all vertical edges, and then had it add holes on the top and side faces. The entire model was created step by step, without clicking a single command icon.
The message was clear: CAD is moving from a clicks-and-commands workflow to one driven by intent and conversation.
Looking ahead 6–12 months, Manish showed the most impressive demos of the session.
First, Speech-to-CAD: a user created a model entirely by talking—adding holes, edges, and features without clicking commands.
Then came the Water Tank Challenge.
A user asked Leo to design a cylindrical steel water tank with:
500,000-liter capacity
Specific diameter limits
In under five minutes, Leo generated a complete, functional model, including structural frames. A simulation followed immediately to verify it could handle the load.
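The sizing arithmetic behind that challenge is worth making explicit: 500,000 liters is 500 m³, and for a cylinder the required height follows from V = π·(d/2)²·h. The 10 m diameter below is an assumed example; the demo’s actual diameter limits were not specified.

```python
# Cylinder sizing for the Water Tank Challenge: 500,000 L = 500 m^3.
import math

def required_height(volume_m3, diameter_m):
    """Height (m) of a cylinder holding volume_m3 at the given diameter."""
    return volume_m3 / (math.pi * (diameter_m / 2) ** 2)

h = required_height(500.0, 10.0)
print(f"A 10 m diameter tank needs ~{h:.2f} m of height")  # ~6.37 m
```

The AI’s real contribution is everything after this arithmetic, such as the structural frames and the verification run, but the constraint-satisfaction starting point is this simple.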
None of this happens without serious computing power.
Pascal Daloz, CEO of Dassault Systèmes, was joined by Jensen Huang, CEO of NVIDIA, to talk about the Generative Economy and the World Model: AI grounded in real physics, not just language. This means Leo isn’t guessing. He understands gravity, friction, and cause-and-effect.
This partnership makes the AI much more trustworthy than a generic chatbot. Physics-based AI is exactly what engineering needs.
That said, this power comes at a cost. As Jonathan Patton from Intel mentioned, running these tools smoothly will require strong workstations—Xeon CPUs and high-end GPUs. The future is powerful, but it won’t be cheap.
The future of SOLIDWORKS is all about working smarter with AI companions like Aura and Leo. They help speed up design, reduce tedious tasks, and keep you focused on engineering. While exciting, it’s important to stay involved and make sure the AI’s work makes sense. The tools are here to assist — not replace — your creativity and expertise.