By Sam Smith-Eppsteiner and Thilo Braun
Hardware is everywhere, from the phones we carry and the planes we travel on to the furniture in our homes. The pace at which we develop and build hardware drives how fast we can deploy critical new technologies in areas such as clean energy and defense. Yet despite significant improvements over the past decades, including rapid prototyping technologies, developing and making hardware remains largely slow and tedious. This is in stark contrast to software engineering, where modern tools, copilots, and continuous testing enable rapid product development. To reach the sci-fi utopias we've imagined of abundant energy, flying vehicles, and space exploration, we need new approaches to building hardware. How do we make engineering atoms look more like bits?
We believe that we are at an inflection point: the Super Evolution will have profound consequences for how we design and make things. We envision a future world of "intent to action," where a product designer can describe their design intent and, with little to no manual intervention, receive a physical product. While this world remains in the future, many of the building blocks to get us there will solve real and meaningful problems today.
“Intent to action”
Several drivers put us at an inflection point that will accelerate progress toward this future:
This post will focus on the engineering process for discrete products. In a future post, we will do a deep dive into manufacturing.
Status Quo
Today, hardware engineering is largely a linear process with information primarily flowing downstream. Decisions early on drive the bulk of a product’s cost.
Several studies across different industries have found that more than 70% of life-cycle cost is committed by the end of the concept engineering phase, when requirements are frozen, while only a small fraction of life-cycle cost has actually been incurred by that point. Today's engineering processes are prone to suboptimal early decisions and are slow to correct them, if they are corrected at all.
The predominant challenges from today’s hardware engineering process include:
Future vision
The quality of a design process can be measured by both its efficiency and its effectiveness. Efficiency is how quickly a design is created and how many resources the process requires. Effectiveness is how well the product meets its requirements and optimization functions, including weight, durability, manufacturing cost, maintainability, embodied carbon footprint, and so on.
Humans are not particularly good at structured design space exploration. In 1997, Deep Blue beat world champion Garry Kasparov at chess. The computer achieved this by systematically evaluating the universe of future moves: the 'design space' of chess. In hardware engineering, we anticipate that computers will similarly outperform humans at systematically exploring a design space, evaluating different design choices against the optimization functions defined in the design intent. Such optimization functions specify the priority of different design trade-offs, for example how many additional dollars a product may cost in exchange for shaving another kilogram of weight off the design.
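To make this concrete, here is a minimal sketch in Python of how such an optimization function might be encoded. All of the numbers, the $500-per-kilogram trade-off, and the random candidate generator are hypothetical stand-ins for illustration, not any particular tool's method:

```python
from dataclasses import dataclass
import random

@dataclass
class Design:
    mass_kg: float
    unit_cost_usd: float

# Design intent: a design may cost up to $500 more for every kilogram it saves.
VALUE_PER_KG_SAVED = 500.0

def objective(design: Design) -> float:
    # Lower is better: unit cost plus the dollar-equivalent penalty for mass.
    return design.unit_cost_usd + VALUE_PER_KG_SAVED * design.mass_kg

def random_candidate() -> Design:
    # Stand-in for a generative design or parametric CAD service proposing candidates.
    return Design(mass_kg=random.uniform(1.0, 5.0),
                  unit_cost_usd=random.uniform(100.0, 400.0))

# Design space exploration, reduced to its simplest form: score many candidates
# against the design intent and keep the best one.
candidates = [random_candidate() for _ in range(10_000)]
best = min(candidates, key=objective)
print(best, objective(best))
```

In practice the candidate generator would be a generative design service and the objective would include many more terms (durability, carbon footprint, maintainability), but the principle of encoding trade-offs as an explicit function a machine can optimize against stays the same.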
Working backward from a future where computers are able to conduct systematic design space explorations for complex engineering systems, several key pieces are needed:
Generative design
Generating new engineering designs based on requirements and text inputs. Several companies, including Quilter and nTop, are building generative design tools at the component level, while Zoo and others are working on Text-to-CAD platforms. In the future, generative design will likely span from industrial design to engineering considerations. These tools must integrate aesthetic design with functional engineering and systems reasoning.
We anticipate that new foundation models built for 3D systems will be needed to achieve the full potential here. Such models would enable generating and iterating on complex 3D assemblies based on input functional requirements. The key challenge to overcome is aggregating the training data needed to develop such a foundation model.
We're excited to see new generative design tools that solve real engineering pain points today, as well as creative solutions for building training data sets.
Simulation
Physics-based simulation today is slow and computationally intensive. Physics-informed machine learning has the potential to accelerate simulation by several orders of magnitude, returning results that closely approximate traditional solvers in seconds to minutes rather than days to weeks. Faster simulation enables more systematic and exhaustive exploration of design spaces: multi-variate optimization of systems with little human intervention to find system optimums, rather than slow iterations driven by human intuition. Example companies in this space include Navier AI, Beyond Math, PhysicsX, and SimScale. We're still in the first inning and look forward to novel approaches to simulation.
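As an illustration of the underlying idea (a minimal sketch, not any of these companies' methods), here is a physics-informed loss in Python using JAX: a small network u(x) is trained to both fit sparse data and satisfy a toy 1D Poisson equation u''(x) = f(x) at collocation points, so the governing physics constrains the surrogate where measurements are missing.

```python
import jax
import jax.numpy as jnp

def init_params(key, sizes=(1, 32, 32, 1)):
    # Small fully connected network: (weights, bias) per layer.
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def u(params, x):
    # Surrogate solution u_theta(x): scalar in, scalar out.
    h = jnp.array([x])
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]

def f(x):
    # Known source term of the toy problem u''(x) = f(x).
    return -jnp.sin(x)

def residual(params, x):
    # PDE residual u''(x) - f(x), computed with nested automatic differentiation.
    u_x = jax.grad(u, argnums=1)
    u_xx = jax.grad(u_x, argnums=1)
    return u_xx(params, x) - f(x)

def loss(params, x_data, y_data, x_colloc):
    # Data term: fit the few measurements we have.
    data_term = jnp.mean((jax.vmap(lambda xi: u(params, xi))(x_data) - y_data) ** 2)
    # Physics term: penalize violations of the governing equation at collocation points.
    phys_term = jnp.mean(jax.vmap(lambda xi: residual(params, xi))(x_colloc) ** 2)
    return data_term + phys_term

params = init_params(jax.random.PRNGKey(0))
x_colloc = jnp.linspace(0.0, jnp.pi, 64)
x_data = jnp.array([0.0, jnp.pi])   # boundary observations u(0) = u(pi) = 0
y_data = jnp.zeros(2)
grads = jax.grad(loss)(params, x_data, y_data, x_colloc)   # feed into any optimizer
```

Production systems operate on far richer geometry and physics, but the core trick is the same: once trained, evaluating the surrogate costs milliseconds, which is what makes exhaustive design space sweeps tractable.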
Verification and validation
Verification and validation (V&V) is the process of ensuring that, if the requirements are met, the product will fulfill its intended use ('Are you building the right thing?'), and that those requirements have in fact been met ('Are you building it right?'). V&V can make up 10% or more of project time and cost for complex engineering projects, but lacks dedicated modern software tooling. The process includes planning and executing tests, interpreting results, and documenting them for safety and certification purposes. Today, these workflows are largely handled in a combination of requirements management systems (Polarion, DOORS, etc.), document management systems, Word, and Excel. Dedicated tools could not only make these workflows more seamless, but, with Generative AI, could also automate significant parts of extracting information from regulations, drafting verification plans, and interpreting results. This would let engineers focus on the more intellectual parts of the job while reducing product risks from oversights in the verification process, such as Rivian's recent recalls for non-conformances with headlamp and backup lamp regulations. Startups operating in this space include Nominal, Stell, Cadstrom, and Valispace.
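To illustrate the kind of bookkeeping such tooling would take off engineers' plates, here is a minimal sketch in Python of a requirements-to-test traceability check. The requirement and test IDs are hypothetical, and a real tool would pull these records from the requirements and document management systems listed above rather than hard-coding them:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str

@dataclass
class Test:
    test_id: str
    verifies: list = field(default_factory=list)  # requirement IDs this test covers
    passed: bool = False

# Hypothetical records for illustration only.
requirements = [
    Requirement("REQ-001", "Headlamp beam pattern conforms to FMVSS 108"),
    Requirement("REQ-002", "Backup lamp activates within 500 ms of selecting reverse"),
]
tests = [Test("TST-010", verifies=["REQ-001"], passed=True)]

# Traceability check: every requirement needs at least one verification activity.
covered = {rid for t in tests for rid in t.verifies}
unverified = [r for r in requirements if r.req_id not in covered]
failing = [r for r in requirements
           if any(r.req_id in t.verifies and not t.passed for t in tests)]

for r in unverified:
    print(f"{r.req_id} has no verification activity planned: {r.text}")
for r in failing:
    print(f"{r.req_id} is covered but its test has not passed: {r.text}")
```

Generative AI would sit on top of a structure like this: extracting candidate requirements from regulations, proposing tests for uncovered requirements, and summarizing results into certification documents.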
An additional component of verification is ensuring that the product performs as intended in real-world use. Today, warranty data is often buried in broad codes and handwritten notes. Other information from usage may make its way back to an organization in the form of telematics, Jira tickets captured by customer representatives, or threads hidden in online message boards such as Reddit. We believe that new tools will help connect the dots and capture issues faster, as well as ensure that findings are implemented in future product iterations. Companies working at this intersection include Axion Ray and Pull Systems.
Design for X
Today, design for X considerations are often an afterthought. Design engineers may have hundred-page design guideline documents to work with and will manually review designs with manufacturing engineers. There is little quantification of the cost impact of design decisions, beyond some Excel models (or rudimentary tools such as McKinsey's Cleansheet and Teamcenter product costing) for high-level cost estimates. Better tools are needed to evaluate the impact of design decisions and enable informed design trade-offs across manufacturability, assembly, supply chain, and cost considerations. This spans everything from initial tolerance stacks and manufacturing process selection for new products to quantifying where the highest 'bang for buck' lies when iterating on a design to reduce product cost.

Beyond the design itself, there is a lack of tools to support translating designs into manufacturing, including creating the manufacturing bill of materials (M-BoM) and assembly sequence, drawings, and work instructions. These activities are tedious, and engineers often see them as a distraction from their core work. Startups tackling this space include Threaded, Dirac, and Drafter. The interface between design and manufacturing remains underserved, and we're keen to see new solutions.
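As one concrete example of the quantification that today lives in ad-hoc Excel models, here is a minimal sketch in Python of a one-dimensional tolerance stack-up, comparing a worst-case estimate with a statistical root-sum-square (RSS) estimate. All dimensions are hypothetical:

```python
import math

# Each toleranced dimension in the stack: (nominal_mm, plus/minus tolerance_mm).
stack = [(10.0, 0.10), (25.0, 0.15), (5.0, 0.05)]
housing = (40.5, 0.20)  # the envelope the stacked parts must fit inside

nominal_gap = housing[0] - sum(nom for nom, _ in stack)
worst_case = housing[1] + sum(tol for _, tol in stack)                # every tolerance at its limit
rss = math.sqrt(housing[1] ** 2 + sum(tol ** 2 for _, tol in stack))  # statistical estimate

print(f"nominal gap {nominal_gap:.2f} mm, "
      f"+/-{worst_case:.2f} mm worst case, +/-{rss:.2f} mm RSS")
```

A dedicated DfX tool would run analyses like this continuously against the live CAD model and attach cost and yield implications to each tolerance choice, instead of leaving the trade-off to manual review meetings.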
Design orchestration and collaboration
Putting it all together is key. To enable faster design iterations and better engineering designs, pulling together data and orchestrating the various parts of the design process is critical. We believe that design engineers will increasingly spend their time orchestrating design processes, deciding design trade-offs, and refining design inputs. We are excited by new design orchestration platforms that have the potential to become core to the hardware engineering process. Players working on this include Generative Engineering and Synera. As the tools described above mature, we envision engineers spending an increasing amount of time in these orchestration platforms, where specialized services such as design and simulation are run by agents.
In the meantime, with various disciplines and stakeholders needing to work together today, new spaces for collaboration are needed. In particular, data is distributed across various formats and locations, shared over emails, calls, and PowerPoint presentations. This hinders effective information flow and decision making. We see immediate opportunities to improve the data fabric, documentation, and collaboration for hardware engineering. Some companies tackling this include SygmaHQ, Violet Labs, and Quarter20.
What is needed to build a large company
The industrial landscape is littered with startups that failed to reach scale. Nonetheless, it is a large market, with over 1,000,000 hardware engineers in the U.S. alone (BLS). Through our experience working with companies in this field, we have identified several key tenets for building a successful software company serving industrials:
Closing thoughts
We're at a critical inflection point for hardware engineering. To get where we want to go, we need to build in the real world, and we need to do it faster, more effectively, and more sustainably than ever before. Legacy industries are redefining products for a carbon-free world. The war in Ukraine has made the need for a new generation of defense systems starkly apparent. At the same time, on the enabling technology side, Generative AI and physics-informed ML simulation are enabling a new generation of engineering tools.
We’re excited to talk to builders building for builders — we see a huge opportunity in enabling and accelerating the future of hardware.