We are in the midst of a transformation to Industry 4.0, what McKinsey calls "the next phase in the digitization of the manufacturing sector." The transformation is made possible by the newfound ability of design data to originate as either atoms (physical) or bits (digital); to be converged, refined, and optimized; and then to be recast as a "new and improved" iteration of its original self.
Competition in the product development industry is constant and fierce, so every ounce of performance advantage, cost savings, and productivity gain matters. The ability of design data to travel from atoms to bits and back again will drive exponential productivity gains and fundamentally reshape our world.
The following disruptive forces are behind this shift to Industry 4.0:
- Internet of Things (IoT) and 3D scanning: data collection everywhere, driven by declining manufacturing costs and ubiquitous connectivity
- Artificial intelligence (AI): prediction-based design, driven by parallel processing and scaling, along with the mass availability of data
- Virtual reality (VR): design immersion, driven by GPU performance and low-cost head-mounted displays
- Additive manufacturing: 3D printing, driven by improvements in quality, new materials, and reduced operating costs
Imagine a world where the products you use every day collect data on their usage, either to help you use the product better or to help the manufacturer improve its design. This is happening now, all around us, through IoT sensors embedded in everyday products from cars to toothbrushes. In mission-critical situations, such as a rocket launch, sensors provide real-time telemetry that lets operators spot problems and make adjustments on the fly. In non-mission-critical situations, the manufacturer can collect usage and performance data for analysis of real-world product use, reflecting the massive variance in environmental and usage conditions. Armed with all of this data, manufacturers can readily spot opportunities for design improvements.
As LIDAR (light detection and ranging) sensors rapidly proliferate in autonomous drones and self-driving cars, it is only a matter of time before an open-standard, real-time 3D model of the world emerges. Until then, companies like Google will be hard at work scanning the world to produce the data required for autonomous navigation. Now imagine this geophysical data coupled with data from IoT sensors and other sources, such as weather. From this rich tapestry of converged data emerges a fully digital facsimile of the real world, perfect for running "digital" experiments that would otherwise never have been possible.
In a digital world filled with more data than any number of humans could ever process, the obvious question is: what next? Enter AI, with its ability to simultaneously learn and self-replicate, giving it a theoretically unlimited capacity to process and interpret data. Where it gets really interesting, though, is what happens next: conclusions AI draws from real-world data can be applied to incremental or hypothetical data, in effect simulating scenarios that have not yet occurred in the real world. This kind of heavy lifting goes beyond traditional CAE by eliminating the need for humans to define the test cases and variables. Where humans can't think of everything, AI can. The net outcome is hyper-iteration through the design/test product development cycle, resulting in generative, evolutionary, data-driven design.
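To make the idea of evolutionary, data-driven design concrete, here is a minimal sketch of the design/test iteration loop. This is a toy illustration, not real generative-design software: the objective function (an invented trade-off between a part's weight and its deflection under load) and all parameter values are assumptions made up for this example.

```python
import random

def fitness(thickness):
    # Toy cost model: weight grows with thickness, deflection shrinks
    # with it, so the best "design" balances the two. Lower is better.
    weight = thickness
    deflection = 1.0 / thickness ** 3
    return weight + deflection

def evolve(generations=60, pop_size=20, seed=0):
    # Each candidate design is a single parameter (a wall thickness, say).
    rng = random.Random(seed)
    population = [rng.uniform(0.2, 5.0) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)           # rank designs by the test
        parents = population[: pop_size // 4]  # keep the fittest quarter
        # Refill the population with mutated copies of the survivors.
        population = parents + [
            max(0.05, rng.choice(parents) + rng.gauss(0, 0.1))
            for _ in range(pop_size - len(parents))
        ]
    return min(population, key=fitness)

best = evolve()
```

In a real Industry 4.0 pipeline, the fitness function would be replaced by simulation against the converged real-world data described above, and the candidate designs would be full CAD geometries rather than a single number; the loop structure, however, is the same.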
Where does this all lead? Back to atoms, of course. Rapid digital iteration demands an equally rapid physical counterpart. What's interesting about 3D printing, though, is not just the obvious benefits (rapid iteration of physical prototypes, mass personalization, and economical small-batch production) but the ability to design and create physical objects that are unknown to us today. This may sound like science fiction, but it's not. Additive manufacturing offers the possibility of controlling the design and production of materials and objects at a molecular level. We are accustomed to specifying materials homogeneously at the part level, but with molecular-level control, we can design and simulate heterogeneous materials in a single object, resulting in parts whose functionality is informed as much by materiality as by mechanics.
You may be wondering where you fit into all of this. Have no fear: the need for engineers is not going away. While the work you do may change, your creativity and ingenuity are not going to be replaced by a machine anytime soon. Engineers and designers will, however, need to evolve how they interact with all of this rich data and endless complexity. Can we really imagine a future where all of this possibility coexists with a keyboard, mouse, and two-dimensional display as the primary user interface? Enter virtual reality, which enables engineers, designers, and other stakeholders to become one with the design, at the right scale and in the appropriate context, regardless of actual physical limitations. In the virtual world, we can hold a team meeting inside a nanomotor, collaboratively study a visual simulation of engine thermodynamics at scale as we cycle through operating conditions, or take a test drive in our new concept rover on Mars.
The Cycle Continues
Not too long ago, we transitioned from paper to PCs. The result was an exponential improvement in our work and production processes, but that progress ultimately tapered into slow, linear progression. The next massive technology disruption, Industry 4.0, is upon us, and it will bring massive productivity improvements and an exponential acceleration of the design process. We are standing on the precipice of fundamental change in the way we work, and the smartest among us are already getting ahead of this tidal wave to develop and sustain a competitive advantage.
The technologies discussed in this article are available today in one form or another, and it won't be long before they converge and, ultimately, drive exponential change. To illustrate the point: 3D printers will soon "print" IoT sensors directly into the computationally generated parts they produce, perpetuating the new reality of atoms to bits, and back again.
About the Author
Sean leads HP’s Global Industry Segment team, responsible for strategic alliances with software and hardware partners, industry engagement, and segment strategy. In their spare time, Sean and his team manage HP Mars Home Planet, the global mission to design a civilization on Mars for 1 million humans and bring it to life in VR. Previously Sean was a product manager at Autodesk and Corel, focused on 3D, graphics, and design visualization. Sean resides in Colorado and holds an MBA from Queen’s University in Canada.