@Jonnoxx thank you for this very thoughtful post; I appreciate the care that went into it.
I am adding some points and elaborations:
Look at what Siemens, Dassault Systemes, Intergraph/Hexagon, and Bentley are doing. They have more mature visions and more advanced methods and tools.
Who knows whether it will be Microsoft, Google, Apple, or some other tech giant that takes the AEC industry. Any of them could buy most of the major BIM authoring tool vendors outright with only a fraction of their free cash, though I don't imagine there is much interest in doing so. I could see them instead buying Fluor, Jacobs, Stantec, HOK, or any of the major AEC/EPC firms, as well as Siemens, Schneider Electric, Rockwell Automation, Honeywell, and Johnson Controls. They don't need legacy code. They already have software development domain knowledge, systems engineering domain knowledge, and vast user data. What they need is design, construction, and operations domain knowledge, and facilities operations data. I don't think they will have much interest in BIM authoring tool vendors or the BIM authoring tool market, except perhaps as acqui-hire opportunities. I think the big tech companies want to own connected environments: smart buildings, smart cities, smart transportation, etc. To own the smart buildings/cities domains, they need to understand, and ultimately optimize and manage, how buildings and communities are designed, constructed, and maintained. The BIM tool is among the least interesting and least useful technologies for understanding how to design and build buildings and environments as part of smart, connected systems of systems, because the major BIM authoring tool vendors seem to view the data they capture and analyze through the traditional lenses of their industries and roles, rather than through the lens of making tools that facilitate the creation of smart, connected systems.
It would be more useful for the tech giants to buy the ten biggest AEC/EPC firms, as mentioned above, let them keep running as normal for years while gradually introducing smart-systems requirements, and mine and collect all of the data associated with design, construction, and operations, as well as monitor, collect, and analyze every email, meeting, phone call, drawing, and keystroke of the subject matter experts who design, build, and operate built-environment systems of systems. That is how you map the entire facilities/environment/operations ecosystem. Meanwhile, on the back end, machine learning would be analyzing all of that data and figuring out how to do it all better and more efficiently, in a way that would have clients clamoring for your facilities and environments, where, in turn, their data is also mined and operationalized. Occasionally, if they saw an opportunity in their data mining and analysis, they might suggest methodological improvements to the AEC/EPC firms, but mostly we would be the study subjects for their systems analyses.
Look beyond the AEC industry's inward, somewhat idiosyncratic focus. Look at the aerospace, defense, computer science, automotive, energy, telecommunications, and industrial-scale software industries. You'll find something like a convergence on a set of methods and tools for systems-of-systems development and optimization that are, for the most part, superior to those found in the AEC industry. There are reasonably standard formalisms for systems-of-systems representation, requirements elicitation/development/validation, analysis, optimization, and operations. There are well-developed methods of co-simulation between software systems, hardware systems, and hardware-software systems, as well as hardware-in-the-loop, software-in-the-loop, and human-in-the-loop co-simulation for systems-of-systems development. There are well-established paradigms for emerging project types, including cyber-physical systems, socio-technical systems, ultra-large-scale systems, multi-scale systems, and complex, large, integrated, open systems. Siemens and Dassault Systemes are keyed in on these constructs and methods, and their tools reflect it. The AEC industry is years to decades behind in recognizing and adopting this knowledge and these methods and tools.
Why does all of this matter? Sit back and look around you. Look at your computer, your monitor, your smartphone, perhaps your smart watch, your tablet, or your wearable technology; your car made in the last 25 years, which is as much computer and network data node as car; your cloud systems; your Alexa, Nest, Ring, or other home automation systems; your security and access control systems; and the transportation, distribution, and telecommunications systems of systems upon which you rely. Our digitally augmented world is designed, made, optimized, and operated through the constructs, measures, methods, and tools I am referencing above, widely utilized in all other major capital-intensive, mission-critical industries --- with the surprisingly notable exception of buildings and the built environment. And yet it is inevitable that these same constructs, measures, methods, and tools will be applied to the built environment, because:
(1) they can be --- the built environment is designable, constructable, and operable using the same methods, as are all complex, dynamical systems of systems;
(2) there is so much profit to be made and control to be gained from fully connecting the built environment that commercial and governmental actors are compelled to turn it into an extension of the rest of our digitally enhanced society; and
(3) these technologies, methods, and tools are already being applied to the built environment through the control systems and building automation systems mentioned above, so there is a natural, easy path into the AEC industry that is at least somewhat familiar to the tech giants.
If you go down the rabbit holes implied above, you'll find that, as far as systems geometry authoring and simulation platforms go, Dassault Systemes and Siemens have tools much more tailored to designing complex systems of systems than any BIM authoring tool vendor's. For example, Dassault Systemes' CATIA platform includes components for overlaying systems engineering methods and performing hardware/software/human co-simulations. Siemens' Tecnomatix line, in conjunction with its NX line, allows for robust modeling and simulation of human-environment interaction and design. The problem, though, is that these platforms are very expensive. In 2010 I received a software grant from Siemens to use their NX MCAD tool in conjunction with their Tecnomatix Process Simulate Human tools, for research on bringing systems engineering design and analysis methods, as one finds in aerospace or defense, into the AEC industry to improve its ability to execute model-based, evidence-based design. The software Siemens let me use for that project was valued in excess of $500,000. I don't know what a full suite of Dassault Systemes modeling, systems engineering, and co-simulation tools would cost, but from what I have been told and read, it would be 3-20 times what AEC professionals typically pay for BIM authoring tools. So there is a fundamental barrier to using these superior systems design and analysis tools in our industry: our industry is not set up to afford this level of modeling and analysis. Beyond this, of course, our industry would also have to start integrating data scientists, human factors engineers, systems engineers, and computer scientists as standard members of design teams. This is done to a limited extent in the heavy industrial, military, event, and healthcare subsectors of the AEC industry, but it is not widespread, and it would entail significant additional cost.
Bringing this all back to Vectorworks: one of the reasons I like Vectorworks is that it is a fundamentally sound modeling and drawing tool with great representational ability and a solid internal data management system; it handles big data sets well and is relatively inexpensive, at least in the US market, compared to its competitors. From my perspective, Rhino and Vectorworks are similar in that they offer tremendous value for the cost and are very efficient for the most basic and fundamental tasks, which we will always need to do, even if they don't offer all of the bells and whistles of the more expensive packages. Given my comments above, it should be clear that I believe the priorities and foci of the more expensive BIM authoring packages either suffer from a poor vision of, and alignment with, where our industry is headed (i.e., how to model and simulate smart, connected environments) or from an inability to deliver the features and functionality affordably. I should also say that with Vectorworks' focus and leadership on tools like Spotlight and ConnectCAD, in a low-key, low-cost kind of way, they are better prepared for a transition to the methods, tools, and workflows needed to design smart, connected environments than most, if not all, of the other BIM authoring tool vendors currently appear to be. Combine this with Vectorworks' strategic relationship with Siemens, its ability to handle big data sets, its database technology, and its generally user-friendly modeling, drawing, and graphics capabilities, and I think that if Vectorworks plays its hand well, it could leapfrog the other BIM authoring vendors and gain market dominance. But, per the comment about complexity below, there are too many factors to know what will play out or how, and I do not work for any of these companies or know what they have in their pipelines.
These are merely my observations based on my experience and what information is publicly available right now.
In case you are interested, look at the NIST Cyber-Physical Systems Testbed and start to think about how it will integrate with AEC industry BIM authoring tools and project management tools. This will give you a sense of how our industry's methods and tools have to evolve to participate in the connected future of the built environment. Or look at IES' ICL --- also heading in the right direction. Also look at Matlab/Simulink, and at how energy analysis software like OpenStudio can be integrated with Modelica to perform systems-of-systems analyses.
In case you're interested, take the concept of a digital twin as typically presented and set it aside. A geometry model as a data model is not necessarily a particularly interesting or useful data model. An equation is a data model. A story of any kind, for instance a children's book, is a data model. A spreadsheet is a data model. Building design, construction, and operations have been represented by data models since people first sketched plans in the sand with sticks, and by digital data models since plant, railroad, and utility operators in the late 19th century first mapped out electricity distribution networks with wall-sized painted diagrams that had indicator lights at key points to show the status of system functionality. The digital twin is also a somewhat useless concept because almost all, if not all, of the discussion I ever see about digital twins supposes that they are high-fidelity digital representations of actual physical and logical systems. But this supposition fails to account for system complexity, for components that may be poorly represented or missing from the model, and for the rate of change of system states (especially changes that occur outside of the designed performance envelope). Once a system gets complex enough, when all of the little assumptions, estimates, unknowns, and fudge factors start to add up and compound upon each other, and once the system experiences a series of external, unplanned-for, traumatic events (lightning strike, earthquake, flood, hurricane, tornado, fire, systematic and chronic misuse, etc.), the system's behavior and structure become somewhat unpredictable, either temporarily or permanently. This is a mathematical and scientific truth.
Complex, dynamical systems (i.e., complex systems whose states change over time) are only modelable and predictable when the system occupies something like a natural harmonic frequency or an equilibrium state (one that may be far from maximum entropy, i.e., highly structured). When the system falls out of such an equilibrium state or harmonic frequency, its behavior can become fundamentally chaotic and unpredictable. Once the model loses fidelity, structure, and stability, how do we get it back in sync with reality? Maybe it's easy. Maybe it's impossible. Maybe getting it back in sync can be approximated, but at the cost of a further loss of fidelity and an increased likelihood of wandering into chaotic states in the future. The buildings we design are complex, dynamical systems. They wander in and out of states of structured, stable, predictable behavior and chaotic, unpredictable behavior throughout their lifecycles. A digital twin can be made, and it may even be mostly accurate much of the time, especially if system states, variables, and contextual conditions are kept within limited performance envelopes, but digital twins can never be completely accurate or stable representations of actual building performance, and they cannot always accurately predict conditions and future states of a building's assets. There's a great book, Nassim Taleb's The Black Swan, that explains this limit on predicting complexity using a pool table analogy. Here's the key quote: "If you know a set of basic parameters concerning the ball at rest, you can compute the resistance of the table (quite elementary), and can gauge the strength of the impact, then it is rather easy to predict what would happen at the first hit. The second impact becomes more complicated, but possible; and more precision is called for.

The problem is that to correctly compute the ninth impact, you need to take into account the gravitational pull of someone standing next to the table (modestly, Berry's computations use a weight of less than 150 pounds). And to compute the fifty-sixth impact, every single elementary particle in the universe needs to be present in your assumptions! An electron at the edge of the universe, separated from us by 10 billion light-years, must figure in the calculations, since it exerts a meaningful effect on the outcome." (p. 178) If this is true when trying to predict the paths of balls on a pool table, how much more true is it of the way that buildings (and by extension digital twins) function!
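The compounding sensitivity described in that quote is easy to demonstrate numerically. Here is a minimal sketch, using the logistic map (a standard textbook toy model of chaos, not billiard physics): two simulations of the "same" system that agree to ten decimal places at the start end up making completely different predictions within a few dozen steps.

```python
# Sensitivity to initial conditions, demonstrated with the logistic map
# x_{n+1} = r * x_n * (1 - x_n); r = 4 puts it in the fully chaotic
# regime. Two "twins" that agree to ten decimal places soon diverge:
# the gap grows roughly geometrically until it saturates at order one.

def logistic_trajectory(x0, r=4.0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

real = logistic_trajectory(0.2)            # the "actual" system
twin = logistic_trajectory(0.2 + 1e-10)    # a model off by one part in 10^10

for n in (0, 10, 30, 50):
    print(f"step {n:2d}: real={real[n]:.6f} twin={twin[n]:.6f} "
          f"error={abs(real[n] - twin[n]):.2e}")
```

This is the same mechanism as the ninth-impact problem on the pool table: every unmeasured influence compounds, so no fixed measurement precision keeps the model faithful indefinitely.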
In case you're interested, all is not lost for architects, engineers, and contractors. There is a well-established way in which AEC professionals maintain relevance, and it is rooted in our history. Over time, AEC professionals figured out something fundamental about how to design, construct, and operate complex, dynamical systems: rather than trying to model and manage the complexity and dynamism, eliminate it instead. Make the building as simple as possible, but no simpler. AEC professionals figured out something else, too, as codified in Donald Schön's great book, The Reflective Practitioner. AEC professionals design systems of systems unlike any other industry, and there is a wisdom in our creative madness. What Schön calls reflective practice is like an early, bespoke version of agent-based modeling that we naturally developed in the AEC industry. The idea is that, rather than trying to capture and manage all of the possible variables and permutations of a system of systems, you start with a few big, simply defined concepts, get them aligned and working as something like a cohesive whole by imagining and playing out what-if scenarios, and then let the rest fall into place through iterative refinement, even though you may never be fully aware of, or fully understand, the complete system of systems. This practice will remain valid, and it is very powerful, efficient, and effective. There was a study out of CIFE about eight years ago that compared a group of seasoned AEC veterans scoping a project to a machine learning algorithm scoping the same project. The machine learning algorithm produced better results than the team of human experts, but its results were not that much better, and it ran for many more hours than the SMEs spent conceiving of the project.
The point is that while the smart tech produced a superior result overall, the humans were surprisingly competitive and efficient: they produced a similar result while spending far fewer hours analyzing the project's needs and logistics. So again, there is a place for AEC SMEs. We are not irrelevant now, nor will we be for a long time, if ever. Rather, the best outcome is when that deep, efficient human expertise is supercharged with just enough machine learning.
In case you are interested, also look into SEPS2BIM and BIMStorm. These are great efforts. They have worked on BIM-authoring-tool-agnostic building system and component representations that can be loaded into any BIM authoring tool. With SEPS2BIM, they have also geotagged all components, so that everything is listed in a database as existing at a specific location in the world.