Hyperparameter tuning explained – how machines learn better for cities

Building design

Busy city street between tall buildings, taken by Bin White in Germany

Hyperparameter tuning – sounds like Silicon Valley jargon, but it has long been the backbone of all modern, data-driven urban development. Anyone who lets machines learn for the city today is helping to decide how intelligent, resilient and liveable urban spaces will be in the future. But how does this mysterious “fine-tuning” actually work? Who understands the levers in the machine room of algorithms – and what does this mean for the practice of urban planners, landscape architects and decision-makers? Welcome to a journey of discovery through the world of hyperparameter tuning: profound, provocative and very close to the reality of the city of tomorrow.

  • Definition and meaning of hyperparameter tuning in the context of urban planning and smart cities
  • Technical background: How machine learning and AI are used in urban systems
  • The most important hyperparameters – their function and effect on urban algorithms
  • Practical application examples: Traffic control, climate resilience, land use and more
  • Risks and challenges: Bias, transparency, commercialization and governance
  • Strategies and methods for professional hyperparameter tuning in urban planning
  • The role of open source, collaboration and data-driven participation
  • Legal, ethical and cultural aspects for Germany, Austria and Switzerland
  • Outlook for the future: How hyperparameter tuning can accelerate urban transformation

Machines learn cities – hyperparameters as the secret architects of urban intelligence

Anyone who walks through the world of urban planning today with open eyes will encounter terms such as “artificial intelligence”, “machine learning” or “smart city” at almost every turn. Behind all the hype, however, lies a technical reality that is far less glamorous, but all the more crucial: hyperparameter tuning. But what is actually behind this term, which is almost reverently whispered in AI circles? Hyperparameters are the settings that developers fix before an algorithm even starts learning. They determine how deep a neural network is, how quickly it learns, how many decision trees a random forest model can grow or how strongly an algorithm is allowed to “overfit”. Finding the right values gives the AI system the ability to build a robust, high-performance and, above all, practical model from the city’s raw data.
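
What this distinction looks like in code can be sketched in a few lines. The following toy example – invented data, no real traffic feed – fits a simple linear model by gradient descent: the learning rate and the number of epochs are hyperparameters fixed up front, while the slope and intercept are the parameters the machine learns.

```python
# Minimal sketch: hyperparameters are fixed BEFORE training; parameters are learned.
# All names and values here are illustrative, not taken from any real city model.

def train_linear_model(xs, ys, learning_rate, epochs):
    """Fit y = w * x + b by gradient descent.

    learning_rate and epochs are hyperparameters (chosen up front);
    w and b are the model parameters (learned from the data).
    """
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of the mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b
    return w, b

# Toy data: traffic volume roughly doubling with a made-up demand index
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]

w, b = train_linear_model(xs, ys, learning_rate=0.05, epochs=2000)
```

Change the two hyperparameters and the same data yields a very different training run – which is exactly the lever the text describes.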

This is highly relevant in urban planning practice. Let’s take the example of traffic forecasting: a learning algorithm is supposed to predict how traffic will develop in a neighborhood after the opening of a new subway line. The amount of data is enormous, the dynamics are high and the interactions are complex. A poorly tuned algorithm either produces trivial, irrelevant results – or it is so complex that it “memorizes” every error from the training data and fails miserably in real life. If you don’t have the hyperparameters under control, you are building castles in the air instead of the city of the future.

But hyperparameter tuning is more than just technical fine-tuning. It is a kind of art of translation between the “what” of planning and the “how” of the machine. Because every urban issue – be it mobility, climate resilience, energy efficiency or citizen participation – requires its own customized model architecture. And this, in turn, is significantly influenced by the hyperparameters. These inconspicuous variables thus become the secret conductors of urban transformation: they determine how sensitively a model reacts to new data, how flexibly it responds to anomalies and how stable it remains despite chaotic realities.

The complexity increases further as soon as several models compete with each other – or even cooperate. In many modern urban platforms, dozens, sometimes hundreds of AI models run in parallel: one for traffic light control, one for air quality forecasting, one for energy demand analysis. Each one has its own hyperparameters, its own optimization goals. Only the intelligent interaction of these systems turns a data city into a learning, resilient metropolis. Hyperparameters thus become the invisible infrastructure of urban intelligence.

Any planner, architect or decision-maker wondering how much influence they have on the city’s “engine room” should not leave hyperparameter tuning to the data scientists alone. Because only those who understand the principles can ask the right questions, formulate the right requirements – and ultimately prevent urban reality from disappearing into the shallows of poorly tuned algorithms.

Conclusion of this first insight: Hyperparameter tuning is not a niche topic for AI geeks, but a central tool for the management, control and further development of digital cities. Anyone who understands the language of parameters speaks the language of the city of the future.

The most important hyperparameters and their influence on urban models

What exactly are these much-cited hyperparameters? In the world of machine learning, they are the settings that are defined before the training process begins – and therefore play a decisive role in determining the success or failure of a model. Classic hyperparameters include, for example, the learning rate, the number of layers and neurons in an artificial neural network, the choice of activation functions, the batch size or, in the case of decision tree models, the maximum depth and number of trees. All of this sounds like mathematical precision work at first, but in the context of urban planning, there is a tangible, often politically explosive mechanism of action behind every parameter.

The learning rate, for example, determines how quickly or cautiously a model reacts to errors and adapts. If it is too high, the system “overshoots”, runs the risk of falling into chaotic patterns and overlooking stable trends. If it is too low, the model remains sluggish and hardly recognizes new dynamics – such as sudden changes in mobility behavior after a pandemic – or only with a long delay. In practice, this means that the learning rate is more than just a mathematical quantity; it is a lever for the city administration’s ability to innovate and react quickly.
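
A minimal, deliberately artificial demonstration of this lever: gradient descent on a toy loss whose minimum sits at w = 3. A well-chosen rate converges, a tiny one crawls, an excessive one diverges – the numbers are invented, the behavior is generic.

```python
# Sketch of the learning-rate trade-off on a toy loss L(w) = (w - 3)**2.

def descend(learning_rate, steps=50, w=0.0):
    """Plain gradient descent on L(w) = (w - 3)**2; the gradient is 2*(w - 3)."""
    for _ in range(steps):
        w -= learning_rate * 2 * (w - 3)
    return w

w_good = descend(0.1)    # converges toward the optimum at 3
w_slow = descend(0.001)  # moves in the right direction, but barely, in the same 50 steps
w_high = descend(1.1)    # overshoots: each step multiplies the error by -1.2, so it diverges
```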

The model capacity, i.e. the complexity of the selected AI model, is just as crucial. A model that is too small cannot map the diversity of urban processes in the first place, while one that is too large overinterprets random events and produces pseudo-precision. The trick is to find the right balance – and this is ultimately a negotiation process between specialist knowledge, data quality and political will. Anyone who relies too much on standard values or black box optimizers risks making mistakes that go far beyond the scope of technology.

Another example is regularization. This can be used to control how strongly the model is “penalized” if it clings too closely to the training data. This is highly relevant in urban planning because urban systems are rarely static: a model that fits the last five years perfectly is often blind to disruptive changes – such as new forms of mobility, unexpected climate phenomena or the sudden emergence of digital business models in public spaces. Those who adjust their hyperparameters wisely ensure that models remain robust and flexible.
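
How such a penalty “shrinks” a model can be shown with the one-dimensional closed-form ridge solution – a sketch with invented numbers, not a real urban dataset:

```python
# Sketch: L2 regularization shrinks coefficients toward zero, trading a perfect
# fit to past data for robustness. One feature, closed-form ridge solution;
# the data points and alpha values are made up for illustration.

def ridge_weight(xs, ys, alpha):
    """Closed-form ridge solution for y ≈ w * x (no intercept):
    w = sum(x*y) / (sum(x*x) + alpha). alpha = 0 gives ordinary least squares."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + alpha)

xs = [1.0, 2.0, 3.0]
ys = [1.2, 1.9, 3.3]

w_ols = ridge_weight(xs, ys, alpha=0.0)  # fits the past data as closely as possible
w_reg = ridge_weight(xs, ys, alpha=5.0)  # "penalized": a smaller, more conservative weight
```

The larger alpha is, the more cautious the model becomes – which is precisely the robustness-versus-fit negotiation described above.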

The size of the training batches – i.e. how many training examples are processed in one update step – also plays a significant role. Large batches lead to stable but less flexible models, while small batches lead to fast but often noisy learning processes. In urban planning, different weightings must therefore be applied depending on the application: different settings make sense for long-term traffic forecasting than for the short-term control of emergency measures in the event of heavy rainfall.
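
The mechanics behind this are easy to sketch: the batch size determines how the data is chopped into chunks, and therefore how many (smoother or noisier) gradient updates one pass over the data produces. A minimal illustration with made-up “sensor readings”:

```python
# Sketch: the batch size decides how many examples feed one gradient update.
# The "records" below are a stand-in for real sensor data.

def batches(data, batch_size):
    """Split a dataset into consecutive mini-batches."""
    return [data[i:i + batch_size] for i in range(0, len(data), batch_size)]

records = list(range(1000))    # stand-in for 1,000 sensor readings

large = batches(records, 500)  # 2 updates per epoch: stable, heavily averaged gradients
small = batches(records, 10)   # 100 updates per epoch: fast but noisy learning
```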

Finally, the importance of hyperparameters for the governance of urban systems should be emphasized. If you know the parameters, you can exert targeted influence, create transparency and retain control over the urban AI models. Those who ignore them delegate decision-making power to opaque algorithms – with all the risks for participation, fairness and acceptance in urban society.

Hyperparameter tuning methods: from manual labor to autonomous machines

But how do you find the optimal hyperparameters for a specific urban problem? In practice, there are three major schools of hyperparameter tuning: manual tuning, automated search and the increasingly popular “AutoML” world, in which machines optimize their own parameters. Each method has its strengths, weaknesses and specific fields of application – and each requires a certain level of expertise on the part of planners and decision-makers.

With manual tuning, developers set the hyperparameters based on experience, intuition and technical expertise. This is often tedious, but necessary, especially in sensitive urban planning contexts: If you know that a certain traffic regulation only makes sense at low occupancy rates, you can specifically tune the model for these cases. The disadvantage: manual tuning is time-consuming, error-prone and scales poorly when complexity increases.

Automated search methods such as Grid Search or Random Search are more efficient. Grid Search systematically tests all combinations of predefined values, Random Search tries out randomly selected settings. Both methods can be easily parallelized and deliver acceptable results for many standard problems. In urban practice, however, they are often too slow, especially when the number of hyperparameters – and thus the dimension of the search space – explodes. This is where classic automation quickly reaches its limits.
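Both search strategies fit into a few lines of Python. The “score” below is a toy stand-in for a real cross-validated model evaluation, and all value ranges are invented for illustration:

```python
# Sketch of grid search vs. random search over two hyperparameters.
import itertools
import random

def score(learning_rate, depth):
    """Toy objective with a known best at learning_rate=0.1, depth=6 (higher is better).
    In a real project this would train and validate an actual model."""
    return -((learning_rate - 0.1) ** 2 + 0.01 * (depth - 6) ** 2)

grid = {
    "learning_rate": [0.01, 0.05, 0.1, 0.5],
    "depth": [2, 4, 6, 8],
}

# Grid search: systematically try every combination of the predefined values
grid_results = [
    ((lr, d), score(lr, d))
    for lr, d in itertools.product(grid["learning_rate"], grid["depth"])
]
best_grid = max(grid_results, key=lambda item: item[1])[0]

# Random search: the same budget of trials, but sampled from continuous ranges
random.seed(0)
random_results = [
    ((lr, d), score(lr, d))
    for lr, d in ((random.uniform(0.01, 0.5), random.randint(2, 8)) for _ in range(16))
]
best_random = max(random_results, key=lambda item: item[1])[0]
```

Both loops parallelize trivially – and both suffer exactly the dimensionality explosion the text describes once more hyperparameters join the search.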

Things get exciting with modern AutoML methods. Here, intelligent algorithms take over the search for optimal hyperparameters by evaluating the performance of different models and independently developing the best settings. Techniques such as Bayesian optimization, genetic algorithms or neural architecture search are becoming increasingly popular – also in urban data analysis. Their advantage: they often deliver surprisingly good solutions, even in complex, data-poor or highly dynamic environments. Their disadvantage: they are more difficult to control and explain, which quickly becomes a problem in the context of urban governance.
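One family of such methods – evolutionary search – can at least be caricatured in miniature. The toy genetic algorithm below “breeds” hyperparameter settings against an invented fitness function; a real AutoML system would evaluate actual models instead, and Bayesian optimizers work differently (by modeling the objective). Every constant here is an illustrative assumption.

```python
import random

def fitness(cfg):
    """Pretend validation score, best at learning_rate=0.1, layers=4 (higher is better)."""
    lr, layers = cfg
    return -((lr - 0.1) ** 2 + 0.05 * (layers - 4) ** 2)

def mutate(cfg):
    """Nudge a configuration: jitter the learning rate, step the layer count."""
    lr, layers = cfg
    return (
        min(max(lr + random.gauss(0, 0.02), 0.001), 1.0),
        min(max(layers + random.choice([-1, 0, 1]), 1), 10),
    )

def evolve(generations=30, population_size=10, seed=42):
    random.seed(seed)
    population = [
        (random.uniform(0.001, 1.0), random.randint(1, 10))
        for _ in range(population_size)
    ]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: population_size // 2]
        # Refill the population with mutated copies of the best settings;
        # the incumbent best always survives, so quality never regresses.
        population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(population, key=fitness)

best_lr, best_layers = evolve()
```

The explainability problem is visible even in this toy: the result is good, but *why* this particular lineage of mutations won is hard to reconstruct – at city scale, that is the governance issue.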

For urban planning, this means that the choice of tuning method is not a purely technical question, but part of strategic management. In safety-critical or politically sensitive areas, a high degree of transparency and manual control is still recommended, whereas in explorative innovation projects, machines can enjoy a little more autonomy. Ultimately, the balance between efficiency, traceability and participation determines the success of tuning.

One particular aspect is the collaboration between data scientists and technical experts from planning, architecture and administration. Only when both sides speak the language of hyperparameters can models be created that are not only technically brilliant, but also professionally relevant and socially accepted. Hyperparameter tuning thus becomes an interdisciplinary tour de force – and an opportunity to actively shape digital urban development.

Application examples: Hyperparameter tuning as the key to the smart city

So what does all this mean in practice in German, Austrian and Swiss cities? The answer is as diverse as the challenges facing urban spaces. One key use case is traffic control: cities such as Munich and Zurich are experimenting with traffic light control systems that learn and use hyperparameter-optimized algorithms to adjust traffic flow in real time. Here, the adjustment of the learning rate and model depth determines whether the algorithm reacts flexibly to short-term disruptions – such as a major event or roadworks – or whether it is based on long-term trends.

Another field is climate resilience. In Hamburg, AI models are used to predict heavy rainfall events and to optimize rainwater management. The choice of regularization parameters determines how robust the model remains in the face of rare but extreme weather conditions. Regularization that is too weak lets the model chase noise and produce false alarms, while regularization that is too strong smooths away rare extremes and overlooks real risks – both can have fatal consequences in reality.

Hyperparameter tuning is also becoming increasingly important in the area of land use and neighborhood development. Digital twins of urban spaces, such as those used in Vienna or Rotterdam, require finely tuned models in order to realistically simulate various scenarios – from population growth to the conversion of brownfield sites. Batch size is crucial here: large batches increase the stability of long-term forecasts, while small batches allow short-term trends to be better captured.

The role of hyperparameter tuning in public participation should not be underestimated. Platforms that analyze feedback, wishes and usage data in real time must tune their models to recognize both majorities and minority interests. The choice of activation functions and the depth of the models influence how sensitively the system reacts to new opinions and moods. If you proceed too roughly here, you produce a façade of participation instead of the genuine article.

Finally, the significance for urban energy and resource management should be pointed out. In more and more cities, AI models are being used to forecast power peaks, optimize the charging infrastructure for e-mobility or control district heating and cooling networks. Here, too, careful hyperparameter tuning is needed to create models that are both efficient and resilient – and thus provide the basis for sustainable, future-proof urban development.

The examples show: Hyperparameter tuning is no longer an abstract special topic, but a practical lever for improving urban quality of life. Those who exploit this potential will gain a decisive advantage in the international competition between cities.

Risks, governance and the future: hyperparameter tuning as a question of urban destiny

Of course, not all that glitters in the AI engine room is gold. Hyperparameter tuning in particular harbors risks that go far beyond technical malfunctions. One key problem is the risk of algorithmic bias. Incorrectly set models reproduce existing inequalities, give preference to certain neighborhoods or means of transport and thus exacerbate social or ecological imbalances. Those who fail to address these risks at an early stage risk not only poor forecasts, but also a massive loss of trust in digital urban planning.

Another problem is the lack of traceability. The more complex the hyperparameter tuning, the more difficult it becomes to explain how the models work – for planners, decision-makers and the public alike. Black box systems may be technically efficient, but they undermine the acceptance of new technologies and make democratic control more difficult. Transparency, open source solutions and open interfaces are needed here to avoid losing control of urban algorithms to commercial providers.

The risk of commercialization should not be underestimated either. Many modern hyperparameter optimizers are offered as proprietary cloud services whose internal logic remains opaque even to experts. If a city becomes dependent on large providers, it is giving up a key lever for controlling urban systems. The alternative: building up your own data expertise, promoting open source initiatives and working closely with universities and civil society actors.

Legal and ethical issues come to the fore as soon as AI models with tuned hyperparameters intervene in public decision-making processes. Who is liable if a poorly adjusted model delivers an incorrect prediction? How can we ensure that models work without discrimination? And how can citizens be involved in the tuning process without being put off by technical jargon? New governance structures, clear responsibilities and a culture of critical reflection are needed here.

Nevertheless, the opportunities outweigh the risks: hyperparameter tuning is the key to adaptive, resilient and participatory cities. Those who are aware of the risks and actively manage them can use the digital transformation to strengthen urban quality of life, sustainability and democracy. The future belongs to cities that do not dismiss the tuning of machines as a purely technical issue, but see it as a strategic management task.

Conclusion: Hyperparameter tuning – the invisible backbone of the digital city

Hyperparameter tuning is far more than a technical niche topic. It is the invisible backbone of modern, data-driven urban development. Anyone who understands and masters the levers in the engine room of algorithms can build urban models that are not only precise and efficient, but also transparent, fair and socially acceptable. The challenges are great: the spectrum of open issues ranges from bias and lack of transparency to commercialization and ethical and legal questions. But the opportunities are even greater: intelligent, adaptive cities that expand their data competence and actively involve their citizens will gain a decisive advantage in the competition for quality of life, sustainability and innovative strength.

For planners, architects, administrators and politicians, this means that hyperparameter tuning belongs on the agenda of every future-oriented urban development. It is not a black box, but a design task – and an invitation to actively shape the digital transformation. Anyone who sets out now to learn the language of machines will be able to build the city of the day after tomorrow. And who knows – perhaps the greatest potential for innovation in the city of tomorrow is not the technology itself, but our courage to manage it intelligently and responsibly.

YOU MIGHT ALSO BE INTERESTED IN

Interior exhibition “neue räume” (“new spaces”)


The international interior exhibition “neue räume” invites you to Zurich for the tenth time.

From 14 to 17 November 2019, the “neue räume” design trade fair will take place in Zurich’s ABB Hall on an area of around 8,000 square meters. An exciting program, inspiring special shows and over 100 Swiss and international exhibitors from the worlds of interior and design will be on display for four days. The trade fair will once again be a meeting place for the design scene and design enthusiasts.

Every two years, the show provides information on numerous new products as well as current and upcoming living trends. Special program items open up unusual design worlds: For example, the progressive production “Hands On” by the Zurich University of the Arts shows the aesthetic and functional design of prostheses and takes a controversial look at social design ideals. Culinary creations also take a literal look at design and think outside the box.

Interior exhibition “neue räume” (“new spaces”)
Duration: November 14 to November 17, 2019
Thursday and Friday: 12 noon to 9 pm
Saturday: 10 am to 9 pm, Sunday: 10 am to 6 pm
ABB Event Hall 550 in Zurich-Oerlikon
Ricarda-Huch-Strasse 150
8050 Zurich, Switzerland

Robotic architectural assembly in real time


Modern white concrete building in daylight in Freiburg, photographed by Ilona Frey

Robots in construction? It sounds like science fiction, but it has long since become reality – at least where people dare to do more than the next BIM workshop. Robotic architectural assembly in real time promises nothing less than a revolution in construction practice: faster processes, more precise results, radical sustainability. But what is hype, what is substance? And how far along is the German-speaking world really when algorithms, sensors and mechatronic gripper arms take over the construction site?

  • Robotic architectural assembly in real time is changing the entire construction value chain – from planning to operation.
  • Germany, Austria and Switzerland are experimenting with initial pilot projects, but widespread implementation is still in its infancy.
  • Core technologies: AI-based control, digitalized production, adaptive sensor technology and human-machine interaction.
  • Sustainability by design: robots enable material-optimized, circular and resource-efficient construction methods.
  • Technical expertise – from parametric design to software integration – is becoming a basic requirement for architects and engineers.
  • Digital real-time assembly is challenging the traditional job description and shifting the boundaries between planning, execution and operation.
  • Debates about job losses, loss of control and ethical responsibility are shaping the discussion.
  • Vision: robots as partners in the design process – and as a catalyst for a new building culture.
  • Risks: technocratic bias, complex liability issues, new dependencies on software and platforms.
  • Global role models in Asia and Scandinavia are setting standards, while German-speaking countries are mainly struggling with regulatory hurdles.

From the digital vision to the real construction site: Where we stand

Robotic architectural assembly in real time is the new gold fever in the construction industry. Anyone who thinks this is about a bit of drone flying on large construction sites has missed the point. It’s about the complete integration of digital design data, parametric planning, robotics and automated production – right through to assembly on the construction site or directly in the urban space. Germany, Austria and Switzerland have taken the first steps: research projects, pilot construction sites, collaborations between start-ups, universities and established construction companies. But the reality? It is fragmented, full of prototypes and still a long way from widespread implementation. While ETH Zurich is demonstrating architectural assembly on a 1:1 scale with DFAB House and the Robotic Fabrication Laboratory, in Munich, Frankfurt and Graz many things are still in test mode. The reasons are well known: high investment costs, a lack of interfaces between software and hardware, and a planning law that slows down innovation rather than spurring it on.

But if you take a closer look, you will discover an astonishing dynamic. At technical universities, robotic arms are maturing that stack brickwork more precisely than any bricklayer, while autonomous assembly platforms are making their rounds on the construction sites of the first modular timber houses in Switzerland. In Vienna, façade elements are measured digitally, optimized in real time and then assembled by machines with millimetre precision – all under the watchful eye of AI. The construction site is becoming networked, a data platform, a stage for sensors and actuators. But the leap from demo to series production remains risky. After all, the construction industry is tough, the regulatory jungle is dense and the fear of losing control is deeply rooted.

What is lacking is not the vision, but the scaling. To date, most robotic assembly processes are one-offs – tailor-made for a lighthouse project, but not for day-to-day construction business. Investors are hesitant because amortization and maintenance costs are uncertain. Construction companies fear the complexity of new processes and the conversion of traditional trades. And for architects, the move to real-time assembly means they have to say goodbye to old habits. If you want to continue thinking in 2D plans, you can leave the robot at home.

Nevertheless, German-speaking countries are by no means lagging behind. The region is often a leader in basic research, but cautious when it comes to application. At ETH Zurich, Switzerland demonstrates how robots not only assemble modules, but also open up architecture to new forms and materials. Germany scores with a lively start-up scene that is testing everything from adaptive formwork to automated concrete printing. And Austria? It is focusing on linking digital timber construction and modular prefabrication. But the big question remains: when will the prototype become the new standard?

The most important insight: robotic assembly in real time is not an end in itself. It is part of a fundamental paradigm shift that is rethinking construction. Those who wait until the technology is “ready” will be overtaken – by those who are already prepared to make mistakes and learn from them.

Technology, AI and data: The new DNA of architectural assembly

The technological basis of robotic architectural assembly reads like a who’s who of the digital revolution: parametric design software, algorithmic design, building information modeling, AI-supported process control, machine-to-machine communication and an army of sensors, cameras and actuators. Without this infrastructure, the robot remains an expensive toy. With it, it becomes an extension of the design. It all starts with an intelligent data model. Anyone still working with static plans today has lost out in the digital assembly process. Planning must be able to react to changes in real time – be it due to changes in construction site conditions, material deviations or optimized production routes.

AI plays a key role here. It not only controls the robot’s movements, but also learns from every mistake, adapts to new situations and can even make its own suggestions for optimization. The interaction between man and machine is becoming a new discipline. The architect becomes a data curator, the engineer a process designer, the site manager a system integrator. The construction site is becoming a hybrid arena in which software and hardware interact symbiotically. And if the robot suddenly places a screw incorrectly, the system reports the error in real time – including a suggested correction, of course.
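
Such a feedback loop can be caricatured in a few lines. Everything here – function names, tolerances, coordinates – is invented for illustration; real systems work against the digital building model and certified tolerance classes.

```python
# Hypothetical sketch of the real-time feedback loop described above: a placed
# component is measured, compared against the planned position from the digital
# model, and a correction is proposed when the tolerance is exceeded.

def check_placement(planned, measured, tolerance=0.002):
    """Compare planned vs. measured position (in metres) and report a correction."""
    deviation = tuple(m - p for p, m in zip(planned, measured))
    if all(abs(d) <= tolerance for d in deviation):
        return {"status": "ok", "deviation": deviation}
    # Suggested correction: move the component back by the measured deviation
    correction = tuple(-d for d in deviation)
    return {"status": "error", "deviation": deviation, "correction": correction}

report = check_placement(planned=(1.000, 2.000, 0.500),
                         measured=(1.004, 2.000, 0.500))
```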

What does this mean for training? If you want to succeed in this field, you need more than just creative talent. Basic algorithmic knowledge, software expertise, an understanding of sensors, actuators and how AI systems work are mandatory. The industry is no longer looking for pure designers, but “techno-architects” with a digital mindset. Those who refuse to do so will lose out. The new tools are complex, the interfaces are numerous and the workflow is a permanent beta test. But the learning effect is huge – and those who make use of it will come out on top.

The big challenge: interoperability and standardization. Every construction site, every project, every robot system has its own data formats, protocols and interfaces. Anyone who does not fight for open standards here is building a digital prison. The platform question becomes a question of power. Does the data belong to the robot manufacturer, the client or the planning office? The field is still open – but experience from other industries shows: Whoever controls the platform controls the market.

The technological revolution comes with new risks. What if the AI makes the wrong decisions? Who is liable in the event of incorrect assembly due to software errors? And how can we prevent the robot from becoming a Trojan horse that forwards sensitive project data to the highest bidder? The industry urgently needs clear rules, certifications and an ethical framework for machine-driven construction. All this is only just beginning – but without these standards, robotic architectural assembly remains a risky adventure.

Sustainability and resource efficiency: robots as climate savers or energy wasters?

The great hope of robotic assembly: more sustainability through precision, material optimization and circular processes. But is it really that simple? At first glance, yes. Robots are incorruptible. They assemble exactly the amount of material that the algorithm specifies – no more and no less. They work around the clock, avoid errors, minimize waste and enable designs that would be almost impossible to achieve by hand. Material efficiency becomes the standard, not the exception. Those who plan parametrically can optimize the use of concrete, steel or wood down to the last gram. And in production? Less waste, less rework, fewer emissions.

But the devil is in the detail. Robots need energy – and not just a little. The production halls for prefabricated modules are energy-intensive. Developing the software, training the AI, maintaining the systems: all of this costs resources. Anyone relying on the brave new world of robots should take a close look at where the electricity comes from. Renewable energies are mandatory, otherwise the climate savior will quickly become a CO₂ guzzler. What’s more: not every robotic solution is automatically more sustainable than an experienced craftsman. The system boundaries must be examined again and again.

Another promise: Circularity. Robots can not only erect buildings, but also dismantle them – separating components by type, preparing them for recycling and returning them to the material cycle. That sounds like a circular economy at the touch of a button. In practice, however, the challenges are enormous: the construction products must be digitally traceable, the connections detachable and the documentation complete. So far, such projects have been isolated cases, but the direction is right. Those who plan modularly and digitally today are laying the foundations for architecture that can be dismantled. And the robot? It becomes a helper in urban mining.

The sustainability balance is ultimately decided in detail. If you look at the entire life cycle, you will see that robotic assembly can massively improve the environmental balance – provided the electricity mix is right, the processes are truly optimized and the designs exploit the potential of the technology. Otherwise, the green coating remains a mere facade.

Despite all the doubts, the opportunity is there. If German-speaking countries invest boldly now, set standards and establish sustainability as a guiding principle, robotic architectural assembly could actually become a lever for the ecological transformation of the industry. But only then.

Job description, debates and visions: What remains of the architect when the robot builds?

Robotic real-time assembly is an attack on the traditional job description. The architect as the lone genius designer, the planner as the master of the construction process: this image is passé. The new heroes are collaborators, system integrators and data managers. The design is no longer created on the drawing board, but in the parametric model. The execution? An interplay between man, machine and algorithm. This creates enthusiasm – and fear. What will remain of the trade when the robot builds the wall? Who still needs site managers when the AI optimizes the assembly plan? And who is responsible when the construction site becomes a black box?

The debate is heated. Some celebrate “Construction Industry 4.0” as a liberating blow: fewer errors, more efficiency, more creativity thanks to new tools. Others see a loss of control, warn of job losses and growing dependence on tech companies. As always, the truth lies somewhere in between. One thing is clear: the role of the architect is changing radically. Those who embrace the new technology can recombine design power and process knowledge. Those who stick to old routines will be overtaken. The professional associations are reacting hesitantly, the universities are experimenting. And the construction industry? It is desperately looking for talented people who can master the balancing act between design and technology.

Visionaries are already dreaming of complete integration: the robot becomes a partner in the design process. It provides feedback, suggests alternatives, responds to user requests and simulates sustainability scenarios. The construction site becomes a digital laboratory, the architect the conductor of an orchestra of machines and algorithms. The reality is still a long way off – but the direction is clear. The big questions are structural: Who sets the standards? Who controls the data? And how can building culture remain diverse if robots set the pace?

Internationally, German-speaking countries are once again both onlookers and pioneers. In Asia, robotic skyscrapers are being built at record speed, while start-ups in Scandinavia are focusing on fully automated wooden modules. In Germany, Austria and Switzerland, the risks are being thoroughly examined – but the best ideas are often developed in niches. The global architecture scene is eagerly awaiting the first lighthouse projects, but is also asking: can these countries do more than just research and pilot projects?

The paradigm shift is unstoppable. Those who shape it constructively can shape the future. Those who sleep through it will become subcontractors of the platform economy. The choice lies with the industry – and with each individual planner.

Conclusion: Robots, data, courage – and the future of building culture

Robotic architectural assembly in real time is not a trend for feature pages and innovation summits. It is a disruptive tool that will fundamentally change architectural practice, the construction industry and urban development. The technology is there, the pilot projects have been launched. What is missing is the broad courage to implement it, the will to standardize and the willingness to cut off old habits. Sustainability, efficiency and precision are not promises, but requirements. The construction site of the future is digital, networked – and full of data. Architects, engineers and builders who take the plunge today can become pioneers of a new building culture tomorrow. Anyone who hesitates will be overtaken by algorithms and robots. Welcome to the age of real-time assembly. It’s no longer just about building – it’s about building, measuring, optimizing and building again. And all this faster, more precisely and more sustainably than ever before.