‘Digital Twin’ May Be Essential Tool for Building Optimization

The 2018 market for smart buildings is estimated at $50 billion, according to Research and Markets. That growth is being driven by numerous factors, including the falling prices of Internet of Things (IoT) devices, a new generation of facilities managers, regional mandates, and more. As such, facilities managers will continue to seek advanced new solutions that let them work more strategically and efficiently as they balance building performance and occupant comfort.

An essential tool for these new solutions is the Digital Twin.

 

Digitization of Physical Assets

I propose that instead of thinking of a digital twin as a construct – a pure digitization of a building, from materials to behavior – we think of it as a process: one that begins with a model capable of learning.

 

Digital Twin as a Process

In our case, the model we create follows the Wikipedia definition pretty explicitly – we start with historical information about the building's energy, weather, and HVAC control system; then we add tariff information and occupancy. These form the basis of a mathematical model, or projection of the energy profile, of how a building will consume energy at the zone level. We take that model and apply an optimization engine that guides the HVAC system to idealized zone temperature, supply air temperature, and system pressure for any given time of day and weather condition (e.g., temperature and humidity), with both energy cost and comfort as overriding parameters. The HVAC system uses this guidance for control – deployed from the cloud but applied locally – to minimize energy costs by floating the temperature between the lowest and highest allowable setpoints during costly parts of the day. Finally, our model uses each night's actuals to learn and renew the optimization.
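To make the loop concrete, here is a minimal sketch of the "learn from actuals, then optimize the setpoint" cycle described above. Everything here is hypothetical and heavily simplified – the synthetic data, the linear energy model, the `TARIFF` table, and the comfort band are all our own illustration, not the production system, which models many more variables per zone.

```python
# Illustrative sketch of a nightly "learn, then optimize" loop.
# All names (predicted_energy, TARIFF, the comfort band) are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# --- 1. Historical actuals: hour of day, outdoor temp (C), zone setpoint (C) ---
hours = rng.integers(0, 24, 500)
outdoor = 15 + 10 * np.sin(np.pi * hours / 24) + rng.normal(0, 1, 500)
setpoint = rng.uniform(21, 26, 500)
# Synthetic cooling energy: grows with outdoor temp, shrinks with setpoint
energy = 2.0 * np.clip(outdoor - setpoint, 0, None) + rng.normal(0, 0.2, 500)

# --- 2. Learn a simple linear energy model from the actuals ---
X = np.column_stack([outdoor, setpoint, np.ones_like(setpoint)])
coef, *_ = np.linalg.lstsq(X, energy, rcond=None)

def predicted_energy(out_t, set_t):
    return coef @ np.array([out_t, set_t, 1.0])

# --- 3. Optimize: pick the setpoint in the comfort band minimizing cost ---
TARIFF = {"peak": 0.30, "off_peak": 0.10}   # $/kWh, hypothetical tariff
def best_setpoint(out_t, period, comfort=(21.0, 26.0)):
    candidates = np.linspace(*comfort, 11)
    costs = [TARIFF[period] * max(predicted_energy(out_t, s), 0.0)
             for s in candidates]
    return candidates[int(np.argmin(costs))]

# During costly peak hours on a hot day, the model floats the zone
# toward the highest allowable temperature to cut cooling cost.
print(best_setpoint(out_t=30.0, period="peak"))
```

Each night, refitting `coef` on the newest actuals is the "learn and renew" step; the setpoint search is the optimization guidance pushed back down to the HVAC system.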

The resulting process is a Digital Twin.

 

Extending the Digital Twin

Modeling and guiding HVAC systems is only the beginning. A true Digital Twin is always learning and improving. Consider rules versus artificial intelligence (AI).

Rules vs. Artificial Intelligence for Digital Twinning
If-This/Then-That (IFTTT) is a great example of the power of an open, rules-based paradigm – arguably a welcome addition to a tech landscape of closed ecosystems, incompatible protocols, and closed APIs. IFTTT gets around all that with simple, rules-based connectors that anyone can create. I even had one for my Xiaomi Mi Band 2 that sent my wife a text message that I was on my (motorcycle) ride home when I tapped the capacitive button on my band twice in rapid succession. Cool, but not particularly useful for today's building environments and their high data requirements. The same is true of rules-based processes in general when it comes to really understanding and troubleshooting today's buildings.

That's where Digital Twins come into play. On the one hand, with rules you need to know everything that could go wrong and how, program it in, and wait for the exact condition to trigger. This is tedious and time consuming. Worse, a rule doesn't know good from bad: a rule/trigger is a rule/trigger. So it's no wonder most facilities folks simply ignore rules-based alarms, often by the hundreds per day.
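The limitation is easy to see in miniature. Below is a toy, hypothetical rules engine of the kind described above (the rule names and thresholds are invented for illustration): every fault condition must be anticipated and hand-coded, and anything not covered by a rule passes silently.

```python
# A minimal, hypothetical rules engine: each fault must be anticipated
# and hand-coded as a named trigger over a reading.
RULES = [
    ("supply air temp high", lambda pt: pt["sat"] > 18.0),
    ("static pressure low",  lambda pt: pt["pressure"] < 0.5),
]

def check(point):
    """Return the names of every rule the reading triggers."""
    return [name for name, trigger in RULES if trigger(point)]

# A condition the rules were written for fires an alarm...
print(check({"sat": 21.0, "pressure": 1.1}))   # ['supply air temp high']
# ...but an unanticipated failure mode (e.g. sensors flatlined at
# plausible values) triggers nothing at all.
print(check({"sat": 15.0, "pressure": 1.0}))   # []
```

And because a trigger is just a trigger, the first alarm fires identically whether the reading reflects a real fault or a one-off blip – which is exactly why operators learn to tune the alarms out.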

On the other hand, we have a Digital Twin system in play. For the sake of argument, let's assume that Digital Twin is ours and has been fed 20,000 data points from a large building – pretty normal stuff, actually, and something we do in practice. Each one of those points is trended and stored for analysis. Compare that to the two weeks or month of history a typical BMS retains and you begin to see why cloud services are key; but that's a topic for another blog. Now let's apply a key AI tool: the one-class classifier (OCC). OCC lets us take any historic time period and set of points and determine what "normal" looks like. Once done, the classifier is applied to data as it's ingested, and anomalies are automatically detected.
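As a hedged sketch of the one-class idea: learn "normal" from a historic window, then flag incoming points that fall outside it. The toy below uses a simple statistical envelope so it stays self-contained; a real deployment would use a richer one-class model, and every name and threshold here is our own invention for illustration.

```python
# Toy one-class classifier: characterize "normal" from a historic window,
# then flag ingested readings that fall outside the learned envelope.
import numpy as np

rng = np.random.default_rng(1)

# Historic window assumed to represent normal operation, e.g. a zone
# temperature trend sampled over several weeks.
normal_window = rng.normal(22.0, 0.5, size=1000)

# "Training": summarize normal as mean +/- K standard deviations
mu, sigma = normal_window.mean(), normal_window.std()
K = 4.0  # threshold; tuned with human expertise, as discussed below

def is_anomaly(reading: float) -> bool:
    return bool(abs(reading - mu) > K * sigma)

# Applied to data as it is ingested:
print(is_anomaly(22.3))   # prints False: an ordinary reading
print(is_anomaly(28.0))   # prints True: well outside learned normal
```

The key contrast with rules: nobody had to predict the 28-degree failure in advance. The classifier flags it simply because it has never looked like that before.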

But is it right? That's where human expertise comes into play. It's important to have some building-systems expertise in order to distinguish right from wrong during the OCC training process, and to tune the parameters so the classifier captures the correct and incorrect examples. Only after this learning process is the OCC put into place and trusted to generate a minimum of false positives when running in real time.
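That tuning step can itself be sketched. In the hypothetical snippet below (the labeled readings, the envelope parameters, and the selection rule are all assumptions for illustration), an expert labels a small validation set, and the detector's sensitivity is chosen to catch every known fault while keeping false positives to a minimum.

```python
# Hypothetical human-in-the-loop tuning: pick the loosest threshold that
# still catches every expert-labeled fault, minimizing false positives.
import numpy as np

mu, sigma = 22.0, 0.5  # envelope of "normal", assumed learned from history
labeled = [            # (reading, expert_says_fault) from a validation period
    (22.4, False), (21.7, False), (23.3, False),
    (25.5, True), (19.0, True),
]

def false_positives(k):
    return sum(abs(r - mu) > k * sigma and not fault for r, fault in labeled)

def missed_faults(k):
    return sum(abs(r - mu) <= k * sigma and fault for r, fault in labeled)

# Choose the largest k (loosest detector) that still catches every fault
candidates = [k for k in np.arange(1.0, 6.0, 0.5) if missed_faults(k) == 0]
best_k = max(candidates)
print(best_k, false_positives(best_k))
```

The expert's labels are what make "anomalous" mean "actually wrong" rather than merely "unusual" – which is why the classifier is only trusted in real time after this pass.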

The above example is, again, a mirror of the Digital Twin as defined by Wikipedia: we're using historical data to create a model, real-time data to test the model, human expertise to validate the model, and then learning to improve the model.

 

We’re Only Getting Started

The examples I've provided are only a start. As we train our optimization, diagnosis, and modeling engines, we continually push the boundaries of the Digital Twin for buildings. IoT data, in the form of standalone or networked sensors, further enables the learning, testing, and validation in a more automated manner. We recently took a big step toward creating Digital Twins for pneumatically controlled buildings through our partnership with Cypress Envirosystems. Soon we'll be releasing our Digital Twin version of fault detection and diagnosis.

It’s an exciting time to be applying cloud-based tech to building systems! Keep an eye on this space as we’ll be providing updates and interesting articles on Digital Twins and the enabling data science throughout the year.

By Steve Nguyen, VP of Product and Marketing at BuildingIQ. This article was originally published on the BuildingIQ blog and was reprinted with permission.

 

 
