An MEP Workflow with Speckle
Once upon a time, I was working as a fresh Arup intern with the dream of automating all the boring bits of my day-to-day project work. I was regularly doing a lot of simple but time-intensive tasks that a human really didn't need to be doing (if anything, I was introducing an unnecessary level of human error 😅): pulling values out of Revit, manually plugging them into Excel spreadsheets to do calculations, ensuring the data and the spreadsheets stayed up-to-date, then hunting through a design report to replace all the values with updated results. These tedious tasks were eating into my time, and several of my colleagues felt the same way.
Well, we're engineers 🦾 How could we make this better? One of my colleagues came up with the idea of putting together an example workflow to explore how we could improve this - and prove to the higher-ups that this was worth doing 😉
The goals of this workflow were as follows:
1. Get from a design brief to a design report in an efficient and semi-automated way
Here, semi-automated meant there was always a human to "press go" at each step. This was for transparency so someone less familiar with the tools didn't feel as if they were putting input values into a black box.
2. Maintain a Single Source of Truth for project data
Collaborators should have a single place to look for project data and be confident that this will always be up-to-date. No hunting in different models, spreadsheets, or PDFs (!) to find the values you're looking for.
3. Allow for flexible collaboration across different software platforms
This was critical as we were all in different offices using different software. Collaborators will have tools and processes that they like to use, so allowing for this would make it easier to get people on board.
At the centre of this, we needed a platform-agnostic way to collaborate. What would be our Single Source of Truth? Enter: Speckle 🎊! With Speckle, data could be stored in the cloud (death to laggy network drives 🔨) and shared amongst all collaborators regardless of the software platform they were using. We could send multiple Streams of data and collect them in one Speckle Project shared amongst the team. The three of us - me in Edinburgh, Tom in London, and David in Manchester - got to work.
Here is a diagram to visualise the flow of data we were working with. First, input data would go into Speckle from Excel and Revit. We would then each run calculations using tools in Grasshopper, Python, and Dynamo and send these results back to Speckle. Finally, we'd output our results to the Revit model and a PDF design report.
Let's walk through the workflow so you can see this in action!
Design Brief: Excel → Speckle
The first step was simple: getting design brief values from an Excel spreadsheet into Speckle. Tom got these values from a client design brief which included things like occupancy, ventilation requirements, wet/dry bulb temperatures, and W/m² allowances. This was done manually by simply copying the data from Excel and pasting it into a new Speckle Stream.
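To make the data side of this concrete, here's a minimal sketch of the kind of key-value design brief data we were working with. The field names and numbers below are illustrative, not the project's actual values, and the Speckle paste step is just described in a comment:

```python
# Illustrative sketch of the design-brief values held in Excel.
# All names and numbers here are made up for the example.
design_brief = {
    "occupancy_m2_per_person": 10.0,    # floor area allowance per occupant
    "fresh_air_l_per_s_person": 10.0,   # ventilation requirement per person
    "summer_dry_bulb_C": 28.0,          # external design temperatures
    "summer_wet_bulb_C": 20.0,
    "small_power_W_per_m2": 25.0,       # equipment allowance
    "lighting_W_per_m2": 10.0,
}

# In the actual workflow these values lived in an Excel sheet and were
# copy-pasted into a Speckle Stream; this dict just shows the data's shape.
for key, value in design_brief.items():
    print(f"{key}: {value}")
```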
Room Geometry: Revit → Speckle
Next, we needed to get room geometry information from a Revit model into Speckle. This was before the birth of SpeckleRevit, so I achieved this by creating a custom Dynamo node which extracted the data I was interested in (name, number, floor area, perimeter, and room height) from the Revit Spaces. This plugged straight into the Speckle node and we were on our way!
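The custom node itself lived inside Dynamo, but the record it produced per room is easy to sketch. Here's a hedged Python version of that data shape - the field names are illustrative, and the derived volume is simply a convenience for later airflow calculations:

```python
from dataclasses import dataclass

# Illustrative sketch of the per-room record extracted from Revit Spaces.
# The real extraction ran as a custom Dynamo node; names here are assumptions.
@dataclass
class RoomRecord:
    name: str
    number: str
    floor_area_m2: float
    perimeter_m: float
    height_m: float

    @property
    def volume_m3(self) -> float:
        # Derived value, handy for downstream ventilation calculations.
        return self.floor_area_m2 * self.height_m

office = RoomRecord("Open Plan Office", "1.01", 120.0, 46.0, 2.7)
print(office.volume_m3)
```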
Calculations: Grasshopper, Python, & Dynamo
Now we get to the exciting bit — calculations 🤓! Within the firm, there were already several tools and scripts for automating different kinds of calculations. However, these existed in different languages and software, making them difficult to share across projects. Using Speckle as our Single Source of Truth decoupled our project data from any single model or software, meaning my colleagues could use whatever they wanted to produce their calculations as long as the results were piped back into Speckle.
Tom used the Speckle Project as input for his Grasshopper script to run the mechanical ventilation calculations:
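For a flavour of the maths involved, here's a hedged sketch of an occupancy-based ventilation calculation like Tom's, written in Python rather than Grasshopper. The rates and room size are illustrative assumptions, not the project's figures:

```python
# Illustrative mechanical ventilation calculation: supply airflow from
# occupancy density and a per-person fresh-air rate. Inputs are made up.

def ventilation_airflow_l_per_s(floor_area_m2: float,
                                m2_per_person: float,
                                fresh_air_l_per_s_person: float) -> float:
    """Supply airflow (l/s) for a room, from occupancy and fresh-air rate."""
    occupants = floor_area_m2 / m2_per_person
    return occupants * fresh_air_l_per_s_person

# e.g. a 120 m2 office at 10 m2/person, needing 10 l/s per person:
print(ventilation_airflow_l_per_s(120.0, 10.0, 10.0))  # → 120.0
```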
I used PySpeckle within a Python script to run the room load calculations:
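In the real script, the room areas and W/m² allowances arrived from Speckle via PySpeckle; the sketch below omits that round trip and hard-codes illustrative inputs, so only the calculation logic is shown. The 75 W/person sensible gain is a typical assumption, not the project's value:

```python
# Illustrative room load calculation. In the actual workflow the inputs came
# from Speckle (via PySpeckle) rather than being hard-coded like this.

def room_load_W(floor_area_m2: float,
                small_power_W_per_m2: float,
                lighting_W_per_m2: float,
                occupants: float,
                sensible_gain_W_per_person: float = 75.0) -> float:
    """Total sensible gain in watts: equipment + lighting + people."""
    return (floor_area_m2 * (small_power_W_per_m2 + lighting_W_per_m2)
            + occupants * sensible_gain_W_per_person)

# 120 m2 office, 25 W/m2 small power, 10 W/m2 lighting, 12 occupants:
print(room_load_W(120.0, 25.0, 10.0, 12))  # → 5100.0
```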
And David used Dynamo to run psychrometric calculations and size the air handling units:
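A small piece of what that sizing involves can be sketched with the sensible heat equation, Q = ṁ·cp·ΔT, rearranged to give the supply airflow needed to absorb a cooling load. This is a simplified stand-in for David's Dynamo graph - constants and inputs are illustrative:

```python
# Illustrative AHU airflow sizing from a sensible cooling load,
# using Q = m_dot * cp * dT. Inputs and constants are example values.

AIR_DENSITY = 1.2   # kg/m3, approximate at room conditions
AIR_CP = 1.005      # kJ/(kg*K), specific heat capacity of air

def supply_airflow_m3_per_s(sensible_load_kW: float, delta_T_K: float) -> float:
    """Volume flow (m3/s) needed to absorb a sensible load at a given supply dT."""
    mass_flow = sensible_load_kW / (AIR_CP * delta_T_K)  # kg/s
    return mass_flow / AIR_DENSITY                       # m3/s

# e.g. a 5.1 kW room load with an 8 K supply air temperature differential:
flow = supply_airflow_m3_per_s(5.1, 8.0)
print(round(flow, 3))
```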
There and back again...
We were able to seamlessly collaborate across various software platforms thanks to all our project data living in Speckle - our Single Source of Truth ✨. It all worked and I was ecstatic, but now it was time to get that data back out. To wrap this all up with a neat little ribbon 🎁, these calculation results needed to be pushed to the Revit model and a final design report.
Populating the Revit model was achieved using a simple Dynamo script (and another custom node — oh, how I would have swooned over a SpeckleRevit plugin 👀). This pulled the results from Speckle and pushed them into the model as properties on the corresponding Revit spaces.
Finally, how do we get that beautifully formatted final PDF report? To do this, I set up a template report which contained dynamic fields to be populated using the data from Speckle. I decided to set up the template using LaTeX and compiled the report using Python. (However - if you're wedded to Microsoft Word - I have used docxtpl with success 🧙♂️)
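The templating step can be sketched in a few lines. The real template and field names differed; the fragment below uses `string.Template` with `$`-placeholders purely so the example stays stdlib-only, and the final compile to PDF is only described in a comment:

```python
from string import Template

# Illustrative report-templating step: a LaTeX fragment whose dynamic fields
# are filled from calculation results. Field names and values are made up.
latex_template = Template(r"""
\section{$room_name}
Design cooling load: $load_kW~kW \\
Supply airflow: $airflow_m3s~m\textsuperscript{3}/s
""")

results = {"room_name": "Open Plan Office", "load_kW": 5.1, "airflow_m3s": 0.53}
snippet = latex_template.substitute(results)
print(snippet)

# The filled .tex file would then be compiled to PDF, e.g. by invoking
# pdflatex from Python with subprocess.run(["pdflatex", "report.tex"]).
```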
Tada - we've done it 🎉! We've gone from a design brief to a design report with under 3 minutes of actual work 😎. As things change, any of these steps can be re-run at the press of a button, and the Revit model and report can be updated to reflect this. See the full workflow in the video below:
P.S. This workflow -- along with loads of other cool SpeckleStuff -- was shown at our last Speckle Community Meetup in London. Find more info and a link to the recording in this blog post 🌟