Virtual Reality for Urban Gardening: Creating the Model

Earlier this year, MAVRiC were approached to develop an exciting Virtual Reality (VR) application that aims to address the need for sustainable urban agriculture. In collaboration with the Brighton and Hove Food Partnership and the University of Brighton, our task was to reflect, and perhaps enhance, the principles outlined in Dr Mikey Tomkins’ “Edible Map” project.

In case you missed our urban gardening promo video, watch it here!

The Edible Map project is an ongoing research initiative, running since 2010, in which maps of UK cities are drawn up and reimagined as urban agriculture hubs. Groups of enthusiasts are taken on walks around their local area and asked to imagine what the surroundings could look like if they became sustainable urban gardening areas.

The key word here is ‘imagine’. Letting the mind run free with ideas for an area is one thing, but those ideas can be difficult to share with others, or to put forward to the local councils and governing bodies that could make a real change. That is where VR comes in: reflect the local area as accurately as possible in VR, then give the user tools to place, remove and comment on various aspects of the site, helping to drive the discussion around urban agriculture.

The Modelling Process

The site we looked at specifically was the Brighthelm garden next to the Brighton and Hove Food Partnership building. To capture the site as closely as possible, we took several LiDAR scans as well as a series of 360 photographs. Our goal was to use the LiDAR scans natively within the VR application; however, due to the size of the area we captured, the resulting mesh was incredibly dense, and we struggled to achieve good performance once it was dropped into the application we were building in Unity.

Brighthelm Gardens

However, the scans didn’t go to waste: they helped extensively with the site dimensions, especially since a laser measure would likely not have worked in daylight, and a tape measure would have been far too short for the distances we were trying to cover. Being able to survey large areas of land with our phones was a huge help!

LiDAR Scan of the site
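As an aside on the density problem mentioned above: one common way to thin a LiDAR point cloud before dropping it into a game engine is a voxel-grid downsample, where every point falling in the same grid cell is merged into one averaged point. This was not our exact pipeline, just a minimal sketch of the idea, with all numbers illustrative:

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Thin a point cloud by averaging all points that fall in the same voxel."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    out = []
    for pts in buckets.values():
        n = len(pts)
        out.append((sum(p[0] for p in pts) / n,
                    sum(p[1] for p in pts) / n,
                    sum(p[2] for p in pts) / n))
    return out

# 1,000 points along a 1 m diagonal collapse to one point per 25 cm voxel
dense = [(i * 0.001, i * 0.001, i * 0.001) for i in range(1000)]
sparse = voxel_downsample(dense, voxel_size=0.25)
print(len(dense), "->", len(sparse))
```

The trade-off is exactly the one we hit: the smaller the voxel, the closer the cloud matches the real site, and the worse the frame rate in the engine.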

To aid in the measurement process, we also used a tool called ‘Blender OSM’ to get building sizes, locations and dimensions. This is a fantastic tool which allows you to draw a box anywhere in the world and quickly create a 3D model of that area. We loved it so much we made two models: one for the 1:1 site model and another for a 1:500 city model, to help the user see the wider site in context.

Blender OSM
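To give a sense of the scale this kind of tool works at: the real-world footprint of a selected latitude/longitude box can be approximated with simple spherical geometry. The function below is our own illustrative sketch (not part of Blender OSM), and the Brighton coordinates are only indicative:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def bbox_extent_m(lat_min, lon_min, lat_max, lon_max):
    """Approximate width/height in metres of a small lat/lon bounding box
    (equirectangular approximation, fine for a city-block-sized area)."""
    mid_lat = math.radians((lat_min + lat_max) / 2)
    height = math.radians(lat_max - lat_min) * EARTH_RADIUS_M
    width = math.radians(lon_max - lon_min) * EARTH_RADIUS_M * math.cos(mid_lat)
    return width, height

# A small box around central Brighton (coordinates are illustrative)
w, h = bbox_extent_m(50.826, -0.145, 50.830, -0.139)
print(f"site extent is roughly {w:.0f} m x {h:.0f} m")
```

At 1:500, a few-hundred-metre box like this shrinks to something that sits comfortably on a virtual tabletop, which is exactly what made the city model useful for orientation.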

With the site made, we then had to model things for the user to place. Taking the map used for the walkaround, we turned its icons into 3D models of objects like planters, chicken coops and beehives. Originally we had drawn these in SketchUp; however, the import process produced very dense meshes with very poor UV maps. So instead we reverted to Blender, modelling natively within the application before porting to Unity.

Before and after fixing the UV maps
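For readers unfamiliar with UV maps: a UV map assigns each 3D vertex a 2D coordinate in the 0–1 texture square, and a bad import scatters those coordinates so textures stretch and tear. In practice we re-unwrapped the models inside Blender, but the simplest form of the idea, a planar projection, can be sketched as follows (our own illustrative code, not Blender’s unwrapper):

```python
def planar_uvs(vertices, axis="z"):
    """Project 3D vertices along `axis` and normalise into the 0-1 UV square."""
    drop = {"x": 0, "y": 1, "z": 2}[axis]
    coords = [tuple(v[i] for i in range(3) if i != drop) for v in vertices]
    u_min, v_min = min(c[0] for c in coords), min(c[1] for c in coords)
    u_span = (max(c[0] for c in coords) - u_min) or 1.0
    v_span = (max(c[1] for c in coords) - v_min) or 1.0
    return [((u - u_min) / u_span, (v - v_min) / v_span) for u, v in coords]

# A 2 m x 1 m quad lying flat in the XY plane maps onto the full UV square
quad = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (2.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
print(planar_uvs(quad))
```

Real unwraps handle seams and distortion far more carefully, but this is the basic mapping that the broken SketchUp imports were getting wrong.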

Feedback on the Model

After developing the VR tools (more on those in future blogs!), we presented the application in a Feature Complete Alpha state to approximately 30 participants, who provided fantastic feedback on tools and model alike. Below are the key pieces of feedback on the model itself and the changes we made for the Beta release.

“A low poly environment gives the impression that this is a game and not a tool.”

This was always a concern for us; naturally, we wanted to create a scene that felt realistic, mirroring the site as closely as possible. There wasn’t much we could do about the art style itself, but we did implement a change in the Beta build to help: we pre-placed objects so that users could see what the area could be like before removing or commenting on them. A blank canvas really emphasised the ‘game-like’ nature of the scene, and we found that the pre-placed objects encouraged discussion around them.

“If you have not visited the site, the wayfinding may be quite challenging and lacked real reference of the place and its environment”

We wanted to make it very clear that this application should not be used in isolation: people should visit the site, then use the VR tool to help with the planning process. However, we recognised that some users, such as wheelchair users and ambulant disabled individuals, may not be able to visit in person. To avoid excluding this demographic, we added the 360 photos to the application itself, so users could ‘visit the site’ before entering the low-poly 1:1 model.

“The model doesn’t reflect the key elements of urban gardening such as time of the day and solar analysis.”

If we’re being kind to ourselves, we ran out of time to implement this for the Feature Complete Alpha build. In hindsight, however, this is a feature we should have prioritised more, and it was certainly something we wanted to include in the Beta build. We therefore introduced the option to move the path of the sun to adjust the time of day; this gave the user an idea of where shadows landed on the site, aiding their decisions on where to place things like planters or polytunnels.
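The maths behind a sun-path feature like this is well established. As a rough illustration (not our Unity implementation), the sun’s elevation for a given latitude, day and solar time can be approximated from the standard declination and hour-angle formulas, which is enough for a coarse shadow study:

```python
import math

def solar_elevation(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle in degrees for a latitude,
    day of year, and local solar time (simplified declination model)."""
    decl = -23.44 * math.cos(math.radians(360 / 365 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)  # degrees from solar noon
    lat, decl_r, h = map(math.radians, (lat_deg, decl, hour_angle))
    sin_el = (math.sin(lat) * math.sin(decl_r)
              + math.cos(lat) * math.cos(decl_r) * math.cos(h))
    return math.degrees(math.asin(sin_el))

# Midsummer noon vs. midwinter noon at Brighton's latitude (~50.8 N)
print(round(solar_elevation(50.8, 172, 12), 1))  # high summer sun
print(round(solar_elevation(50.8, 355, 12), 1))  # low winter sun
```

In an engine, those two angles would typically drive the rotation of a directional light, which is all a "move the sun" slider really needs to do.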

“The site could be changed as well, such as the formation of new ponds and paths.”

An interesting one: being able to edit the site model itself in such a fluid way was a challenge to squeeze into the timeframe. We did, however, introduce a drawing tool that lets users comment on the site and even sketch new pathways. Perhaps a later build could introduce full site editing, but as a way of addressing the immediate concerns, the drawing tool really did the trick!
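Our tool was built in Unity, but the core idea behind a VR drawing tool can be sketched language-agnostically: sample the controller position every frame, and only keep points that have moved a minimum distance from the last kept point, so strokes stay light enough to render and share. The class name and threshold below are purely illustrative:

```python
import math

class StrokeRecorder:
    """Record a drawing stroke as a polyline, keeping only samples that
    have moved at least `min_step` metres from the last kept point."""

    def __init__(self, min_step=0.045):
        self.min_step = min_step
        self.points = []

    def add_sample(self, x, y, z):
        if not self.points:
            self.points.append((x, y, z))
            return True
        if math.dist((x, y, z), self.points[-1]) >= self.min_step:
            self.points.append((x, y, z))
            return True
        return False  # jittery or stationary sample, discarded

# 100 controller samples 1 cm apart along a 1 m line, thinned to a short polyline
stroke = StrokeRecorder(min_step=0.045)
for i in range(100):
    stroke.add_sample(i * 0.01, 0.0, 0.0)
print(len(stroke.points))
```

The same filtering keeps hand tremor out of the stroke, which matters a lot when the "pen" is a motion controller rather than a mouse.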

“Models like polytunnels and planters could have been ‘custom lengths’ rather than fixed.”

Unfortunately, we didn’t get around to including this in the Beta build, but it is certainly something that would aid in the development of a site, especially smaller sites where fixed-size planters may be too big to fit.

Conclusion

In general, we found that VR could not fully replicate the real surroundings: even in a hyper-realistic state, things like smells, temperature and the movement of people through the site were irreplaceable, and these are key elements of urban agriculture. However, as a tool to aid the process, we felt our site model did the job nicely, replicating the key areas that could be built on while also providing a 360 tour of the site.

We have a series of blog posts planned to discuss the tools we have made for this application, so keep your eyes peeled!  
