
Guest Garden is an ongoing project dedicated to exploring the intersection of ecology, surveillance and image-capturing technologies, and symbiotic practices such as gardening, collecting and documenting.

 

This page, formatted as a blog, is a way of thinking through and drawing connections between my research and experimentation; it is under constant construction.

This video shows an animated 'sketch' of twisted flora occupying a virtual garden space.

Initial test of 'Guest Garden'. 3D models of land and sea plants are manipulated and animated to be experienced in augmented reality. 

View my AR piece via this QR code (the Aero app is currently only available on the latest iPhone iOS).

model_material0000_map_Kd.jpg (the texture map exported alongside the 3D scan)

Images captured of young tomato plants during a 3D scan. These images are separated from the body of the 3D object and can be reunited with their form in post-production. In this image format, each frame captured by the camera during scanning is collaged into a carpet of glitches and glimpses of plant.

Plants often evade effective 3D capture due to their multifaceted, textural and complex appearances and structures.
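As a rough sketch of how the texture 'carpet' can be reunited with its form, the snippet below rewrites a Wavefront .mtl material file so that its map_Kd (diffuse texture) entry points back at the exported atlas. The filenames and paths are illustrative assumptions rather than the exact export used here.

```python
# Minimal sketch: re-link a scanned texture atlas to its geometry by
# rewriting the Wavefront .mtl material file's map_Kd (diffuse map) entry.
# Filenames and paths are illustrative assumptions, not the exact export.
from pathlib import Path

mtl_path = Path("model.mtl")              # material file exported with the OBJ scan
atlas = "model_material0000_map_Kd.jpg"   # the 'carpet' of frames captured during scanning

relinked = []
for line in mtl_path.read_text().splitlines():
    if line.strip().startswith("map_Kd"):
        # map_Kd is the MTL directive for the diffuse colour texture;
        # pointing it at the atlas reunites the images with their form.
        relinked.append(f"map_Kd {atlas}")
    else:
        relinked.append(line)

mtl_path.write_text("\n".join(relinked) + "\n")
print(f"re-linked {atlas} in {mtl_path}")
```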

"Glitch" shots from Google maps Photo Spheres:

These images are taken from 3D photos uploaded to Google Maps all over the world, which appear on the map as blue circles. The images harvested from these photo spheres all display glitches, ruptures and inconsistencies in the world captured around them. They are used here as a way of imagining what might be beyond, hidden within, or revealed by ruptures in this mapping technology. This is important when thinking about non-human life, as it allows us to think about how many lives and worlds are taking place around us, beyond our comprehension, at all times.
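One possible way of collecting these downward-facing views (a sketch of an approach, not necessarily the method used for the images here) is the Google Street View Static API, which can render a user-contributed photo sphere at a chosen pitch; a pitch of -90 looks straight down at the 'feet' of whoever captured it, where the stitching glitches tend to gather. The panorama ID and API key below are placeholders.

```python
# Sketch: request a straight-down crop of a user-contributed photo sphere
# via the Google Street View Static API. PANO_ID and API_KEY are placeholders;
# this is one possible way of collecting such views, not the project's method.
import requests

API_KEY = "YOUR_API_KEY"          # a Google Maps Platform key (assumed to exist)
PANO_ID = "EXAMPLE_PANORAMA_ID"   # ID of a user-uploaded photo sphere

params = {
    "size": "640x640",   # output image size in pixels
    "pano": PANO_ID,     # render this specific panorama
    "pitch": "-90",      # look straight down, towards the photographer's feet
    "fov": "90",         # field of view of the crop
    "key": API_KEY,
}

resp = requests.get("https://maps.googleapis.com/maps/api/streetview", params=params)
resp.raise_for_status()

with open(f"glitch_{PANO_ID}.jpg", "wb") as f:
    f.write(resp.content)
```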


map with AI soundscape, 2021, 1 min 20 

Video documenting an Augmented Reality filter. The filter is made up of a carpet of images taken from 'glitched' photo spheres. The glitch is found by looking down at the feet of whoever or whatever has taken the 360 photo. Here the glitch shots can be explored in AR.

The accompanying audio was made by uploading the glitch shots into a programme which uses AI to predict the sounds that you would hear if you were in the location where the image was taken. Many of the sounds are out of touch with the reality of the location.
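The programme that generated the audio isn't named here, but one way to approximate this kind of image-to-sound prediction is to score each glitch shot against a list of soundscape labels with a vision-language model such as CLIP and then pick a matching ambient recording; the labels, filename and model below are illustrative assumptions.

```python
# Sketch: guess a plausible soundscape for a glitch shot by zero-shot
# matching the image against soundscape labels with CLIP. The labels,
# filename and model are illustrative assumptions, not the tool used above.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

labels = ["city traffic", "birdsong and wind", "ocean waves",
          "indoor chatter", "rain on pavement", "silence"]

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("glitch_shot.jpg")  # one of the harvested glitch shots
inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)

outputs = model(**inputs)
probs = outputs.logits_per_image.softmax(dim=1)[0]

# The highest-scoring label stands in for the 'predicted' soundscape;
# like the AI audio above, it can easily be out of touch with the real place.
best = max(zip(labels, probs.tolist()), key=lambda pair: pair[1])
print(f"predicted soundscape: {best[0]} ({best[1]:.2f})")
```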