Leafy 

Leafy is an augmented-reality product that displays real-time feedback about what a plant needs and how it is feeling. The information is displayed in the same spatial context as the plant. This project is an exploration of a metaphorical graphical interface implemented beyond a screen-based platform. The key goal was to use the potential of augmented reality to make first-time growers empathetic toward the environment.

A two-week project with Jiyoung K, Nana C, and Calvin R

My role covered research, ideation, conceptualization, and interaction flow

THE BRIEF

Find a design opportunity for integrating digital interfaces with physical spaces or objects. 
Design a progressive user interface solution using the capabilities of emerging technologies.
Explain the proposed design through a proof of concept demo.

RESEARCH

First-time gardeners often realize something is wrong with their plant only when it is almost dead. Some reasons are:

1. They don't remember to take care of their plants

2. They don't know how much water, sunlight, or fertilizer their plant needs

3. They need quick feedback from the plant to know whether they are doing the right thing

FRAMING THE PROBLEM

How can we facilitate communication between first-time gardeners and their plants?

How can relevant information be presented to help people take care of their plants?

EXPLORING SOLUTIONS

Most of the information about a plant's health is invisible to the naked eye. We explored ways in which this hidden information could be made concrete and usable. We realized that the information needed to be contextual, easy to understand, and personable for users to establish an emotional connection with the plant. By using a metaphorical alarm through the personification of the plant, Leafy alerts and educates users.

DEVICE ECOSYSTEM AND INTERACTION MODELS

When the plant needs it, Leafy actively reminds the user to provide water, sunlight, or fertilizer through unobtrusive visual cues. If the user initiates a conversation, the appropriate response is displayed on a detailed dashboard. Further interaction with Leafy through voice commands invokes the temporal and microscopic views.

Dashboard when the plant needs help

Microscopic view that builds engagement

Temporal view to create a sense of reward

CREATING THE SIMULATION

We used the Pepper's Ghost effect to build a quick physical prototype, refine the design, and demonstrate the idea. We envision that in the near future, mixed-reality technology will develop beyond cumbersome headset displays to provide a seamless contextual experience using natural interactions.

© 2019 by Shruti Aditya Chowdhury