Design and build a novel interactive physical user interface prototype that addresses a concrete problem
A working prototype of a device that helps reduce accidental overdose by simplifying the way medicine consumption is tracked
The project was an individual assessment for my MSc's Physical Computing and Prototyping module. The brief was to design and build a novel interactive physical interface prototype that addressed a concrete problem; the prototype had to feature a microcontroller, sensing, actuation, and digital fabrication. This project, which spanned two months, addressed the problem of accidental medicine overdose.
Inspiration came when I fell ill, requiring me to take 3-4 medicines. Alternating between states of sleep and wakefulness, I struggled to remember when I had taken each medicine and wondered if I had waited long enough for another dose. I wished there was an easy way to ensure I was taking my medicine correctly.
Desk research revealed that others, especially the elderly, experience the same problem. According to the NHS, "the most common form of poisoning in the UK is from medication". In the US, many calls to poison control are from older adults who get their medication confused. When talking with students, I found that those who had experienced serious illnesses or had recovered from surgery had faced the problem. However, most students did not think that accidental overdose was a problem for them.
After reviewing the research, I chose to continue with the idea of reducing the risk of accidental overdose. Specifically, my design would address (1) taking extra doses due to forgetfulness and (2) confusing one medicine with another.
Desk research, discussions with students, usability guidelines, and personal experience informed a set of design principles. Specifically, I designed with situational (a dark room), temporary (drowsiness from medication or illness), and permanent (low vision or reduced dexterity due to age) disabilities in mind. The physical interface should (1) make information accessible to those with impaired vision, (2) have a simple, straightforward UI, and (3) communicate meaning through multiple senses (text, colors, sounds).
My personal experience and interest in RFID led to the basic idea to tap medicine on a reader to record doses. To expand upon that idea, I used sketching to visualize potential designs. Sketching was used throughout the design process to explore placement of components and integrate new functionalities.
The user's goal is to find out whether a medicine is safe to take. To map out how the user would interact with the device to achieve that goal, I created a user flow diagram. Several iterations were mapped out and revised based on student feedback. Changes included removing redundant steps and conditions and rewording messages for better clarity.
The final user flow began with a message inviting users to interact with the device. A user then scanned a medicine, which triggered a response indicating whether it was ok to take (based on time since last dose). After requesting and receiving confirmation from the user that a dose was taken, the device would create a timestamp for the dose.
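The core decision in this flow is whether enough time has elapsed since the last recorded dose. As a minimal sketch (the function name and values are illustrative assumptions, not the project's actual code), the check might look like this in C++, with times in milliseconds as Arduino's millis() would report them:

```cpp
#include <cstdint>

// Hypothetical helper: a dose is considered safe when the recommended
// wait time has passed since the last recorded dose timestamp.
bool isSafeToTake(uint32_t nowMs, uint32_t lastDoseMs, uint32_t waitTimeMs) {
    // Unsigned subtraction also behaves correctly across millis() rollover.
    return (nowMs - lastDoseMs) >= waitTimeMs;
}
```

For example, with a four-hour (14,400,000 ms) recommended wait, a scan five hours after the last dose would pass the check, while a scan one hour after would not.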
Prototyping started with an Arduino, RFID reader + tags (to recognize medicine), and a laptop (to show messages and receive input). Next a display and buttons were added to allow users to input and receive information on the device itself. After that, lights and a buzzer were integrated to supplement the text output with audio and visual cues.
The Arduino prototype was developed in phases. In the first phase, a way to read and record dose time for each medicine was programmed using the Arduino IDE. Medicines were identified using RFID and variables for each medicine (tag ID, name, recommended wait time, time of last dose) were stored using a struct. Additional components were then added one by one.
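A minimal sketch of how such a struct and a tag lookup might be organised (field and function names are hypothetical; the tag UID is condensed to a single integer here for brevity):

```cpp
#include <cstdint>

// Illustrative medicine record, mirroring the fields described above.
struct Medicine {
    uint32_t tagId;       // RFID tag UID (condensed to one integer here)
    const char* name;     // e.g. "Paracetamol"
    uint32_t waitTimeMs;  // recommended wait between doses
    uint32_t lastDoseMs;  // timestamp of the last recorded dose
};

// Find the medicine matching a scanned tag; returns nullptr if unknown.
Medicine* findByTag(Medicine meds[], int count, uint32_t tagId) {
    for (int i = 0; i < count; i++) {
        if (meds[i].tagId == tagId) {
            return &meds[i];
        }
    }
    return nullptr;
}
```

On a successful scan and user confirmation, the matched record's last-dose field would be updated with the current time, which is the "timestamp" step in the user flow above.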
The final form factor was the result of several iterations. I explored ideas through sketching, paper prototyping, 3D CAD models, and a functional foam core model. Once the design and dimensions were finalized, the form factor was assembled from laser-cut acrylic.
Prototypes were shared with students throughout the design process. Their feedback, my growing familiarity with each component's capabilities, and usability concerns all shaped the final design.
For convenience, the prototype was tested with students using sample medicines. Testing consisted of short, informal observations in a university-run makerspace: users interacted with the device and were then asked questions based on their interactions.
Overall, users were able to interact with the device with little guidance. However, there were a few problems. For example, users often pressed buttons too early because they didn't realize there was more information on the next screen. This made the device seem unresponsive and distracted users from important information. To fix this, I decreased the delay between screens.
The final prototype was presented to professors and students at the Physical Computing and Prototyping module showcase. For the showcase, a poster and video accompanied a live demonstration.