Princeton University
Princeton Private
Design researcher on a 5-person team consisting of engineers and designers

According to recent surveys, many Princeton University students are sexually active, yet they are not properly equipped to engage in such activity in a healthy way. To protect themselves from sexually transmitted infections, unwanted pregnancies, and other sexual health issues, students need easy access to sexual health resources on campus. However, because “sex” and “sexuality” are taboo topics on Princeton’s campus and in society at large, many students may not feel comfortable actively seeking advice and information in person. Our easy-to-use system enables users to address sexual health concerns and questions in a private, individualized way. Additionally, since many of these tasks involve patient health information, the system stores data confidentially and allows users and relevant stakeholders to interact with it discreetly.

The design is constrained by what we consider high-priority user values: privacy and discretion, ease of use, and easy access. Because of social stigma, a user may not want others to discover that they are consulting sexual health resources, so we chose a discreet product name that would not look out of the ordinary in a browser history or app launcher. We also chose a platform available to most people in our target population, regardless of which devices they own or their ability to visit a location on campus in person.

To evaluate our results, we used empirical evaluation methods, which we found the most appropriate for our product: automated, formal, and inspection-based methods would not have provided insights that only users could give. We recruited users from our target population and conducted a two-part usability test. First, we observed users as they carried out a fixed set of tasks, measuring quantitative metrics (including user efficiency and effectiveness) and recording observations. We then conducted one-on-one qualitative interviews regarding satisfaction, ease of use, and other benchmark metrics.
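As a rough sketch of how the two quantitative metrics above can be scored, the snippet below computes effectiveness as task completion rate and efficiency as completed tasks per minute of total task time. The task names, timings, and exact formulas are hypothetical, chosen only to illustrate one common way of quantifying usability-test results, not the actual study data.

```python
# Hypothetical usability-test records: (task name, completed?, seconds taken).
results = [
    ("find STI testing info", True, 42.0),
    ("book a confidential appointment", True, 95.0),
    ("locate contraception resources", False, 120.0),
]

# Effectiveness: fraction of tasks completed successfully.
effectiveness = sum(completed for _, completed, _ in results) / len(results)

# Efficiency (time-based): completed tasks per minute of total task time.
total_minutes = sum(seconds for _, _, seconds in results) / 60.0
efficiency = sum(completed for _, completed, _ in results) / total_minutes

print(f"effectiveness = {effectiveness:.2f}")
print(f"efficiency = {efficiency:.2f} completed tasks/min")
```

Aggregating these per participant, rather than over the whole pool, makes it possible to report variability across users as well as the averages.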
