BeepStreet

team

demi hu
lois huh
cammi tran
mira wang

role

UX designer
writer

timeline

spring 2024
2 days

tools

meta horizon worlds

Urban metrics can tell you a car is 10 feet away, nearby construction is 100 decibels loud, and a barren parking lot takes 1 minute to cross—but how do these numbers feel in context?

Background

BeepStreet was created at BuildFest, an annual weekend-long hackathon focused on using immersive tech to design for social impact. At this year's Meta-sponsored event, we built in Meta Horizon Worlds.

The social issue we chose to address was environmental justice — not just climate, but how built environments often perpetuate inequalities in walkability, noise and air pollution, and spatial comfort.

Problem

Spatial environmental concerns like noise pollution, physical obstacles, and air pollution are difficult to visualize through static data alone. Traditional methods like graphs and static 3D models often fall short of replicating the real experience a commuter has when navigating through an urban area.

Understanding these concerns is essential for urban planners and legislators to empathize with the people who experience them and to design for their needs. However, there are limitations: planners can't actually live out the lives of everyone they design for, it isn't feasible to physically build and rebuild cities, and no one's imagination can perfectly predict what a proposed solution will actually feel like to experience.

Goals

Create an immersive, interactive, and realistic tool for planning cities

Display auditory and visual spatial data the way humans experience them

Foster empathy and understanding by sharing different commuter perspectives firsthand

Concept

To address the above, we came up with BeepStreet, a speculative VR tool for urban planning.

BeepStreet allows users to build and edit virtual models of urban plans, then run first-person simulations navigating through those plans. The simulations incorporate dynamic elements like spatial audio, moving traffic, and mobility obstacles to increase realism. By adjusting these parameters, users can predict issues, test different scenarios, and accurately visualize urban experiences, leaving city planners and policymakers better informed and better able to address people's needs and environmental concerns.
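As a rough illustration of what "adjusting these parameters" could mean in practice, here is a minimal sketch of a scenario described as data, plus a simple distance-based loudness estimate that echoes the "100 decibels" hook above. All of the names (SimulationScenario, NoiseSource, perceivedDecibels) are hypothetical and not taken from the actual prototype.

```typescript
// Hypothetical data shape for one BeepStreet scenario (not from the real prototype).
interface NoiseSource {
  label: string;                          // e.g. "construction site"
  position: [number, number, number];     // world position in meters
  decibelsAtSource: number;               // sound level measured 1 m away
}

interface SimulationScenario {
  trafficDensity: number;                 // vehicles spawned per minute
  noiseSources: NoiseSource[];            // spatial audio emitters
  mobilityObstacles: string[];            // e.g. "curb without ramp"
  commuterProfile: 'walking' | 'wheelchair' | 'stroller';
}

// Rough point-source falloff: about a 6 dB drop each time distance doubles.
function perceivedDecibels(
  source: NoiseSource,
  listener: [number, number, number],
): number {
  const [dx, dy, dz] = source.position.map((c, i) => c - listener[i]);
  const distance = Math.max(1, Math.hypot(dx, dy, dz));
  return source.decibelsAtSource - 20 * Math.log10(distance);
}

// Example: how loud does a 100 dB construction site sound from 30 m away?
const site: NoiseSource = {
  label: 'construction site',
  position: [30, 0, 0],
  decibelsAtSource: 100,
};
console.log(perceivedDecibels(site, [0, 0, 0])); // ≈ 70.5 dB
```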

Prototyping

I sketched a user-flow storyboard, then we moved into Meta Horizon Worlds to build the environment and script its interactive elements.
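For a sense of what a scripted interactive element could look like, here is a hedged sketch of a Horizon Worlds TypeScript component that plays a spatialized construction-noise clip when a player walks into a trigger zone. It assumes the horizon/core module's Component, PropTypes, CodeBlockEvents, and AudioGizmo APIs; the component name and wiring are illustrative rather than the code we actually shipped, and exact signatures may differ by SDK version.

```typescript
import * as hz from 'horizon/core';

// Attach to a trigger volume placed near a simulated construction site.
class ConstructionNoiseTrigger extends hz.Component<typeof ConstructionNoiseTrigger> {
  static propsDefinition = {
    // Entity holding the spatialized construction-noise audio gizmo.
    noiseSource: { type: hz.PropTypes.Entity },
  };

  start() {
    // Fire when any player enters the trigger volume this script is attached to.
    this.connectCodeBlockEvent(
      this.entity,
      hz.CodeBlockEvents.OnPlayerEnterTrigger,
      (player: hz.Player) => {
        // Start the spatial audio so the commuter hears the site as they approach.
        this.props.noiseSource?.as(hz.AudioGizmo)?.play();
      },
    );
  }
}

hz.Component.register(ConstructionNoiseTrigger);
```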

Outcomes

BeepStreet won 3rd place at BuildFest and earned the opportunity to showcase at SXSW 2024.

Demi Hu © 2024