How can we talk about physics-based UIs and panels and bubbles that can be flung across the screen if we’re sitting around looking at static mocks? (Hint: we can’t.) It’s no secret that many of us on the Facebook Design team are avid users of Quartz Composer, a visual prototyping tool that lets you create high-fidelity demos that look and feel exactly like what you want the end product to be. We’ve given a few talks on QC in the past, and its presence at Facebook (introduced by Mike Matas a few years back) has changed the way we design. Not only does QC make working with engineers much easier, it’s also incredibly effective at telling the story of a design. When you see a live, polished, interactive demo, you can instantly understand how something is meant to work and feel, in a way that words, long descriptions, or wireframes never could. That leads to better feedback, better iterations, and ultimately a better end product. When you’re working on something whose interactions matter this much—in this case, a gesture-rich, heavily physics-based UI—anything less simply won’t do.