How Bad Chick Studios Creates AR Apps That Mix the Physical and Digital Space
Lafiya Watson of Bad Chick Studios shared with us the reasons behind her AR apps, why she chose Snap Spectacles, how she created some of her projects, how she adapts them to different settings, and what her next projects are.
Introduction
Hi! I'm Lafiya Watson of Bad Chick Studios, and I'm a Multidisciplinary Artist and Creative Technologist. My path to spatial computing felt random at the time, but in hindsight, it was pretty inevitable. I grew up playing video games, and exploratory ones like Myst had a profound impact on me (I still have the original on CD!). That love of interactive storytelling led me to create mixed media art that combined text, photography, illustration, and whatever tech I could figure out at the time.
I've also been coding since I was a kid, moving from LOGO and DOS to Flash ActionScript, then JavaScript, and eventually C#, so when I found AR, the technical side just clicked. I initially started creating AR projects in Unity using Wikitude and Vuforia, but my real turning point was getting accepted into a Snapchat artist residency and falling in love with Lens Studio. I am both a Creator and Developer, so I was impressed with how quickly I was able to go from concept to finished product.
On the project side, I create interactive experiences that utilize features like voice recognition and hand tracking. "Lines and Spaces" is an AR music education game I created in collaboration with Snapchat. "Framed!" is a location-based AR mystery built in Unity where players use a simulated UV light to uncover hidden clues across five real-world locations to help clear a friend's name.
And "Ghosts in the Machine: The Old Zoo" is a location-based AR ghost story also built in Unity with artist and choreographer Koryn Wicks that received the 2024 FIVARS Excellence in Experience Design award and was a 2025 Indiecade Finalist. But the through-line in everything I make is the same: I want the viewer to become part of the project, because you have a deeper tie to a story when you can actually be part of the journey and help influence the outcome.
Snap Spectacles as an AR Platform
Mobile AR is wonderful because most people have a phone capable of running these experiences, so the accessibility is hard to beat. But after a while, holding your phone up gets exhausting, and looking at something through a screen just doesn't feel immersive. Glasses solve a lot of that. It's a form factor everyone already understands, so there's no onboarding curve, no sense of constriction, and unlike headsets, you can share an experience with someone in the same room and actually see each other.
That opens up a kind of connection that headsets just can't replicate. I've been working with Spectacles across two developer generations now, and the hardware has evolved to make longer, deeper experiences possible in ways that would be harder to sustain on a phone. From the developer side, Snap's documentation and support are genuinely great, and it's clear they want you to use the platform and feel empowered doing it.
The Spectacles Interaction Kit gives you hand tracking and gesture functionality right out of the box. Their asset library is huge, and their templates get you up and running fast. Their developer advocate team also does a lot to support developers beyond the platform itself, through events, tutorials, and hands-on help. They genuinely want developers to succeed.
Projects
"Listening Party" is an AR record player experience built for Snap Spectacles, and the inspiration came from an IRL bonding moment. One evening, I was sitting in my parents' basement with them, rifling through their massive record collection, listening to albums, having drinks, and just hanging out. When I got home, I didn't have a record player (or the space for one), so I built a digital version.
My parents are musicians, and it felt like a natural fit to feature my dad's actual music in the experience, which makes it extra special to me. On the surface, it sounds pretty simple, but this project pushed me in ways I didn't expect, and it's become one of the pieces I'm most proud of.
Building the Foundation
My starting point was Lens Studio's Spectacles Interaction Kit, which is a set of pre-built components that gives you the core interaction functionality you need right out of the gate: pinch to select, object manipulation, the ability to scale and move things in 3D space.
That became the backbone of the project, and from there I layered in the UI Kit for buttons, sliders, and all the controls needed to power the record player on and off and navigate between albums.
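As a rough sketch of what those UI Kit controls end up driving, the record player's core state might look something like the following. This is illustrative only: the real project wires Spectacles Interaction Kit components to methods like these, and names such as `RecordPlayerState` are my own, not part of any Snap API.

```typescript
// Hypothetical sketch of the state behind the power button and album
// navigation controls. In the real lens, SIK buttons would call these
// methods; here the logic stands alone for illustration.
class RecordPlayerState {
  private albums: string[];
  private index = 0;
  private powered = false;

  constructor(albums: string[]) {
    if (albums.length === 0) throw new Error("need at least one album");
    this.albums = albums;
  }

  // Power toggle: turns playback on or off, returns the new state.
  togglePower(): boolean {
    this.powered = !this.powered;
    return this.powered;
  }

  // Next/previous album buttons wrap around the collection.
  nextAlbum(): string {
    this.index = (this.index + 1) % this.albums.length;
    return this.albums[this.index];
  }

  prevAlbum(): string {
    this.index = (this.index - 1 + this.albums.length) % this.albums.length;
    return this.albums[this.index];
  }

  get current(): string {
    return this.albums[this.index];
  }

  get isPlaying(): boolean {
    return this.powered;
  }
}
```

The point of keeping the state in one small object is that every pinch-driven control, whatever widget it comes from, funnels into the same place.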
Placing Objects in Space
One of the biggest upgrades between the first and second versions of my project was adding Snap's surface placement asset. In the original, the record player was fixed in place. If someone wasn't paying attention when the experience launched, it might appear somewhere awkward or totally out of reach. Surface placement solved that for me.
The user looks down at the floor, finds a spot that works for them, pinches to lock it in place, and the record player appears right where they want it. That kind of user control matters a lot, especially with emerging tech, where people are still getting comfortable with the experience.
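The underlying geometry of that placement step can be sketched as a simple gaze-ray-to-floor intersection. To be clear, the project uses Snap's surface placement asset rather than hand-rolled math; this is just my illustration of the kind of calculation such a helper performs, with made-up names throughout.

```typescript
type Vec3 = { x: number; y: number; z: number };

// Illustrative stand-in for surface placement: intersect the user's
// gaze ray with a detected horizontal plane at height planeY, and
// return the point where the record player would be anchored.
// Returns null when the gaze is not aimed down toward the floor.
function placeOnFloor(origin: Vec3, dir: Vec3, planeY: number): Vec3 | null {
  // The ray must point downward to ever reach the plane.
  if (dir.y >= 0) return null;
  const t = (planeY - origin.y) / dir.y;
  if (t < 0) return null;
  return {
    x: origin.x + t * dir.x,
    y: planeY,
    z: origin.z + t * dir.z,
  };
}
```

On a pinch, the lens would lock in whatever point this returns, which is what gives the user control over where the experience lives.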
Guiding the User
I also added Snap's 3D hand hints in the second version, which are animated guides that show the gestures needed to interact with the experience. We can't assume everyone who puts on the glasses has used AR before. Honestly, most people I talk to have never tried it, so whether it's visual cues, onboarding moments, or just designing interactions that feel intuitive, guiding the user is part of the craft. I always try to meet people where they are.
The Hardest Part
Getting the needle to behave correctly was the most technically challenging piece, because the way a needle rotates on a record player was really hard to nail in the editor alone. I kept having to push the project to the glasses to test it in the actual 3D space.
This is something I can't say enough: test on the hardware, early and often, because it is truly a different experience to view it on your computer screen vs. in your physical space. That back-and-forth ultimately required some custom scripting, and getting it right felt like a real win.
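My guess at the shape of that custom scripting, purely as a sketch: map playback progress to a tonearm rotation between the outer lead-in groove and the inner run-out groove, with a separate rest angle when nothing is playing. The angle values below are invented for illustration, not taken from the project.

```typescript
// Hypothetical tonearm driver. Angles are in degrees and illustrative:
// the arm rests off the record when stopped, drops at the outer edge
// when play starts, and sweeps inward as the song progresses.
const REST_ANGLE = -15;  // parked on the armrest
const OUTER_ANGLE = 0;   // needle at the record's edge
const INNER_ANGLE = 22;  // needle at the run-out groove

function needleAngle(progress: number, playing: boolean): number {
  if (!playing) return REST_ANGLE;
  // Clamp so seeking past the end can't over-rotate the arm.
  const p = Math.min(1, Math.max(0, progress));
  return OUTER_ANGLE + p * (INNER_ANGLE - OUTER_ANGLE);
}
```

Even a mapping this simple only reads as "right" once you see the arm move over a spinning record in headset, which is exactly why on-device testing mattered here.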
Making It Feel Alive
The first version was fairly passive since you basically just put the music on and that was it, so for version two, I added small animated elements that float up from the record when a song starts playing. They serve as a visual cue that the music is running, but I also made them poppable, and reaching out and popping them releases a little burst of motion.
It's very satisfying! The final boss of this project is still multiplayer. I envision people sitting around the living room together, wearing glasses, and taking turns choosing albums. That version is still coming. One of these days!
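The poppable floaters can be sketched as a tiny per-frame update plus a proximity check against the user's pinch. Again, this is an illustration under my own assumptions, with invented constants, not the project's actual script.

```typescript
// Hypothetical poppable floater logic: each element drifts upward every
// frame, and a pinch within reach of one pops it.
interface Floater {
  x: number;
  y: number;
  z: number;
  popped: boolean;
}

const RISE_SPEED = 0.25; // meters per second, illustrative
const POP_RADIUS = 0.1;  // pinch reach in meters, illustrative

// Called once per frame with the elapsed time in seconds.
function updateFloaters(floaters: Floater[], dt: number): void {
  for (const f of floaters) {
    if (!f.popped) f.y += RISE_SPEED * dt;
  }
}

// Called when a pinch lands at (hx, hy, hz); pops at most one floater.
function tryPop(floaters: Floater[], hx: number, hy: number, hz: number): Floater | null {
  for (const f of floaters) {
    if (f.popped) continue;
    const d = Math.hypot(f.x - hx, f.y - hy, f.z - hz);
    if (d <= POP_RADIUS) {
      f.popped = true;
      return f; // the caller plays the burst-of-motion effect here
    }
  }
  return null;
}
```
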
Creating AR Apps
One of the most important things to consider is the different spaces users might be in. I once worked on a collaborative project that used voice recognition, and it worked beautifully at home, but taking it into a loud event space meant the narration was hard to hear and the triggers stopped firing reliably. We ended up adding closed captions and a timer fallback so the experience kept moving no matter what, and building those safety nets in is just part of good design.
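That timer fallback is a pattern worth sketching: each narration step advances on a recognized voice trigger, but a timer guarantees forward progress even when a noisy room swallows the audio. The structure below is my own generic illustration of the idea, not the project's code.

```typescript
// Hypothetical safety-net for a voice-driven step: the step completes
// either when the recognizer fires or when the timeout elapses,
// whichever comes first, so the experience never stalls.
class NarrationStep {
  private elapsed = 0;
  private done = false;

  constructor(private timeoutSec: number) {}

  // Called by the voice recognizer when the expected phrase is heard.
  onVoiceTrigger(): void {
    this.done = true;
  }

  // Called every frame with elapsed seconds; true means "advance now".
  tick(dt: number): boolean {
    if (this.done) return true;
    this.elapsed += dt;
    if (this.elapsed >= this.timeoutSec) this.done = true;
    return this.done;
  }
}
```

Pairing this with closed captions means the content still lands even when the trigger path that fires is the fallback.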
With Spectacles specifically, I think a lot about the field of view, because people are so used to looking at flat screens that they naturally focus on whatever is directly in front of them. AR experiences can extend into the space all around you, though, so you have to actively guide people to look around, whether through visual cues, sound, or just a clear prompt to turn around. Never assume the user knows there's more to find.
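One concrete way to drive that guidance, sketched under my own assumptions: compare the camera's forward vector with the direction to the next piece of content, and show a look-around hint whenever the angle between them exceeds a threshold. The threshold below is illustrative, not a Spectacles specification.

```typescript
type V3 = [number, number, number];

// Angle in degrees between two direction vectors.
function angleBetweenDeg(a: V3, b: V3): number {
  const dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  const la = Math.hypot(...a);
  const lb = Math.hypot(...b);
  const c = Math.min(1, Math.max(-1, dot / (la * lb)));
  return (Math.acos(c) * 180) / Math.PI;
}

// True when the content sits outside an assumed comfortable viewing
// cone, meaning the user needs a cue (arrow, sound, prompt) to turn.
function shouldShowHint(forward: V3, toContent: V3, coneHalfDeg = 30): boolean {
  return angleBetweenDeg(forward, toContent) > coneHalfDeg;
}
```
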
Developer Community
The XR community is genuinely one of the most supportive I've ever been a part of, and I say that as someone who spent a long time working alone, thinking that was just how it had to be. Having people who understand the joy and the frustration of working in emerging tech, who will playtest your work and give you real feedback, and who freely share their knowledge, changes how you make things.
Feedback is essential. When you build a project, you already know how it works. You might anticipate where people will get stuck, but inevitably, you'll miss things. Someone coming in fresh may flag issues that escaped your radar, and sometimes come up with ideas you never would have considered on your own. Playtesting helps your project level up.
As for where to find your people: start online, because there are active XR subreddits and Discord servers worth jumping into. But don't sleep on in-person events either! MIT Reality Hack was genuinely life-changing for me, and I met people there I still connect with to this day. AWE (Augmented World Expo) honestly feels like a family reunion every year. Local XR meetups are also worth seeking out, both for the community and as a chance to show your work and get live feedback from people in the same room.
Next App
I've been circling multiplayer for a while now. Adding multiplayer functionality can always be a little tricky, so I've admittedly shied away from it. I tried adding it to a Spectacles project before, and it worked, but I ran into some issues. However, Snap released a newer version of SyncKit that is a lot more streamlined.
Instead of requiring multiple people to scan an area, now you basically look at each other, and you're synced. That streamlined feature is my cue to get back on the horse and try again. The concept is a head-to-head game influenced by Battleship, with a shared game board where players hide elements and try to figure out where the other person has tucked things away.
Kind of a spatial strategy meets puzzle game situation. I'm also planning to bring in a leaderboard, which I used in my most recent game, "Don't Pick the Banana," and loved, because there's something about a leaderboard that makes everyone way more competitive than they planned on being.
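Since the game is still a concept, here is only a rough local sketch of the hidden-board idea: each player secretly places elements on a grid, and the opponent guesses cells. In the real build this state would be replicated through SyncKit; everything below, including the class name, is my own illustration.

```typescript
// Hypothetical hidden-board state for a Battleship-influenced game.
// One instance per player; the opponent calls guess() on it.
class HiddenBoard {
  private hidden = new Set<string>();
  private guessed = new Set<string>();

  constructor(private size: number) {}

  // Secretly tuck an element away at (x, y); false if off the board.
  place(x: number, y: number): boolean {
    if (x < 0 || y < 0 || x >= this.size || y >= this.size) return false;
    this.hidden.add(`${x},${y}`);
    return true;
  }

  // Resolve an opponent's guess against this board.
  guess(x: number, y: number): "hit" | "miss" | "repeat" {
    const c = `${x},${y}`;
    if (this.guessed.has(c)) return "repeat";
    this.guessed.add(c);
    return this.hidden.has(c) ? "hit" : "miss";
  }

  // True once every hidden element has been found: game over.
  get allFound(): boolean {
    return [...this.hidden].every((c) => this.guessed.has(c));
  }
}
```

Keeping the rules in a plain state object like this also makes the multiplayer step simpler later, since only the state changes need to sync between glasses.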
Conclusion
AR is for everybody! And contrary to popular belief, you do not need to know how to code to get started. There are tools, including those within Lens Studio itself, that let you build AR experiences using pre-made assets, templates, and components without writing a single line of script, and Snap has clearly invested in making the platform accessible to creators at every level.
Their asset library is huge, the documentation is thorough, and between the templates and the pre-built interaction components, you can get something up and running pretty quickly. Start there, play around, and experiment. I will say that the more complex your ideas get, the more coding knowledge becomes helpful, but don't let that stop you from starting.
Beyond the tools, the most important thing is flexibility. Get comfortable with pivoting, because this space moves fast: tools change, companies shut down, and new platforms emerge. A lot of the knowledge is transferable, thankfully, but learning to pivot is non-negotiable. Stay curious, stay adaptable, and don't put all your eggs in one basket.
I can't wait for the moment when everyone has a pair of glasses and spatial computing is just part of daily life. What thrills me about that future is that it calls for exactly the kind of work I'm already making. My work lives in the space between the physical and digital, where you aren't just watching the experience but actually connecting with it.
Any wearable device or tool that lets people uniquely interact with their surroundings, I'm here for it. I'm just really eager to see where this space leads and how my practice evolves along the way, and I can't wait to bring others along for the ride.