About Me
I'm a UX Product Designer at Arizona State University, working on creating clear and usable experiences. I got into UX during my engineering days, when miscommunication of requirements and ideas kept undermining the products we were building. It was frustrating to watch good ideas fail, and that's what pushed me toward design: making things work better for both users and teams.
Even though my title is in UX, I see myself as a design engineer. I like working at the intersection of design and development, making sure ideas aren't just visually appealing but also easy to build and use. I work on everything from research to prototyping, ensuring designs are practical and solve real problems. I'm particularly excited about bringing an AI-agentic approach into my design process: I use tools like Cursor and V0 for rapid prototyping and n8n for task automation, which has significantly improved my workflow efficiency.
I hold an MS in Human-Computer Interaction from Arizona State University, building on an undergraduate background in IT engineering. My work has been presented at international conferences such as SIGGRAPH, where I shared projects on immersive and emerging technologies. Throughout my career, I've collaborated with NVIDIA, Meta, UC San Diego, and global clients.

Recently
I had the chance to attend and present at SIGGRAPH '24, the premier international conference for computer graphics and interactive techniques. During the five-day event, I got to meet and learn from leaders like Mark Zuckerberg and Jensen Huang. The program spanned all sorts of presentations, including keynotes, technical papers, Birds of a Feather sessions, and lab workshops.
Overall, it was an awesome experience meeting and networking with people from various industries. If you'd like to know more, please feel free to reach out.
Pictured above are two teams, from Meta and NVIDIA, to whom I had the opportunity to demonstrate our Robotics Academy project. Funded by the National Science Foundation (NSF) and developed in collaboration with NVIDIA, the project leverages machine learning and natural language processing (NLP) to adapt content dynamically, providing tailored learning experiences based on each user's progress and needs.
It was a rewarding experience to connect with professionals from diverse industries and exchange insights. If you're curious to learn more about this project, read here.
Learnings
I've been exploring Unity's UI Toolkit to craft custom, adaptable interfaces and design responsive, cohesive UI elements, all while improving my design and development workflow.
I've worked on many side projects to challenge myself and learn new technologies. Recently, I focused on React and Next.js, learning to build a modular UI component system. Next, I'm planning to learn Framer Motion to improve my skills in motion design.
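As an illustration of the modular component work mentioned above, here is a minimal sketch of one common pattern in React/Next.js codebases: defining variants as data and mapping a variant name to its class string with a pure helper. All names here (`Variant`, `classesFor`, the utility-class strings) are hypothetical, not taken from an actual project.

```typescript
// Variants defined as data: adding a new button style means
// adding one entry here, not editing component markup.
type Variant = "primary" | "secondary" | "ghost";

const buttonClasses: Record<Variant, string> = {
  primary: "bg-blue-600 text-white hover:bg-blue-700",
  secondary: "bg-gray-200 text-gray-900 hover:bg-gray-300",
  ghost: "bg-transparent text-blue-600 hover:underline",
};

// Pure helper: combine shared base classes with the variant's
// classes and any caller-supplied extras, skipping empty strings.
function classesFor(variant: Variant, extra = ""): string {
  return ["rounded px-4 py-2 font-medium", buttonClasses[variant], extra]
    .filter(Boolean)
    .join(" ");
}
```

A component then just spreads `classesFor(props.variant)` into its `className`, which keeps styling decisions in one testable place.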
Here are my top reads from '23 and '24, each offering incredible insights into design, business problem-solving, human biases, and innovative approaches to challenges. Among them, Rewired by McKinsey stands out as my favorite, offering comprehensive guidance on thinking outside-in at the company level. I highly recommend it to anyone looking to deepen their understanding of these areas!
Teaching
I've been fortunate to mentor some really talented aspiring designers along the way. One memorable experience was at ASU, where I taught a class called Immersive Experience Design I. I introduced the students to WebXR frameworks like A-Frame, React 360, and Three.js, showing them how to build immersive experiences that work seamlessly across different devices. It was awesome to watch their creativity take shape and see how much they were able to learn in such a short time.
Publications
ACM SIGGRAPH '24 APPY HOUR • MOBISYS '25 PROCEEDINGS
Balboa Park Alive! Exploring Biodiversity through Augmented Reality
This paper showcases Balboa Park Alive!, a series of immersive, interactive phone-based AR installations that explore the biodiversity of the San Diego/Tijuana region. By integrating Niantic Lightship ARDK, Mapbox, and mobile hand tracking, it creates embodied encounters with 3D digital representations of local plants, insects, and animals. Unlike other conservation-focused AR apps, Balboa Park Alive! emphasizes first-hand perspective taking and evidence-based inquiry, guiding families to engage actively with their environment and each other.
IEEE VIS '25 PROCEEDINGS
Fragmented Oceans: A Telepresence-Driven Visualization of Ocean Garbage Patches
Fragmented Oceans presents a telepresence-driven visualization combining robotic interaction and immersive environments to raise awareness of ocean garbage patches. Installed in two connected spaces at ASU's MIX Center, it uses a UR20 robotic arm with acrylic cubes to represent ocean waste data and interactive visuals developed in TouchDesigner. Participants engage both physically and virtually, exploring dynamic wave simulations through telepresence. Early observations indicate that this approach boosts engagement and fosters embodied understanding of ecological data.
Design Process
"Design the right thing, then design the thing right" -- that's my mantra!
One of my key strengths is working closely with my team to shape every aspect of the user experience, from initial concept to final implementation. This collaboration ensures that we create engaging and effective designs that meet user needs.
Research & Planning
Dive deep into the problem. Get to know the users' and business's needs, the surrounding context, and what the competitors are doing.
Design
Come up with a wide range of ideas. Be bold in experimenting and trying new things. Sketch, wireframe, and prototype.
Implement & Test
Check if the product works as intended and gather user feedback. Be ready to adjust the product based on this input.
Release
The design process is ongoing. Use data and metrics to continuously refine and enhance the product's performance.