innori
From 0 to 267%: How I boosted user engagement on a social AI app built from scratch.
My role
Founding Product Designer
Product
Mobile, Web
Timeline
Q3 2023 — Ongoing
Skills
Product Design
Visual Design
Product Strategy
Interaction Design
Prototyping
Developer Hand-offs
User Testing
Team
Michael Carroll, Ivy Wu, Ryan Collins, Yaroslav Malyk, Denys Kurets, Andrii Malyk, Bohdan Skitenko, Ashton Hughes, Vasyl Matsevko, Zahkar Rudenko
GenAI is organized chaos
Generative AI is messy. It takes multiple attempts, tweaks, and guesswork to get something right. Initially, we assumed users wanted polished AI-generated stories, but testing showed they cared more about the act of crafting and experimenting with AI, even if results weren’t perfect (occasionally users even preferred the imperfect results for a good laugh!).
Instead of trying to guarantee perfect AI outputs, we focused on making story-creation fun, interactive, and rewarding.
Intuitive > instantaneous
Our initial design assumed users wanted instant results, but through dozens of testing sessions, we saw that:
🐛 Users preferred a step-by-step creation process rather than an overwhelming, all-at-once AI input.
🍾 Frequent micro-rewards (like animations, preview images, and interactive story "twists") helped maintain engagement and reduce frustration.
🌀 A structured but flexible flow encouraged creativity without leaving users feeling lost.
Instead of overwhelming users with complex AI settings, we broke the process into bite-sized steps, each offering small moments of control and feedback.
What we learned
Our target audience (16–23-year-olds) often had no real understanding of AI. Testing revealed that:
🧠 They expected AI to “just know” what they wanted.
🗒️ They struggled with the "blank page" phenomenon of open-ended prompts.
🤪 They didn’t care for perfect AI results—just a fun, guided way to create.
This led us to create a multi-step process that lets users tweak and modify their story at the end of each chapter, giving them more control without inducing cognitive overload.
267% growth in engagement
🤨 Before: Users spent 1.5 minutes per session, often abandoning the app after only a few interactions.
🤩 After: Engagement grew to 5.5 minutes per session as the experience became more interactive and rewarding.
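That jump is where the headline number comes from: going from 1.5 to 5.5 minutes per session works out to (5.5 − 1.5) ÷ 1.5 ≈ 267% growth in average session length.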
By shifting focus from AI outputs to the creative process, we kept users coming back.
This project taught us a key lesson: when designing for AI, the experience can matter more than the actual output. Give users control, make the process rewarding, and engagement will follow.