From Insights to Interface
Using our research, we sketched, structured, and prioritized the core functionality:
- A dashboard-first approach to centralize the experience
- A program explorer with compatibility filters
- A chatbot named Orion to provide contextual, real-time guidance
- An application tracker to follow every application milestone
We started with paper sketching for ideation and concept validation...
We returned to our focus groups to validate our early designs. A key insight was the need for a friendlier, more human experience: something less robotic and easier to use. Participants shared that having an education counselor and a way to track their applications made a big difference in their application journey, while those without support often felt lost.
This led us to reframe Glint as a personal digital counselor: a platform that guides users through their university applications with empathy, personalization, and AI-driven support.
From that point on, our goal was clear: to bridge the gap between guided and unguided students by offering real-time advice, tailored recommendations, application tracking, and a warm, branded experience led by our chatbot mascot, Orion.
...and we tested them for usability with 13 participants from the target demographic.

To evaluate the prototype’s effectiveness, we conducted usability testing using a combination of methods: the Wizard of Oz technique to simulate chatbot responses, the Think-Aloud method for real-time user feedback, and interactive Figma prototype sessions lasting around 30 minutes.
Our goal was to uncover how users navigated the platform, used the features, understood the process, and reacted to the guidance provided.
1. Profile Setup
Scenario:
Users were asked to imagine themselves as international students creating a profile to receive personalized university recommendations.
Task Flow:
- Begin onboarding and proceed through multiple profile questions
- Navigate using Next, Back, and Skip buttons
- Choose between multiple-choice and open-ended inputs
- Complete the setup and view a confirmation or welcome message
Journey:
This flow simulated the first-time experience of onboarding, aiming to gather enough user context for tailored recommendations and future platform personalization.
2. Chatbot (Orion)
Scenario:
A student from Pakistan uses the chatbot to simplify their search for German Master's programs in Design.
Task Flow:
- Locate and launch the “Ask Orion” chatbot from the dashboard
- Read onboarding instructions and example prompts
- Ask Orion to recommend universities for design
- Ask for detailed information about a specific university (e.g., deadlines for Rhine-Waal)
- Ask the chatbot to navigate to another feature (e.g., Tracker)
Journey:
This multi-step interaction explored how users initiate help, request specific guidance, and transition between informational and navigational queries — all within a conversational interface.
3. Application Tracker
Scenario:
Users are managing their university applications and want to manually log a new program to the tracker.
Task Flow:
- Navigate to the Application Tracker via the dashboard menu
- Click “Add New Item” to input a new entry
- Fill in program details such as name, university, location, degree type, deadline, and tuition
- Mark the application method (e.g., via Uni Assist) and set a priority level
- Submit the entry and confirm that it appears in the tracker list
Journey:
This flow tested how efficiently users could organize their application pipeline and whether the manual input process was intuitive and complete.
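To make the manual-entry fields concrete, here is a minimal sketch, assuming a simple data model for a single tracker entry; the field names and sample values are illustrative only, not the final schema.

```python
# A minimal sketch of one tracker entry, built from the fields tested in this flow.
# Field names and the sample values below are illustrative, not a final schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class TrackerEntry:
    program_name: str
    university: str
    location: str
    degree_type: str        # e.g. "Master's"
    deadline: date
    tuition: float          # per semester; 0.0 for tuition-free programs
    via_uni_assist: bool    # application method flag from the form
    priority: int           # 1 = highest priority

tracker: list[TrackerEntry] = []
tracker.append(TrackerEntry(
    program_name="Usability Engineering",
    university="Rhine-Waal University",
    location="Kamp-Lintfort, Germany",
    degree_type="Master's",
    deadline=date(2026, 1, 15),   # placeholder date
    tuition=0.0,
    via_uni_assist=True,
    priority=1,
))
print(f"{len(tracker)} application(s) tracked")
```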
Insights uncovered from usability testing:
Testing with our 13 participants surfaced the following themes:
• Onboarding:
The onboarding experience revealed some usability gaps in task clarity and flow.
- Users struggled to locate the “Complete Profile” button, indicating a disconnect in guidance flow.
- The number of setup steps was perceived as overwhelming when introduced all at once.
- A step-by-step layout, supported by a progress bar or checklist, was suggested to improve pacing and clarity.
- Many wanted the ability to defer non-essential actions, such as uploading documents or entering optional details.
• Chatbot (Orion):
Participants appreciated the idea of conversational support, but the structure of interaction needed refinement.
- The bot’s responses felt too reactive—users preferred it to proactively guide them, asking clarifying questions instead of just replying.
- Users needed a more layered layout of information, starting with broad suggestions and narrowing based on preferences.
- They wanted contextual guidance with links or supporting explanations, especially when making program decisions.
• Application Tracker:
The application tracker was functionally appreciated, but its usage flow could be improved.
- Users liked being able to track progress but preferred integrating it into the search flow, rather than manually inputting data later.
- Many requested the ability to customize and prioritize what fields are shown, like deadlines or tuition.
- Some users felt uncertain about terminology and suggested contextual hints embedded within the layout to guide their understanding.
• Profile Setup:
The layout of the profile form created friction during completion.
- Users found it too dense, which made it hard to stay focused or know what was essential.
- They requested that required fields be clearly separated from optional ones, helping them prioritize effort.
- Adding brief explanations for why certain questions are asked would help them feel more comfortable sharing personal details.
• Navigation & Flow:
In general, participants responded well to the overall structure but identified key improvements in layout sequencing.
- There was a strong preference for progressive disclosure—only revealing sections as needed, rather than everything upfront.
- Users wanted clear directional flow, where each section naturally led to the next without feeling like a restart.
- Navigation across sections needed to feel more goal-oriented, aligned with typical application phases like explore → shortlist → apply → track.
Let's get a developer involved...
To bridge the gap between concept and execution, we consulted Professor Karsten Nebe, Head of the Usability Engineering program at Rhine-Waal University, an experienced software engineer and avid AI enthusiast.
The meeting was pivotal in grounding our most ambitious ideas, particularly the Chatbot and Profile Matching System, in technical reality. Professor Nebe helped us navigate backend challenges and explore a more scalable, rule-based system powered by machine learning and NLP, with RASA as a potential framework.
The Chatbot would start with predefined responses and evolve through use, while CV-based data extraction offered a privacy-conscious alternative to transcript uploads.
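As an illustration of the "predefined responses first" approach (a sketch under our own assumptions, not Professor Nebe's code or the eventual RASA setup), Orion's earliest version could be as simple as a keyword-to-reply lookup with a fallback:

```python
# A minimal sketch, not our actual implementation: intents are matched by simple
# keywords and answered from a predefined table. All keywords and replies here are
# illustrative placeholders; an NLP intent classifier (e.g. via RASA) would replace
# the keyword matching later on.

PREDEFINED_RESPONSES = {
    "deadline": "Deadlines vary by program; open the program page in the Explorer for exact dates.",
    "recommend": "Tell me your field of study and budget, and I'll suggest matching programs.",
    "tracker": "Opening your Application Tracker.",
}

def respond(message: str) -> str:
    """Return the first predefined reply whose keyword appears in the message."""
    text = message.lower()
    for keyword, reply in PREDEFINED_RESPONSES.items():
        if keyword in text:
            return reply
    # Fallback until real NLP handles unseen phrasings
    return "I'm not sure yet. Could you rephrase that?"

print(respond("What are the deadlines for Rhine-Waal?"))
```

Swapping the keyword matcher for a trained intent classifier would let the same response table grow as real conversations accumulate, which is the "evolve through use" path discussed above.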
These insights shaped a realistic and responsible development roadmap that is smart, scalable, and student-first.

...and finalize the Information Architecture.
Before jumping into high-fidelity design, we created an information architecture (IA) diagram to map out the platform’s core structure. This helped us visualize key user flows, organize content, and ensure the navigation would be intuitive. The IA served as a blueprint for aligning design decisions across the team.
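For readers without the diagram, a rough outline of that core structure (illustrative labels drawn from the features described above, not the final IA) might look like:

```python
# Rough, illustrative outline of the platform's core structure: the dashboard is the
# hub, with the explorer, Orion, the tracker, and the profile one level below.
INFORMATION_ARCHITECTURE = {
    "Dashboard": {
        "Profile Setup": ["Onboarding questions", "Optional document uploads (deferrable)"],
        "Program Explorer": ["Compatibility filters", "Program detail pages"],
        "Ask Orion (chatbot)": ["Recommendations", "Program details", "Navigation shortcuts"],
        "Application Tracker": ["Add new entry", "Deadlines", "Priority levels"],
    }
}
```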