NeuroFocus XR
NeuroFocus XR is an AI-based Neurofeedback therapy in Mixed Reality for adults with ADHD.
NeuroFocus XR empowers users with live brainwave feedback and visual stimuli, helping them self-regulate their emotions and actions in real time.
Aware of their brainwave patterns, users can choose tasks suited to their current mental state, optimizing their cognitive performance and reducing stress levels.
TIMELINE
7th May - 24th July 2024
This was my Master's thesis project, guided by Alex Williams, Senior Tutor on the MA Design Products programme at the Royal College of Art, London.
PROCESS
1. Research: Identifying issue, Secondary data, Understanding solutions, Expert's insights
2. Integration: User persona, User journey, Information architecture
3. Crafting: Moodboard, High fidelity, User Interface, Visual Stimuli
4. Prototyping: Proof of concept, Coding internal components, XR prototype
5. Reflections: User testing, Next steps
DIA HM RCA 2024
DOMAINS
User Experience Design
User Interface Design
Spatial Development
Creative Technology & IoT
Human Cognition
Behavioral Therapy
RESPONSIBILITIES
User Experience Research
Sketching
Prototyping
AI Integration
Visual Interaction
Mixed Reality Design
TOOLS
Figma
Adobe Creative Suite
Unity
TouchDesigner
ESP-32 & MAX30102
Protokol


Unfortunately, these primary treatment options have potential limitations, such as medication side effects, lack of behavioral improvement, high costs, and major time commitments.
Long waiting lists for ADHD assessments, particularly in public healthcare systems, can significantly prolong the diagnostic process; some individuals may have to wait months or even years to be evaluated by a specialist.
Moreover, people are hesitant to accept medication due to adverse side effects, including loss of appetite, anxiety, insomnia, headaches, and irritability.
ADHD BRAINWAVES
COGNITIVE BEHAVIOUR THERAPY
ADHD SYMPTOMS & THERAPY
UNDERLYING QUESTIONS
To unpack the contributing factors, I had to look deeper and answer the underlying questions, guided by Alex:
How do they operate? How do they perceive their environment and the things around them?
What tasks do they do well, and for how long?
What's their productivity like? Are they any different?
What if we liberate people with ADHD from their weighed-down medication plans?
What if we let their creativity run wild and free?
How do you make these people realize their full potential?
By 'they', I mean adults with ADHD symptoms (diagnosed or undiagnosed) who suffer unknowingly, leading to a decline in their workplace productivity.



WORK STYLE
MINDFULNESS

EXPERT INSIGHTS
Somatic work, meditation, and mindfulness movement are some of the initial, harmless treatment options for people with ADHD.
After Neurofeedback therapy, patients see themselves as improved in terms of focus, emotional self-regulation, understanding of situations, social engagement, control over anger issues, and more.
However, existing Neurofeedback therapies are expensive and require long, consistent sessions.
The games are not engaging enough, the graphics are poor quality, sessions are appointment-based only, a Neurofeedback therapist is required throughout the therapy, and people have mixed feelings about them.
Emerging approaches have the potential to change people's mindsets towards these methodologies and the technologies through which they can be delivered.
USER PERSONA
Developing a user persona helped me understand what the end users would really value. Since they were adults spanning many different professions, creating a persona for each would not have been practical. Hence, I chose a median persona and tried to address most of their workplace issues, both physical and mental.

USER JOURNEY
In the user journey, I map how and where she encounters problems on a day-to-day basis. Whether they are technical or mental issues, all of them affect our brainwave patterns and cause them to change.

EMOTIONS
We human beings are emotional creatures. Throughout history, we have slowly understood and categorized our emotions, but that is not how we actually feel them: we experience them as a continuous wave.
Here, I have categorized emotions into Care, Joy, Guilt, Fear, and Sadness, and displayed how we actually experience them.

MOODBOARD
The moodboard served as a reference for visual cues to keep in mind while designing the main platform. Since Mixed Reality is a combination of physical and digital aspects, it was tricky to find the balance, especially when dealing with brainwaves and color.

HIGH FIDELITY
While designing the high-fidelity screens, I took insights from similar workplace platforms, like Monday. This helped me understand which tools and parameters were used the most and how efficient they were.


Keeping the design guidelines in check, I made sure the screens did not block too much of the person's view, since they were supposed to aid the user while working, not obstruct them.

Experimenting with how much of the brainwave-based color gradient should be visible was challenging: a little too little and the user would not see it in their semi-focused vision angle; a little too much and it would interfere with the user's attention and lose its purpose.

INITIAL USER INTERFACE
Next, I created the user interface that lets the user know about their brainwave pattern. Recognizing this pattern, it then suggests tasks suitable for their current state. My aim was to keep the screens to a minimum and display them only when absolutely required, so that people can focus on their work.
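To make the suggestion step concrete, here is a minimal sketch, assuming a simple three-state model and placeholder task descriptions rather than the project's actual AI-driven logic:

```cpp
#include <cstdio>

// Hypothetical states the interface might distinguish (stand-ins for real brainwave classes).
enum class MentalState { Relaxed, Focused, Stressed };

// Map the detected state to a task suggestion shown in the Mixed Reality interface.
const char* suggestTask(MentalState state) {
  switch (state) {
    case MentalState::Focused:  return "Deep work: stay on the current priority task";
    case MentalState::Relaxed:  return "Light work: emails, planning, admin";
    case MentalState::Stressed: return "Take a short break or do a mindfulness exercise";
  }
  return "";
}

int main() {
  std::printf("%s\n", suggestTask(MentalState::Stressed));
  return 0;
}
```

In the actual design, a suggestion like this would only appear when absolutely required, in line with keeping the screens to a minimum.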


Moreover, I understand that there would be ethical and privacy concerns about other people's brainwave states, but I am confident we could come up with a solution that benefits mental health, the employee, and the employer.
BRAINWAVES
VISUAL STIMULI


BRAINWAVES & VISUAL STIMULI
The visual stimuli are placed in the semi-focused area of the human vision angle so that the user can focus on their task while simultaneously staying aware of their brainwave patterns.
This way, users get live feedback when they become distracted or zoned out through changing color gradients at the edge of their vision, prompting them to switch to a task suited to their current brainwave state or to take a break.
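As an illustration of this mapping (the palette and rendering in the prototype differ), a small sketch that blends between an 'alert' color and a 'calm' color from a normalized focus score, which the MR layer would render at the edge of the user's vision:

```cpp
#include <cstdio>

struct Color { float r, g, b; };

// Linear blend between two colors; t = 0 gives a, t = 1 gives b.
Color lerp(const Color& a, const Color& b, float t) {
  return { a.r + (b.r - a.r) * t,
           a.g + (b.g - a.g) * t,
           a.b + (b.b - a.b) * t };
}

// Map a normalized focus score (0 = distracted, 1 = focused) to a peripheral gradient color.
// The specific colors are illustrative placeholders, not the project's final brainwave color scale.
Color gradientForFocus(float focus) {
  const Color distracted{1.0f, 0.55f, 0.10f};  // warm amber: switch task or take a break
  const Color focused   {0.10f, 0.75f, 0.65f}; // calm teal: stay on the current task
  if (focus < 0.0f) focus = 0.0f;
  if (focus > 1.0f) focus = 1.0f;
  return lerp(distracted, focused, focus);
}

int main() {
  const float samples[] = {0.1f, 0.5f, 0.9f};
  for (float f : samples) {
    Color c = gradientForFocus(f);
    std::printf("focus %.1f -> rgb(%.2f, %.2f, %.2f)\n", f, c.r, c.g, c.b);
  }
  return 0;
}
```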


WORKING PROTOTYPE
The final working prototype interacts with a biometric sensor to display colors according to the user's brainwave states, optimizing the experience individually for them in Mixed Reality.
The journey was not at all smooth, which I had anticipated given the complicated dynamics of many different aspects working together. Since brainwave sensors were expensive and out of market reach due to ethical reasons, I used a heart-rate sensor to monitor heartbeats and then estimate the brainwave state the person was in.
This way I could prototype my design and understand whether the components were working well together.
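A rough sketch of that substitution, with hypothetical BPM thresholds standing in for the real calibration; heart rate here is only a prototyping proxy for the brainwave state:

```cpp
#include <cstdio>

// The same three states the interface expects from a real brainwave sensor.
enum class MentalState { Relaxed, Focused, Stressed };

// Bucket a heart-rate reading into a mental state. The thresholds are illustrative only;
// the prototype's values would come from calibrating against the individual user.
MentalState stateFromHeartRate(float bpm) {
  if (bpm < 65.0f) return MentalState::Relaxed;   // low arousal: calm, unhurried
  if (bpm < 90.0f) return MentalState::Focused;   // moderate arousal: engaged in a task
  return MentalState::Stressed;                   // high arousal: time to take a break
}

int main() {
  const float readings[] = {58.0f, 74.0f, 103.0f};
  const char* names[] = {"Relaxed", "Focused", "Stressed"};
  for (float bpm : readings) {
    std::printf("%.0f bpm -> %s\n", bpm, names[static_cast<int>(stateFromHeartRate(bpm))]);
  }
  return 0;
}
```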




The most challenging part of the project was configuring the sensors to interact with the color in real time. This meant dealing with many hardware, software, and coding issues while developing the project. The sensor also had to be soldered so that the connection was completed perfectly, without any issues.
I also 3D-printed a small housing for the sensor that could be attached near the posterior auricular nerve, so that we could track the heart rate and estimate the brainwaves.
CODING INTERNALS
After eight coding failures, each time on a new file, and numerous visits to the White City campus' creative coding consultations, I was finally able to arrive at code that worked specifically for my sensor and fulfilled my goal.
Technician Kyle helped me understand and build the code, and Hamid from the Robotics research lab helped me solder the connections together.
Hence, shoutout to Kyle and Hamid (RCA's technicians) :)



Somehow the code could not connect to the school's Eduroam Wi-Fi, nor to my personal iPhone hotspot, so I had to use my friend's Android hotspot to transmit the data over Wi-Fi from the ESP-32 to the TouchDesigner software.
With a little help from ChatGPT, the technicians, and my own knowledge, we developed the code from scratch.
Finally, it was configured correctly to wirelessly transmit the heart readings to TouchDesigner and from there to the user's Mixed Reality device. I have to admit this was one of the most crucial moments of the project.
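For reference, a minimal sketch of this kind of pipeline, assuming the SparkFun MAX3010x library for the MAX30102 and a plain UDP text message that a TouchDesigner UDP In operator can read; the network credentials, addresses, and ports are placeholders, not the project's actual values:

```cpp
#include <Wire.h>
#include <WiFi.h>
#include <WiFiUdp.h>
#include "MAX30105.h"     // SparkFun MAX3010x Particle Sensor library (also drives the MAX30102)
#include "heartRate.h"    // beat-detection helper from the same library

const char* WIFI_SSID = "hotspot-ssid";      // placeholder: the Android hotspot
const char* WIFI_PASS = "hotspot-password";  // placeholder
const char* TD_HOST   = "192.168.43.100";    // placeholder: laptop running TouchDesigner
const uint16_t TD_PORT = 7000;               // placeholder: port of the UDP In operator

MAX30105 sensor;
WiFiUDP udp;
unsigned long lastBeat = 0;

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);   // wait for the hotspot
  udp.begin(12000);                                   // any free local port for sending
  if (!sensor.begin(Wire, I2C_SPEED_FAST)) {          // MAX30102 on the default I2C pins
    Serial.println("MAX30102 not found");
    while (true) delay(1000);
  }
  sensor.setup();                                     // default LED and sampling configuration
}

void loop() {
  long ir = sensor.getIR();                           // raw IR reading from the sensor
  if (checkForBeat(ir)) {                             // true on each detected heartbeat
    unsigned long now = millis();
    if (lastBeat > 0) {
      float bpm = 60000.0f / (now - lastBeat);        // beats per minute from the beat interval
      udp.beginPacket(TD_HOST, TD_PORT);              // one small text packet per beat
      udp.print("bpm ");
      udp.print(bpm, 1);
      udp.endPacket();
    }
    lastBeat = now;
  }
}
```

On the TouchDesigner side, a UDP In DAT listening on the same port can parse the "bpm" value and drive the color gradient.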
You can find the prototyping videos at the link.
VISUAL INTERACTION





FAILURE #1



SUCCESSFUL TRIAL


PROTOTYPING MIXED REALITY IN UNITY


BRAINWAVE COLOR SCALE

USER TESTING

DESIGN INTELLIGENCE AWARD 2024
