Roadmap

One of NeuroRPG LLC's (www.neurorpg.com) many offerings: this section provides a high-level overview of the project roadmap and phases. The most up-to-date version can be found in the project BCI RPG GitHub repository.

BCI RPG Project High-Level Road Map and Phases

Introduction

The Brain-Computer Interface Role-Playing Game project is the brainchild (pardon the pun) of Hawke Robinson. Thanks to the hard work of many contributors and volunteers from RPG Research and elsewhere, this project is getting closer and closer to the reality envisioned all those years ago.

The goal of this project is to attempt to provide the ultimate in accessibility for cooperative social role-playing gaming, and the many benefits that provides to participants.

While it is hoped this project will be of interest to everyone, the ultimate test is that it be fully playable with just a player's thoughts, using various non-invasive neurotechnology equipment such as brain-computer interface headsets from OpenBCI and others. This level of accessibility is not (yet) a requirement of the Adventure Modules Creation Toolset (AMCT), aka the "Modgods Toolset", or of the server administration tools; it applies only to actual game play for players.

The goal is that someone with Locked-In Syndrome or Complete Locked-In State (LIS/CLIS) can fully participate as a player with others in this turn-based, thought-controlled interface. See the Origin Story for this project for more details on how the use cases should ultimately look.

Obviously we can't get there immediately due to limitations in the technology, but we are getting closer, and this project is in constant weekly development toward that goal.

This is an open-source project under the GNU Affero General Public License v3.0. Please give attribution.

About the Founder, "The Grandfather of Therapeutic Gaming", Hawke Robinson

Hawke Robinson is the Executive Director and founder of the 501(c)(3) non-profit RPG Research. Both our founder and the organization are huge advocates for accessibility and inclusiveness.

Our founder's background in nursing, recreation therapy, gaming, neuroscience, music, and more has inspired his decades-long endeavor to use non-invasive neuro- and bio-technologies for people with accessibility issues, as well as to provide enhanced experiences for the general public.

We support accessibility not only through all of our training, advocacy, and accessible mobile facilities, but also through our active projects to make gaming accessible to all. RPG Research is a 100% volunteer-run, 501(c)(3) non-profit charitable research and human services organization.

More about RPG Research.

The project name has moved from erpg to bcirpg.

Main website: www.bcirpg.com - older page: https://www.rpgresearch.com/bci-rpg

The new open-source GitHub source code repository is: https://github.com/RPG-Research/bcirpg. The old repository, https://github.com/RPG-Research/erpg, now redirects to bcirpg.

There is also a repository, self-hosted by RPG Research using the opensource Gitea, at:

The project Wiki: https://github.com/RPG-Research/erpg/wiki

Brain-Computer Interface Role-Playing Game (BCI RPG) Development Roadmap

Phase 0 - EEG I/O controllers in NWN on Linux

Inspiration for concept: 1990

Early hardware prototyping of EMG, EEG, EOG, and similar interfaces: 1996-2006?

Learn the basics of EEG, bio- and neuro-feedback, and computer interaction, then attempt to replace regular keyboard or other I/O controls for a PC-based video game (preferably on Linux, due to its greater flexibility of I/O controls) and play the game with only EEG and/or bio-monitoring inputs.

Start with NeverWinter Nights on Linux: basic movement and interactive menu control via an OpenEEG 5-electrode setup. - Completed ~20??
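
The Phase 0 approach, thresholding an EEG reading into an on/off game input, can be sketched roughly as follows. This is a simulated, illustrative sketch only: the threshold value, key names, and band-power units are assumptions, not the project's actual OpenEEG-to-NWN calibration.

```python
# Hypothetical sketch: map one EEG channel's band power to an on/off game
# input, as Phase 0 mapped OpenEEG output to NWN movement/hotkeys.
# Threshold and key names are illustrative, not the real configuration.
ALPHA_THRESHOLD = 12.0  # assumed calibration value, arbitrary units

def classify_sample(band_power: float) -> str:
    """Return the simulated key event for one band-power reading."""
    return "KEY_FORWARD" if band_power > ALPHA_THRESHOLD else "KEY_NONE"

def simulate_session(readings):
    """Turn a stream of band-power readings into a stream of key events."""
    return [classify_sample(p) for p in readings]

events = simulate_session([3.1, 14.8, 15.2, 7.0])
```

In the real setup, the classified events would be routed to the game as synthetic keypresses; the high latency and error rate noted later in this document come largely from how noisy this single-threshold classification is.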

Phase 1 - Game design prototype in NWN:EE on Linux with EEG/BCI for I/O

Need to have a standardized adventure designed for testing baselines. Something that won't have copyright issues.

Start with branching narrative story development and using NWN Aurora Toolset for adventure creation, to give others a rough idea of where we are headed.

Does not need to be playable with EEG/EOG/EMG/BCI at this phase.

Open-source the project and recruit help from others, most likely through non-profit RPG Research volunteers initially, to get some traction going.

Train new development team on game development concepts desired with the prototype and use of Aurora Toolset for NWN.

Create a new original complete (though basic initially) adventure module that will be used as test bed for future prototypes.

We chose Shakespeare's The Tempest. It is in the public domain and has a good range of setting variables that fit well into the abstract multi-genre goals down the road.

Experiment with controls using EEG and BCI equipment with the Enhanced Edition Neverwinter Nights on Linux.

Create a public open source repo.

Completed 21??

Phase 2 - Game from scratch, TUI version, BCI I/O, online turn-based multiplayer

Create from scratch a text-only, multiplayer, online, turn-based cooperative RPG that can be played with only OpenBCI (or similar) equipment. Currently nearing the end of the design and documentation steps, and starting the actual coding steps. We will use GUI tools with the Godot engine for the GM tools, but play must be all Text User Interface (TUI) and able to run with the simple inputs of OpenBCI. Use the same adventure blueprint as Phase 1. - IN PROGRESS.

Phase 3 - Add GUI, Bots, and Optional [Matrix] Integration Support

Add basic real-time graphics or pre-recorded action videos, triggered like Dirk the Daring in Dragon's Lair, on top of the existing TUI-based game.

Add optional integration support for bots as well as the Matrix.org protocol for chat and data sharing enhancements.
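
As a rough illustration of the optional Matrix.org integration, the sketch below builds the request parts for the standard Matrix client-server "send message" endpoint. The homeserver URL, room ID, and transaction ID are placeholders, and this only constructs the request; a real client would send it with an access token.

```python
import json

# Hypothetical sketch of the Matrix client-server call the chat
# integration might use. Homeserver, room ID, and txn ID are placeholders.
def build_matrix_message(homeserver: str, room_id: str,
                         txn_id: str, text: str):
    """Return (method, url, body) for a Matrix m.room.message event."""
    url = (f"{homeserver}/_matrix/client/v3/rooms/{room_id}"
           f"/send/m.room.message/{txn_id}")
    body = json.dumps({"msgtype": "m.text", "body": text})
    return "PUT", url, body
```

Keeping chat on Matrix rather than in the game itself matches item 7 of the game description below: the game stays non-chat, and Matrix-synapse carries the communication layer.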

Phase 4 - Add ML, NLP, ASR, and AI

Enhance the accessibility of the game itself, and improve the bots and other components, by adding Natural Language Processing (NLP), Automated Speech Recognition (ASR) and translation, and Machine Learning (ML) / Artificial Intelligence (AI) / Neural Network (NN) enhancements, such as components of the RPG AI and AI GM projects.

To further improve accessibility and overall playability, incrementally incorporate technologies related to natural language, Machine Learning (ML), Neural Networks (NN), Artificial Intelligence (AI), and other related fields in different areas of the platform, as appropriate and as resources allow.

Begin incorporating components from the AI RPG and AI GM (ai-rpg.com) projects as needed, especially for NPCs/creatures, but also options for PCs, world sociopolitical events, ripple effects from PC actions, etc.

Phase 5 - Add xR (VR/AR/MR)

Add xR-related options to the available graphical interface experiences. Add AR, GPS, and global world-location play to the game. If viable, include "holographic" UI options so that no glasses or similar hardware are required. Perhaps integrate some of the ideas from the ZDAY City RPG project.

Phase 5b - Add Additional Enhanced BCI Equipment Support (Galea or similar) Integration

As available (and as funding/donations permit), add support for the newer OpenBCI Galea (or similar) full systems-immersion hardware.

Phase 6+ Add Robotics and other Physical Realm Interaction Features, To Bring it Back to the Tabletop and Live-Action Variants

Bring the technology into "play" in the physical realm so that people can use the tools to participate in in-person tabletop role-playing game (TRPG), live-action role-playing game (LRPG), and hybrid role-playing game (HRPG) experiences, not just electronic role-playing games (ERPG).

Integrate gyro, GPS, and other technologies to add physical, local-realm experiences. Migrate the BCI tools to work in the physical world through robotics and wireless triggers, so players can interface with a combination of AR and physical devices: manipulating dice-roller devices, using robotic arms to pick up and move objects, and moving miniature tokens around, enhanced by AR-overlaid animation, so that LIS/CLIS players can play at the same physical table, or LARP with other players in the same physical location, not just online.

Phase 0 (completed) 2006 - 2014 OpenEEG

NeverWinter Nights (NWN) Diamond Edition on Linux controlled by OpenEEG equipment.

 

 

Basic control of PC movement. Routing of I/O through OpenEEG 5 channel headset.

Began in 2006, functional by 2010, work stopped by 2014.

Very rudimentary movement and an on/off menu (or other on/off assigned hotkey) were mapped and working. Not sufficient for LIS/CLIS, but helpful for some disabilities. However, latency and error rates were very high.

We need greater signal resolution, wider frequency ranges, and more CPU power. OpenEEG and civilian system power still leave something to be desired, but are getting closer. Not sufficient for AR/VR additions, however.

Phase 1 (completed) 2019 - 2020

NeverWinter Nights (NWN) Enhanced Edition (NWN:EE).

 

Using Windows for Aurora Toolset module development.

Not using EEG or BCI equipment at this phase. Creating prototype adventure that will be used in Phase 2 onward for baseline R&D.

Development prototyping and team training/team building: use the NWN:EE Aurora Toolset to create a custom adventure. Began August 2019, completed August 2020 successfully (still adding enhancements over time). Later, see if newer-generation OpenBCI equipment can be used.

Phase 1b - (pending) - OpenBCI

Is it possible to configure OpenBCI to control this game as well as, or better than, OpenEEG did for NWN Diamond Edition in Phase 0?

Full functionality and multiplayer features are not expected to work, but the higher-quality 16-channel setup may make solo play viable.

Phase 2 (in-progress) 2020 - Current

Scope and build, from the ground up, an electronic role-playing game (ERPG) that can be controlled by accessibility equipment, especially newer-generation EEG/BCI equipment such as the OpenBCI hardware (among others).

OpenBCI

 

Godot

 

Text-based UI (TUI)

A brain-computer interface controlled, text-only, turn-based, menu-driven game.

BCI RPG Game player, Player Text User Interface

 

 

Game is:

  1. online
  2. multi-player
  3. team-focused
  4. turn-based
  5. initially text-only
  6. menu-driven
  7. non-chat (use other chat solutions as needed) (not a MUD/MUSH/MOO) - using Matrix-synapse as underlying network communication infrastructure, especially for chat
  8. electronic role-playing game that can be played with many different adaptive devices but MUST be fully playable (without chat) using only the human brain of the player(s) (BCI). Ultimately must be playable by LIS/CLIS population.
  9. Opensource
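
A menu-driven, turn-based game playable from a single reliable BCI signal often reduces to scan-and-select input: the cursor advances through the options, and one "select" signal picks the highlighted entry. The sketch below is a minimal, hypothetical illustration of that pattern; the option names and signal labels are invented, not the project's actual design.

```python
# Illustrative scan-and-select TUI menu: one BCI "select" signal is enough
# to play, because the cursor advances on a timer or "advance" signal.
def render_menu(options, cursor):
    """Render a text menu with the currently scanned entry highlighted."""
    return "\n".join(
        (f"> {opt}" if i == cursor else f"  {opt}")
        for i, opt in enumerate(options)
    )

def step(options, cursor, signal):
    """Advance the cursor on 'advance'; return the chosen option on 'select'."""
    if signal == "select":
        return cursor, options[cursor]
    return (cursor + 1) % len(options), None

options = ["Move", "Talk", "Inventory", "End Turn"]
cursor, choice = step(options, 0, "advance")       # cursor moves to "Talk"
cursor, choice = step(options, cursor, "select")   # "Talk" is chosen
```

Because every action is reachable by repeating a single on/off signal, this style of interface is compatible with the LIS/CLIS requirement in item 8 above.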

Turn-selection process in noncombat situations for this online, multiplayer, turn-based, live real-time, brain-computer-interface-controlled role-playing game.
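
One simple way such a noncombat turn-selection process could work is a rotating round-robin queue, where each player acts in order and the queue rotates after each action resolves. This sketch is illustrative only, not the project's actual design; the player names are placeholders from The Tempest.

```python
from collections import deque

# Illustrative round-robin turn queue for noncombat play.
class TurnQueue:
    def __init__(self, players):
        self._queue = deque(players)

    def current(self):
        """Return the player whose turn it is."""
        return self._queue[0]

    def end_turn(self):
        """Rotate the queue so the next player becomes current."""
        self._queue.rotate(-1)
        return self.current()

q = TurnQueue(["Ariel", "Miranda", "Caliban"])
```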

Began scoping August 2020, currently actively in progress.

Meeting weekly (broadcast live).

Estimated completion of this phase is around August 2021 (potentially sooner if the team maintains its current development momentum).

  1. Supports Game Master (GM) tools to participate in the game in DM roles with the players during the live game.
  2. Ability to pause game, save game, restore game.
  3. Supports multiple genres (player selected).
  4. Multiple underlying RPG systems (player selected).
  5. Phase 2b Toolset provided for potential game master to create new adventures (does not need to meet BCI requirements for toolset).
  6. Phase 2c - Replicate the NWN Tempest adventure using our Toolset to play in the BCI RPG (text-based play)
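
Feature 2 above (pause, save, and restore) could be as simple as serializing the game state to JSON snapshots and loading them back. The field names in this sketch are assumptions for illustration, not the project's actual schema.

```python
import json

# Hedged sketch of save/restore as plain JSON snapshots.
def save_game(state: dict) -> str:
    """Serialize the current game state to a JSON snapshot."""
    return json.dumps(state, sort_keys=True)

def restore_game(blob: str) -> dict:
    """Rebuild the game state from a saved snapshot."""
    return json.loads(blob)

state = {"turn": 7, "players": ["Ariel", "Miranda"], "scene": "shipwreck"}
snapshot = save_game(state)
```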

More scope details and full documents are in the Brain-Computer Interface Role-Playing Game (BCI RPG) GitHub repository.

PC/NPC interaction dialog menus for BCI RPG: text-based wireframe and example flow
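
Such branching dialog menus can be modeled as a small node graph, where each node holds the NPC's line plus numbered player replies leading to the next node. The node IDs and lines below are invented examples for illustration, not the project's actual dialog content or format.

```python
# Illustrative branching-dialog graph for menu-driven PC/NPC interaction.
# A reply's target of None means the conversation ends.
DIALOG = {
    "greet": {
        "npc": "Prospero: What brings you to my isle?",
        "replies": [("The storm wrecked our ship.", "storm"),
                    ("Farewell.", None)],
    },
    "storm": {
        "npc": "Prospero: That tempest was no accident.",
        "replies": [("Farewell.", None)],
    },
}

def choose(node_id, reply_index):
    """Follow the numbered menu choice; return the next node ID, or None."""
    _, next_id = DIALOG[node_id]["replies"][reply_index]
    return next_id
```

Numbered replies fit naturally with the scan-and-select, menu-driven play described earlier, since each choice is just one more menu.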

 

d20 RPG extended conflict overview flowchart. Includes Dungeons & Dragons 5th edition (D&D 5e) and open d20 variants.
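
The basic resolution step underlying such a d20 flowchart is: roll 1d20, add a modifier, and compare against a difficulty class (DC). A minimal sketch, with example values not tied to any specific ruleset:

```python
import random

def roll_d20(rng=random) -> int:
    """Roll one twenty-sided die."""
    return rng.randint(1, 20)

def d20_check(roll: int, modifier: int, dc: int) -> bool:
    """Resolve one d20 test: natural roll plus modifier meets or beats the DC."""
    return roll + modifier >= dc

# Example: a +3 modifier against DC 13 succeeds on a natural 10 or better.
success = d20_check(roll_d20(), 3, 13)
```

The extended-conflict flowchart chains checks like this across rounds, with the branching (success, failure, degrees of either) driving which menu the player sees next.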

 

General UI for BCI RPG

 

Graphical User Interface (GUI) for server management of BCI RPG. The use case assumes players are in LIS/CLIS, but that the DM/GM, therapist, or server admin is able to use a full GUI for server administration.

Phase 2b - Develop Module Creation Tools with UI

The module creation toolset is not designed for BCI or LIS/CLIS users. The assumption is that Game Master / Dungeon Master, Educator, or Therapist will use this toolset to create new custom adventures (like the Aurora Toolset is used to create adventures for NWN).

This requires full graphical tools at this point. (We would love to create a BCI-controlled toolset down the road, but that is much more advanced than our current team is ready for; hopefully we can circle back to it.)
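
For illustration, an adventure module produced by such a toolset might serialize to a simple structured format like the sketch below. Every field name here is an assumption for the sake of example, not the toolset's real schema.

```python
# Hypothetical sketch of what a "ModGod" adventure module might look like
# as structured data: areas, NPCs, and a branching-story entry point.
module = {
    "title": "The Tempest",
    "entry_scene": "shipwreck",
    "areas": [{"id": "shipwreck", "description": "A storm-battered shore."}],
    "npcs": [{"id": "prospero", "location": "shipwreck"}],
}

def validate_module(mod: dict) -> bool:
    """Check that the module's entry scene refers to a defined area."""
    area_ids = {a["id"] for a in mod["areas"]}
    return mod["entry_scene"] in area_ids
```

A validation pass like this is the kind of check the toolset could run before a GM publishes a module to the game server.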

 

 

 

Phase 2c - Create Tempest Adventure Using New Custom-built Toolset

In Phase 2, all the basic play mechanics and the module creation tools are built.

We already created the storyline and branching narrative in Phase 1 with Twine and implemented it in NWN:EE with the Aurora Toolset.

Now we need to take the same story and recreate it using our "ModGod" toolset.

Phase 2d: Begin Linking Artificial Intelligence Tools with BCI-RPG (ai-rpg and/or ai-gm)

Once the more manual NPC, weather, etc. reactions are working smoothly, begin hooking in the AI tools from the ai-rpg and/or ai-gm projects.

Phase 3 GUI (pending)

Add a graphical interface (instead of just the text interface) with audio and pre-recorded video events tied into the text-based events (a la Dragon's Lair). Later, add more dynamic graphics/audio/video for a smoother experience. Planned to begin early hooks around late 2021.

Further enhance the AI integration.

Phase 4 AI (pending)

To further improve accessibility and overall playability, incrementally incorporate technologies related to natural language, Machine Learning (ML), Neural Networks (NN), Artificial Intelligence (AI), and other related fields in different areas of the platform, as appropriate and as resources allow.

Phase 5 xR (pending)

Add xR/VR integration, motion integration, while still fully supporting BCI play. Include accessibility settings for VR for people with very limited or no physical mobility of their own (various drivers from third-parties hopefully to integrate, do not want to have to develop those drivers ourselves).

Phase 6+ (pending)

Bring accessibility BCI RPG back to the tabletop.

  1. Integrate robotics equipment controlled through the BCI controls of game play.
  2. Enable rolling physical dice.
  3. Enable moving physical miniatures.
  4. Integrate "holographic" experiences with robotics, AR, etc.
  5. Add other features through BCI control.

All of this happens in a TRPG environment rather than the online ERPG, but uses the tools and code created for this now more well-rounded, continually evolving ERPG to enable TRPG play.

Your donations mean that more people can have access to our free programs.

Please donate today to help us with these efforts.

Or Volunteer Today!

 
