Roadmap

This section provides a high-level overview of the project roadmap and phases. The most up-to-date version can be found in the project's BCI RPG GitHub repository.

BCI RPG Project High-Level Road Map and Phases

Phase 0 - EEG I/O controllers in NWN on Linux

Learn the basics of EEG, bio- and neurofeedback, and computer interaction, then attempt to replace the regular keyboard or other I/O controls for a PC-based video game (preferably on Linux, due to its greater flexibility of I/O controls) and play the game with only EEG and/or bio-monitoring inputs. Start with Neverwinter Nights on Linux. - Complete

Phase 1 - Game design prototype in NWN:EE on Linux with EEG/BCI for I/O

Train the new development team on the desired game-dev concepts through prototype use of the Aurora Toolset for NWN. Create a new adventure module to be used as a test bed for future prototypes: Shakespeare's The Tempest. Experiment with controls using EEG and BCI equipment with Neverwinter Nights: Enhanced Edition on Linux. - Complete

Phase 2 - Game from scratch, TUI version, BCI I/O, online turn-based multiplayer

Create from scratch a text-only multiplayer online turn-based cooperative RPG that can be played with only OpenBCI (or similar) equipment. Currently nearing the end of the design and documentation steps and starting the actual coding steps. Will use GUI tools with the Godot engine for the GM tools, but play must be all TUI and able to run with the simple inputs of OpenBCI. Use the same adventure blueprint as Phase 1. - IN PROGRESS. Begin incorporating components from the AI RPG and AI GM (ai-rpg.com) projects as needed, especially for NPCs/creatures, but also options for PCs, world sociopolitics, events, ripple effects from PC actions, etc.

Phase 3 - Add GUI and Optional [Matrix] Integration Support

Add basic real-time graphics or pre-recorded action videos triggered like Dirk the Daring from Dragon's Lair - on top of the existing TUI-based game.

Phase 4 - Add Natural Language (NL), Machine Learning (ML) / Artificial Intelligence (AI)/ Neural Networks (NN) Enhancements

To further improve accessibility and overall playability, incrementally incorporate Natural Language (NL), Machine Learning (ML), Neural Network (NN), Artificial Intelligence (AI), and other related technologies in different areas of the platform as appropriate and as resources allow.

Phase 5 - Add xR (VR/AR/MR)

Add optional xR-related modes to the available graphical interface experience options. Add AR, GPS, and global world-location play to the game. If viable, include "holographic" UI options so no glasses, etc. are required.

Phase 5b - Add OpenBCI Galea (or similar) Integration

As available (and as funding/donations permit), incorporate support for the newer OpenBCI Galea (or similar) full-immersion systems.

Phase 6+ - Add Robotics and Other Physical Local-Realm Interaction Features

Integrate gyros, GPS, and other technologies to add additional physical local-realm experiences. Migrate the BCI tools to work in the physical world through robotics and wireless triggers, so players can interact with a combination of AR and physical devices: manipulating dice-roller devices, using robotic arms to pick up and move objects, and moving miniature tokens around (enhanced by AR-overlaid animation), so that LIS/CLIS players can play at the same physical table, or LARP with other players in the same physical location, not just online.

Phase 0 (completed) 2006 - 2014 OpenEEG

Neverwinter Nights (NWN) Diamond Edition on Linux, controlled by OpenEEG equipment.

Basic control of PC movement. Routing of I/O through the OpenEEG 5-channel headset.

Began 2006, functional by 2010, stopped working on it by 2014. 

Very rudimentary movement and an on/off menu (or other assigned on/off hotkey) mapped and working. Not sufficient for LIS/CLIS, but helpful for some disabilities. Very high latency and error rate, however.
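
As an illustration of this kind of on/off mapping, the sketch below turns a noisy signal stream into discrete on/off events using a moving-average power estimate with hysteresis. It is a minimal, hypothetical example: the window size, threshold levels, and sample values are invented for illustration, not values from the Phase 0 setup.

```python
# Hypothetical sketch: map a single smoothed signal channel to on/off
# hotkey events (e.g. open/close a menu). Not project code.
from collections import deque

class ThresholdSwitch:
    """Turns a noisy scalar signal into discrete on/off events using a
    moving-average power estimate with hysteresis."""

    def __init__(self, window=8, on_level=0.6, off_level=0.4):
        self.samples = deque(maxlen=window)
        self.on_level = on_level    # power needed to trigger "on"
        self.off_level = off_level  # power to fall below for "off"
        self.active = False

    def feed(self, sample):
        """Feed one sample; return 'on', 'off', or None (no change)."""
        self.samples.append(abs(sample))
        power = sum(self.samples) / len(self.samples)
        if not self.active and power >= self.on_level:
            self.active = True
            return "on"
        if self.active and power <= self.off_level:
            self.active = False
            return "off"
        return None

switch = ThresholdSwitch(window=4)
events = [switch.feed(s) for s in [0.1, 0.2, 0.9, 0.9, 0.9, 0.1, 0.0, 0.0]]
# events -> [None, None, None, None, 'on', None, None, 'off']
```

Hysteresis (a higher "on" threshold than "off" threshold) is one standard way to reduce the rapid toggling that contributes to a high error rate.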

Need greater signal resolution, wider frequency ranges, and more CPU power. OpenEEG and civilian-grade system power still leave something to be desired, but are getting closer. Not sufficient for an AR/VR addition, however.

Phase 1 (completed) 2019 - 2020

Neverwinter Nights (NWN) Enhanced Edition (NWN:EE).

Using Windows for Aurora Toolset module development.

Not using EEG or BCI equipment at this phase. Creating the prototype adventure that will be used from Phase 2 onward for baseline R&D.

Development prototyping and team training/team building: use the NWN:EE Aurora Toolset to create a custom adventure. Began August 2019, completed successfully August 2020 (still adding enhancements over time). Later, see if newer-generation OpenBCI can be used.

Phase 1b - (pending) - OpenBCI

Is it possible to configure OpenBCI to run this game as well as, or better than, OpenEEG did for NWN Diamond Edition in Phase 0?

It is not expected that full functionality or multiplayer features will work, but the hope is that the 16-channel, higher-quality setup may make solo play viable.

Phase 2 (in progress) 2020 - Current

Scope and build, from the ground up, an electronic role-playing game (ERPG) that can be controlled by accessibility equipment, especially newer-generation EEG/BCI equipment such as the OpenBCI hardware (among others).

OpenBCI

Godot

Text-based UI (TUI)

Brain-Computer Interface controlled; text-only, turn-based, and menu-driven.

BCI RPG Game player, Player Text User Interface

Game is:

  1. online
  2. multi-player
  3. team-focused
  4. turn-based
  5. initially text-only
  6. menu-driven
  7. non-chat (not a MUD/MUSH/MOO; use other chat solutions as needed) - using Matrix Synapse as the underlying network communication infrastructure, especially for chat
  8. an electronic role-playing game that can be played with many different adaptive devices, but which MUST be fully playable (without chat) using only the player's brain (BCI). Ultimately it must be playable by the LIS/CLIS population.
  9. open source
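
Since chat is delegated to Matrix rather than built into the game, a companion chat client would send standard `m.room.message` events. The helper below is a hypothetical sketch (not project code) that builds the event content payload defined by the Matrix client-server specification; actually sending it would be handled by a Matrix client library, which is omitted here.

```python
# Build the content of a plain-text Matrix chat message. The payload
# shape ({"msgtype": "m.text", "body": ...}) comes from the Matrix
# client-server spec.

def text_message(body):
    """Return the content dict for an m.room.message event of type m.text."""
    return {"msgtype": "m.text", "body": body}

content = text_message("Prospero raises his staff.")
```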

Turn-selection process in noncombat situations for the online, multiplayer, turn-based, live, real-time, brain-computer-interface-controlled role-playing game.
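
One common accessible-input pattern for this kind of menu-driven turn selection is single-switch scanning: the UI cycles through the options and the player fires a single BCI "select" signal to choose the one currently highlighted. The sketch below is illustrative only (the option names and tick-based timing are invented, and it is not the project's actual selection mechanism).

```python
# Single-switch scanning over a menu: highlight options one per tick,
# return the option highlighted when the select signal fires.

def scan_select(options, select_ticks):
    """options: menu entries; select_ticks: one bool per tick,
    True on the tick when the player's select signal fired."""
    for tick, selected in enumerate(select_ticks):
        highlighted = options[tick % len(options)]
        if selected:
            return highlighted
    return None  # no selection during this scan pass

actions = ["Talk", "Move", "Search", "Wait"]
# The select signal fires on the third tick, while "Search" is highlighted.
choice = scan_select(actions, [False, False, True])
```

Scanning trades speed for simplicity: it needs only one reliable signal, which matters for the high-latency, high-error-rate inputs described in Phase 0.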

Began scoping August 2020, currently actively in progress.

Meeting weekly (broadcast live).

Estimate this phase's completion around August 2021 (potentially sooner if the current development momentum is maintained by the team).

  1. Supports Game Master (GM) tools so a GM can participate in the game in DM roles with the players during the live game.
  2. Ability to pause, save, and restore the game.
  3. Supports multiple genres (player-selected).
  4. Multiple underlying RPG systems (player-selected).
  5. Phase 2b - Toolset provided for potential game masters to create new adventures (the toolset does not need to meet BCI requirements).
  6. Phase 2c - Replicate the NWN Tempest adventure using our toolset to play in the BCI RPG (text-based play).

More scope details and full documents are in the Brain-Computer Interface Role-Playing Game (BCI RPG) GitHub repository.

PC / NPC interaction dialog menus for BCI RPG text-based wireframe and example flow

d20 RPG extended conflict overview flowchart. Includes Dungeons & Dragons 5th Edition (D&D 5e) and open d20 variants.
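
The core resolution step underlying such a flowchart is the familiar d20 check: roll 1d20, add a modifier, and compare the total against a difficulty class (DC). A minimal sketch (the modifier and DC values are examples only; the game supports multiple player-selected systems):

```python
import random

def d20_check(modifier, dc, rng=random):
    """Roll 1d20, add the modifier, succeed if the total meets the DC."""
    roll = rng.randint(1, 20)
    return roll + modifier >= dc

# A +3 check against DC 15 succeeds on a natural roll of 12 or higher.
success = d20_check(modifier=3, dc=15)
```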

 

General UI for BCI RPG

Graphical User Interface (GUI) for server management of the BCI RPG. The use case assumes players are in LIS/CLIS, but that the DM/GM/therapist or server admin is able to use a full GUI for server administration.

Phase 2b - Develop Module Creation Tools with UI

The module creation toolset is not designed for BCI or LIS/CLIS users. The assumption is that a Game Master/Dungeon Master, educator, or therapist will use this toolset to create new custom adventures (much as the Aurora Toolset is used to create adventures for NWN).

This requires full graphical tools at this point. (We would love to create a BCI-controlled toolset down the road, but that is much more advanced than our current team is ready for; hopefully we can circle back to it.)

Phase 2c - Create Tempest Adventure Using New Custom-built Toolset

In Phase 2, all the basic play mechanics and the module creation tools are built.

We already created the storyline and branching narrative in Phase 1 with Twine and implemented it in NWN:EE with the Aurora Toolset.

Now we need to take the same story and recreate it using our "ModGod" toolset.

Phase 2d: Begin Linking Artificial Intelligence Tools with BCI-RPG (ai-rpg and/or ai-gm)

Once the more manual NPC, weather, and similar reactions are working smoothly, begin hooking in the AI tools from the ai-rpg and/or ai-gm projects.

Phase 3 GUI (pending)

Add a graphical interface (instead of just the text interface) with audio and pre-recorded video events tied into the text-based events (a la Dragon's Lair). Later, add more dynamic graphics/audio/video for a smoother experience. Planned to begin early hooks around late 2021.
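
One simple way to wire pre-recorded clips to text events is a lookup table from event names to media files, falling back to text-only when no clip exists. A hypothetical sketch (the event names and file paths are invented for illustration):

```python
# Map game events to pre-recorded clips, Dragon's Lair style.
CLIPS = {
    "storm_begins": "clips/tempest_storm.webm",
    "ship_wrecks": "clips/shipwreck.webm",
}

def clip_for(event, clips=CLIPS):
    """Return the media file for a game event, or None to stay text-only."""
    return clips.get(event)
```

Because unmapped events simply return None, the TUI remains the authoritative play path and clips stay a purely optional layer on top.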

Further enhance the AI integration.

Phase 4 AI (pending)

To further improve accessibility and overall playability, incrementally incorporate Natural Language (NL), Machine Learning (ML), Neural Network (NN), Artificial Intelligence (AI), and other related technologies in different areas of the platform as appropriate and as resources allow.

Phase 5 xR (pending)

Add xR/VR integration and motion integration while still fully supporting BCI play. Include accessibility settings in VR for people with very limited or no physical mobility (hopefully integrating various third-party drivers; we do not want to have to develop those drivers ourselves).

Phase 6+ (pending)

Bring accessibility BCI RPG back to the tabletop.

  1. Integrate robotics equipment controlled through the BCI controls of the gameplay
  2. Enable rolling physical dice
  3. Move physical miniatures
  4. Integrate "holographic" experiences with robotics, AR, etc.
  5. Add other features through the BCI controls

All in a tabletop RPG (TRPG) environment rather than an online ERPG, using the tools and code created for this now more well-rounded, continually evolving ERPG to enable TRPG play.

Your donations mean that more people can have access to our free programs.

Please donate today to help us with these efforts.

Or Volunteer Today!