I know the technology exists, having read a few accounts of sleep studies in which a subject's eye directions during REM sleep were matched against descriptions of their dreams -- or, in the case of one sleep-disorder study, paired with the actual physical direction they moved their head during sleep.
If I can find the means to capture the signal -- the direction of the eyes moving under the eyelids -- then this action during sleep could be paired with directional "button presses," as if on a gamepad. This gamepad would control a first-person view of an isometric, pixel-based dungeon in the style of older games in the genre.
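To make the pairing concrete, here's a minimal sketch of turning a smoothed EOG reading into a directional "button press." Everything here is an assumption for illustration: two normalized channels (horizontal and vertical, roughly -1 to 1) and an arbitrary 0.6 threshold that real per-sleeper calibration would replace.

```python
# Sketch: map a pair of smoothed, normalized EOG channel values to a
# gamepad-style directional "press". The channel scaling and the 0.6
# threshold are placeholder assumptions, not calibrated values.

PRESS_THRESHOLD = 0.6  # fraction of full-scale deflection (assumed)

def eog_to_press(horizontal, vertical):
    """Return 'left'/'right'/'up'/'down' when one axis clearly
    dominates past the threshold, else None (no press)."""
    if abs(horizontal) < PRESS_THRESHOLD and abs(vertical) < PRESS_THRESHOLD:
        return None
    if abs(horizontal) >= abs(vertical):
        return "right" if horizontal > 0 else "left"
    return "up" if vertical > 0 else "down"
```

In other words, the "gamepad" is nothing more than a threshold test on two channels -- which is part of why the software demands here are modest.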
The interface then becomes a visual extension of the sleeper's ocular exploration "behind closed eyes." The graphic assets in this interface are fairly underwhelming since they largely repeat themselves. You could go as far down that rabbit hole as you pleased, adding elements like visual "encounters" triggered by different types of registered electrical activity in the body, but the focus here is purely on directional eye movement during sleep.
As an additional feature, I'd like to incorporate a mini-map that reveals and records the "area" the sleeper is exploring. I was thinking that a moderate electrical pulse in a direction would procedurally generate a "possible pathway" in that direction, while the "button press" would only happen at a more extreme signal in that direction. At the end you should have a visual map of where the sleeper "went" and the "paths considered."
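The two-threshold idea above can be sketched in a few lines: a moderate deflection only *reveals* a possible pathway on the map, while a stronger one counts as the press that actually moves the sleeper. Both thresholds and the grid layout are placeholder assumptions.

```python
# Sketch of the two-threshold minimap: weak signal = reveal a "path
# considered", strong signal = actually move. Threshold values are
# arbitrary placeholders pending real calibration.

REVEAL_THRESHOLD = 0.3
MOVE_THRESHOLD = 0.7

STEPS = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}

class MiniMap:
    def __init__(self):
        self.position = (0, 0)
        self.visited = {self.position}   # where the sleeper "went"
        self.considered = set()          # "paths considered"

    def signal(self, direction, strength):
        dx, dy = STEPS[direction]
        x, y = self.position
        neighbor = (x + dx, y + dy)
        if strength >= MOVE_THRESHOLD:
            self.position = neighbor
            self.visited.add(neighbor)
        elif strength >= REVEAL_THRESHOLD:
            self.considered.add(neighbor)
```

After a night's run, `visited` and `considered` are exactly the two layers the final map would draw.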
The alternative to receiving signals directly from a sleeper would be to convert polysomnograph data into values performing the same function -- pre-generated, rather than the "live performance" that remains the ultimate goal.
Fortunately, I have a coding professional in my friend circle who already produces very similar work for a neurofeedback software company that makes games "played" by electrical signals in the brain. My needs would be a more concise, less demanding piece of software.
Also in my favor is the fact that this is not a piece of medical equipment, so the EOG reading only needs to correspond to the gameworld in a "sensible" way.
The decision to design the "game" using pixel graphics is mostly one of convenience, but also of relevance to my background and skillset. A more modern design would simply be more demanding and distract from the real purpose of the piece. Also, the Arduino is already an 8-bit device. Whether or not this means it could run something like Nintendo Entertainment System code (6502 assembly) remains to be seen.
Challenges:
1. Securing the technology.
2. Finding funding, possibly clinical backing. The more excitement generated over the project, the better it can be. Funds can be used to pay the software engineer, and for the equipment needed to detect directional movement in EOG.
3. Designing a "set" for the "performance" (if it is to be live; if the software is instead fed a "script," this can exist purely as a self-contained object).
4. Running an 8-bit visual with an Arduino may be difficult (or impossible). The Arduino may, instead, have to serve as the controller for an externally hosted emulator (or poo on all that and use a Raspberry Pi).
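If the Arduino ends up as just the controller, the split is simple: the board streams one byte per detected eye movement over serial, and the host running the visual translates bytes into directions. Here's a sketch of the host side; the single-letter protocol ('L', 'R', 'U', 'D') is entirely an assumption.

```python
# Sketch of the host side of an Arduino-as-controller setup: decode a
# raw serial buffer into direction names. The one-letter-per-movement
# protocol is an assumed convention, not an existing standard.

DIRECTIONS = {ord("L"): "left", ord("R"): "right",
              ord("U"): "up", ord("D"): "down"}

def parse_stream(raw):
    """Turn raw serial bytes into a list of direction names,
    silently skipping anything outside the protocol."""
    return [DIRECTIONS[b] for b in raw if b in DIRECTIONS]
```

This keeps the Arduino's job trivially small, which sidesteps the whole question of what it can render.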
Benefits:
1. A visual study of rapid eye movement. New patterns may emerge by recoding a polysomnograph recording -- unwieldy even to the trained eye, and meaningless to the layman -- as a visualization.
2. A foot forward in the understanding of the "performance of sleep." By giving the electrical signals of the sleeping brain a performing body to act them out, we can come to understand and appreciate this dimension of all life.
3. An entry to "video game art" transcending the purpose and manner of video games. (I strongly believe the conventions of video games are very powerful tools in digital media, and there needs to be a redefinition of this toolset in artmaking practices. The words "video game" conjure up kind of a dirty feeling and also aren't entirely accurate to these applications. Maybe this has already happened and I missed the memo.)
Steps:
Hardware
EOG sensors (directional, eyelids closed)
Arduino or RaspberryPi
sleeping mask...or some such nonsense
output monitor
"bed" (maybe a blackened, comfortable space)
"performer"
Software
6502 Assembly (or substitute, just needs to run an 8-bit visual)
convert EOG to directional "button presses"
Game Design Doc
Assets:
Every "screen" can contain the HUD, Minimap, floor, wall, and ceiling tiles.
We only need wall variations:
Wall (Left, Right, Face, can be identical)
One-Way Corner (mirror)
Two-Way Corner
Three-Way Corner
HUD (or not)
Minimap:
tiles indicating path shape
left, right corridors (horizontal)
forward, backward corridors (vertical)
"L" shape (rotatable)
"+" shape
Position Marker
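The minimap tile list above reduces to a lookup on which directions are open at a cell. A sketch, with placeholder tile names standing in for whatever the pixel assets end up being:

```python
# Sketch: pick the minimap tile from the set of open directions at a
# cell, matching the asset list above (corridors, rotatable "L", "+").
# Tile names are placeholders for the eventual pixel assets.

def pick_tile(open_dirs):
    if open_dirs == frozenset({"left", "right"}):
        return "corridor_horizontal"
    if open_dirs == frozenset({"up", "down"}):
        return "corridor_vertical"
    if open_dirs == frozenset({"left", "right", "up", "down"}):
        return "plus"
    if len(open_dirs) == 2:
        return "L"        # one rotatable asset covers all four corners
    return "unknown"      # e.g. three-way junctions would need a "T" tile
```

Note the fallback: the wall assets list three-way corners, so the minimap set would eventually want a matching "T" tile too.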
To be continued... (but feel free to comment anytime...this is brainstorming, yo)
