Dev: Brainstorming
markusTraber edited this page Sep 2, 2020 · 6 revisions
- Creating new instruments.
- Enhancing current instruments.
- Experimenting with XR.
- Integration of other sensors?
- How could the IVA project become more convenient for new coders?
- How to enhance the IVA-provided functions/tools?
- How to enable a mapping between different visualizations and synths?
- Separate synth from visualization
  - Create a "mapping" interface, where each visualization can define multiple mappings to different synths.
- How to make it easy to create an IVA instrument? E.g. for workshops?
  - Create a template file for others to copy.
  - Create an instrument creation app that generates the base files from a template.
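The mapping idea above could be sketched as a small routing layer that fully decouples visualizations from synths. This is only an illustration under assumed names (`MappingTable`, the event shape `{ zone, intensity }`) — none of this is existing IVA API; in the real app the targets would be Tone.js synth calls instead of plain callbacks.

```javascript
// Hypothetical mapping layer: a visualization emits abstract events,
// and the table routes each event to every synth whose filter matches.
// One visualization can therefore drive multiple synths at once.
class MappingTable {
  constructor() {
    this.mappings = []; // each entry: { filter, target }
  }

  // Register a synth callback for events matching a filter predicate.
  add(filter, target) {
    this.mappings.push({ filter, target });
  }

  // Route one visualization event to all matching synths;
  // returns how many mappings fired.
  dispatch(event) {
    const hits = this.mappings.filter(m => m.filter(event));
    hits.forEach(m => m.target(event));
    return hits.length;
  }
}

// Usage sketch: zone 0 drives a "bass" synth; any high-intensity
// event additionally drives a "lead" synth.
const table = new MappingTable();
const played = [];
table.add(e => e.zone === 0, () => played.push('bass'));
table.add(e => e.intensity > 0.5, () => played.push('lead'));
table.dispatch({ zone: 0, intensity: 0.9 }); // matches both mappings
```

Because the visualization only knows the event shape, swapping a synth means editing the table, not the visualization code.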
- Franzi found this example: https://experiments.withgoogle.com/scan-sequencer
- Web app with P5.js, Tone.js
  - Uses the webcam to trigger a signal when color changes occur in part of the webcam image
  - Vertically divided into zones, much like my IVA instrument, but with many more tones
- Multiple tones can be played simultaneously
  - Tone selection is nice; the resulting audio output sounds good
- Code is structured and relatively easy to read
- Execution as web app is really convenient
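The zone-trigger mechanism of the scan-sequencer could be approximated like this: split each frame into vertical zones and fire a zone when its average brightness changes between frames. A minimal sketch, assuming grayscale pixels as a flat array — in the actual web app the pixels would come from the p5.js video capture, and a trigger would start a Tone.js note.

```javascript
// Average brightness per vertical zone of a grayscale frame.
function zoneAverages(pixels, width, height, zones) {
  const zoneWidth = Math.floor(width / zones);
  const sums = new Array(zones).fill(0);
  const counts = new Array(zones).fill(0);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      // Clamp so leftover columns fall into the last zone.
      const z = Math.min(Math.floor(x / zoneWidth), zones - 1);
      sums[z] += pixels[y * width + x];
      counts[z]++;
    }
  }
  return sums.map((s, i) => s / counts[i]);
}

// Indices of zones whose brightness changed enough between two
// frames to trigger a tone.
function triggeredZones(prevFrame, currFrame, width, height, zones, threshold) {
  const prev = zoneAverages(prevFrame, width, height, zones);
  const curr = zoneAverages(currFrame, width, height, zones);
  return curr
    .map((v, i) => (Math.abs(v - prev[i]) > threshold ? i : -1))
    .filter(i => i >= 0);
}
```

Running per frame, this gives exactly the "color change in part of the webcam image" behavior, with one tone per zone and several zones able to trigger simultaneously.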
- offer to play tones only when wanted
  - this could easily be realized by pressing a key on the keyboard
- use gestures / object recognition
- use flow detection (like Simon's app)
- color detection
- make it possible to layer multiple recording "lanes" over one another
- would need multiple oscillators
- webcam recognizes objects and flow
  - certain objects get specific synths assigned and will be played back according to flow
- https://github.com/TetsuakiBaba/ofxOpenCvDnnObjectDetection
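The object-recognition idea could be sketched as follows: a detector (e.g. the ofxOpenCvDnnObjectDetection addon linked above) would report labelled objects per frame, each label is assigned a synth, and the optical-flow magnitude at the object's position decides whether and how loudly that synth plays. The `synthFor` table, the detection shape `{ label, flow }`, and the flow-to-velocity rule are all assumptions for illustration, not existing IVA code.

```javascript
// Hypothetical label-to-synth assignment (Tone.js instrument names
// are placeholders).
const synthFor = { cup: 'membrane', hand: 'pluck', book: 'fm' };

// detections: [{ label, flow }] where flow is the mean optical-flow
// magnitude inside the object's bounding box, normalized to 0..1.
// Objects with no assigned synth, or too little motion, stay silent.
function playbackEvents(detections, minFlow = 0.1) {
  return detections
    .filter(d => d.label in synthFor && d.flow >= minFlow)
    .map(d => ({ synth: synthFor[d.label], velocity: Math.min(d.flow, 1) }));
}
```

Gating on a minimum flow keeps stationary objects quiet, which also covers the "play tones only when wanted" point above without needing a keyboard.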