Wasn’t supported on Quest. Watched on YouTube, ok as an interesting experiment.
More of a fan of this tbh:
Where from here?
BJG145 wrote: Did anyone see it...?
I watched the end of it in 2D without a VR headset. I thought it worked and it had the feel of an event. I liked the visuals, even in 2D. A lot of flying through tunnels reminiscent of the 'Stargate' sequence in 2001. There weren't many avatars, but the ones that were there were pretty nutty, and that was good, because it gave it an eccentric sci-fi feel.
So I'd say this was a 21st century concert perhaps brought into being early because of the Covid situation.
So a VR event can work in conventional 2D media too, which is relevant to the OP's original question. Even if the interface to the OP's device could use VR or AR it would be best if it first worked in 2D or 2D with a z-axis, like a video game.
merlyn wrote: So a VR event can work in conventional 2D media too, which is relevant to the OP's original question. Even if the interface to the OP's device could use VR or AR it would be best if it first worked in 2D or 2D with a z-axis, like a video game.

This is how we're currently thinking of describing the interface: as a map. GeoJSON is the format Google Maps uses; we need that plus a Z coordinate for "folding" things into manageable lumps.
A better description of "folding": take a photograph of a Reactable "setup", put that photograph on a new puck which can then be used on a new Reactable, and keep going up a level at a time until you reach the final design. At any stage you can bring back the "photograph" and edit it.
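To make the idea concrete, here's a minimal sketch of what that might look like as data. It assumes RFC 7946 GeoJSON, which allows an optional third (z) element in each coordinate position; the `make_module`, `fold`, and `snapshot` names are hypothetical, just for illustration, not any real Reactable format.

```python
import json

def make_module(name, x, y, z=0.0):
    """One module ("puck") as a GeoJSON Feature. The third coordinate
    element is the Z level the puck lives on."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [x, y, z]},
        "properties": {"name": name},
    }

def fold(name, features, z):
    """'Fold' a group of modules into a single new puck one level up.
    The original setup is kept as a snapshot in the puck's properties,
    so it can be brought back and edited at any time."""
    xs = [f["geometry"]["coordinates"][0] for f in features]
    ys = [f["geometry"]["coordinates"][1] for f in features]
    # Place the new puck at the centre of the modules it replaces.
    puck = make_module(name, sum(xs) / len(xs), sum(ys) / len(ys), z)
    puck["properties"]["snapshot"] = {
        "type": "FeatureCollection",
        "features": features,
    }
    return puck

# Usage: two level-0 modules folded into one level-1 "voice" puck.
osc = make_module("oscillator", 0.0, 0.0)
filt = make_module("filter", 1.0, 0.0)
voice = fold("voice", [osc, filt], z=1.0)
print(json.dumps(voice, indent=2))
```

Because the result is plain GeoJSON, any existing map tooling could display a single level, while a dedicated visualiser could walk the nested snapshots to unfold the whole design.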
On the other hand, you could also view it with a VR headset once a "visualiser" has been written for the data.
If the data format is open source then lots of other people can get involved doing whatever they want to with it.