Is it possible to develop a "dance training"-style game with real-time motion tracking using an iOS/Android camera? I want to create a simple dance training game that does the following:
The game will have an animated dance coach for each level. The player will learn to dance by following the moves shown by the coach.
The phone will be placed on a tripod so that the built-in iPhone/Android camera can do live motion capture of the player's dance moves. The game could be displayed on a TV (via HDMI, AirPlay, etc.) for the best viewing experience. The game will show both the animated coach and the live motion-captured avatar of the player side by side on screen.
Are there any existing Godot games that do live motion capture retargeting to an avatar? I would like to take a look.
I think it might be possible to do this; it would be an interesting challenge. I don't know of anyone who's tried it with Godot, but there have been attempts at purely camera-based motion capture before (by which I mean a single visible-spectrum camera, with no special lighting and no special gear on the target).
I think this is a research project if you want to tackle it; you’ll probably find more useful info in SIGGRAPH papers than Godot stuff, though.
Are there any existing Godot games that use full-body tracking as a controller? I'm trying to decide whether I should spend time learning Unity or Godot for this not-so-simple game.
I'm not sure about Android, but Apple has published a few Swift snippets and a demo project for tracking body motion with CoreML / CreateMLComponents - for example, counting exercise repetitions.
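On the iOS side, the usual entry point for this kind of tracking is the Vision framework's human body pose detection (Apple's action-classification / rep-counting demos build on the same pose data). Here's a minimal sketch of pulling 2D joint positions out of a single camera frame, which you could then feed into Godot or Unity to drive the player's avatar; the function name and the 0.3 confidence threshold are just my own choices, not anything from Apple's sample:

```swift
import Vision
import CoreGraphics

// Detect a single person's body pose in one camera frame (CVPixelBuffer)
// and return the joints that were found with reasonable confidence.
func detectBodyPose(in pixelBuffer: CVPixelBuffer) -> [VNHumanBodyPoseObservation.JointName: CGPoint] {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up, options: [:])
    do {
        try handler.perform([request])
        guard let observation = request.results?.first as? VNHumanBodyPoseObservation else {
            return [:]
        }
        // Recognized points are normalized to 0..1 with the origin at the bottom-left.
        let points = try observation.recognizedPoints(.all)
        var joints: [VNHumanBodyPoseObservation.JointName: CGPoint] = [:]
        for (name, point) in points where point.confidence > 0.3 {
            joints[name] = point.location
        }
        return joints
    } catch {
        print("Body pose request failed: \(error)")
        return [:]
    }
}
```

You would call this per frame from your AVCaptureSession delegate and then retarget the joint positions onto the avatar skeleton (the hard part, since this only gives you 2D points per frame, not a cleaned-up 3D rig).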