With the advancement of science and technology, Augmented Reality (AR) has attracted growing attention in recent years and has been applied in many different fields. Applications such as the GPS-based game Pokemon GO and IKEA's furniture-placement app have brought AR into wide use and encouraged more companies to invest in AR development. However, traditional AR simply overlays virtual objects on the real scene: users can only watch the result and cannot interact with the target. In addition, traditional AR can only display virtual objects by detecting specially made, simple markers; it does not allow an arbitrary region selected in the camera window to serve as a marker for identifying the target, which complicates the detection procedure. Mixed Reality (MR) technology, for its part, requires wearing equipment that is both inconvenient and expensive.
This study builds an interactive system that augments reality so that users can direct objects in a virtual scene with physical gestures and experience immersion in the virtual world. The system is designed and developed with the Unity3D game engine, using a seabed scene as the virtual environment. A Kinect V2 sensor captures human skeleton information, and Microsoft's SDK is then applied to the video stream for background removal and gesture recognition. A set of gestures is defined to manipulate a school of fish. Fish behavior is simulated through swarm intelligence, allowing the swimming patterns of the school to be presented in a natural way.
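The study does not detail its swarm-intelligence rules, but schooling of this kind is commonly modeled with the three classic boids rules (separation, alignment, cohesion). The sketch below is a minimal, hypothetical Python illustration of that idea only; the class name, weights, and radius are assumptions, not the authors' implementation, which runs inside Unity3D.

```python
import math
import random

class Fish:
    """One agent in the school, with a 2D position and velocity (assumed model)."""
    def __init__(self, x, y, vx, vy):
        self.x, self.y = x, y
        self.vx, self.vy = vx, vy

def step_school(school, neighbor_radius=5.0, max_speed=1.0,
                w_sep=0.05, w_ali=0.05, w_coh=0.01):
    """Advance the school one tick using the three boids rules:
    separation (avoid crowding), alignment (match neighbors' heading),
    and cohesion (steer toward the local center of mass)."""
    updates = []
    for f in school:
        neighbors = [g for g in school if g is not f and
                     math.hypot(g.x - f.x, g.y - f.y) < neighbor_radius]
        ax = ay = 0.0
        if neighbors:
            n = len(neighbors)
            cx = sum(g.x for g in neighbors) / n
            cy = sum(g.y for g in neighbors) / n
            avx = sum(g.vx for g in neighbors) / n
            avy = sum(g.vy for g in neighbors) / n
            # cohesion: accelerate toward the neighbors' center of mass
            ax += w_coh * (cx - f.x)
            ay += w_coh * (cy - f.y)
            # alignment: nudge velocity toward the neighbors' average velocity
            ax += w_ali * (avx - f.vx)
            ay += w_ali * (avy - f.vy)
            # separation: push away from each nearby neighbor
            for g in neighbors:
                ax += w_sep * (f.x - g.x)
                ay += w_sep * (f.y - g.y)
        updates.append((f.vx + ax, f.vy + ay))
    # apply all updates at once, clamping speed so motion stays smooth
    for f, (vx, vy) in zip(school, updates):
        speed = math.hypot(vx, vy)
        if speed > max_speed:
            vx, vy = vx / speed * max_speed, vy / speed * max_speed
        f.vx, f.vy = vx, vy
        f.x += f.vx
        f.y += f.vy

# spawn a small school and run it for a few ticks
random.seed(0)
school = [Fish(random.uniform(0, 10), random.uniform(0, 10),
               random.uniform(-1, 1), random.uniform(-1, 1))
          for _ in range(20)]
for _ in range(50):
    step_school(school)
```

A user gesture recognized by the Kinect pipeline could then be translated into a steering command, for example by adding a temporary attraction or repulsion point to each fish's acceleration in the same update loop.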