Mobile phones and 3D vision enter a honeymoon period
At present, smartphones have become the largest platform for advanced technologies such as AR/VR and computer vision, and face recognition and AR features are hot spots in current smartphone development. In fact, both AR/VR and recognition technologies ultimately depend on computer vision. Computer vision is essentially the simulation of biological vision using computer technology, and depth recognition and multi-dimensional imaging are its core techniques.
Depth recognition is a key prerequisite for computer vision, allowing machines to perceive depth the way biological vision does; Apple's popular face recognition technology is one example. Multi-dimensional imaging covers today's 3D display achievements, that is, the reproduction of 3D pictures and videos. Beyond restoring what we can see with the naked eye, as these technologies continue to converge, depth recognition may also render in three dimensions things the naked eye cannot see. For example, a future smartphone could combine depth recognition with artificial-intelligence analysis to gauge the intensity of ultraviolet light outdoors and remind us to apply sunscreen.
Eye-tracking technology in AR/VR
With the progress of technology, the human eye can now be used for iris recognition. Iris recognition is more reliable and secure than face or fingerprint recognition, and many phone manufacturers have begun developing iris-recognition features.
In addition to iris recognition, there is eye-tracking technology: a technology that tracks eye movements and uses them to enhance the experience of a product or service.
Eye tracking had a brief moment of popularity in the smartphone world, dating back to the Galaxy S4 of 2013, the first phone to ship with it, where it was mainly used for video playback. For example, if you are watching a video and a classmate behind you taps you on the shoulder, the video automatically pauses when you turn your head, because your eyes are no longer on the screen, and automatically resumes when you turn back; you never have to press pause or play by hand. Or, while reading a web page on your phone, the page turns automatically when your eyes reach the bottom of the screen. That same year, LG also launched the LG Optimus G Pro, a phone with eye-tracking capabilities.
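The pause-on-look-away behavior described above can be sketched as a small state machine. This is a hypothetical illustration, not Samsung's actual implementation: the class and method names are invented, and real gaze estimation from the front camera is stubbed out as a simple boolean signal.

```python
class GazeAwarePlayer:
    """Hypothetical player that pauses when the viewer looks away.

    Real systems derive the eyes_on_screen signal from front-camera
    gaze estimation; here it is supplied directly for illustration.
    """

    def __init__(self):
        self.playing = False
        self.paused_by_gaze = False  # auto-resume only if *we* paused

    def play(self):
        self.playing = True
        self.paused_by_gaze = False

    def on_gaze_update(self, eyes_on_screen: bool):
        if self.playing and not eyes_on_screen:
            # Viewer looked away mid-playback: pause automatically.
            self.playing = False
            self.paused_by_gaze = True
        elif self.paused_by_gaze and eyes_on_screen:
            # Viewer looked back: resume, but only if the pause
            # was gaze-triggered (a manual pause would stay paused).
            self.playing = True
            self.paused_by_gaze = False


player = GazeAwarePlayer()
player.play()
player.on_gaze_update(eyes_on_screen=False)  # turn your head -> pauses
print(player.playing)  # False
player.on_gaze_update(eyes_on_screen=True)   # turn back -> resumes
print(player.playing)  # True
```

The `paused_by_gaze` flag matters: without it, the player would also override a deliberate manual pause the moment the viewer glanced back at the screen.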
Unfortunately, eye tracking failed to take off in the mobile world, for two reasons. First, the average smartphone screen is only about 5 inches; on such a small surface, people prefer to interact directly with their fingers, and most phone functions are finger-driven anyway, so gaze-based play/pause never found a real niche. Second, the technology was immature at the time: camera resolution was low and recognition was not accurate enough, which left users with tired eyes.