The idea is that a near-sighted person can take off their glasses and the display will somehow produce an image that is in focus anyway, despite the display being too far away for them to focus on. The whole patent is about how you detect what glasses the user is wearing, which is sort of trivial if you have a phone with a Face ID sensor.
The hard part is making a display that appears to be somewhere other than its physical position. You could achieve that effect by placing a large (Fresnel) lens in front of the screen, but there needs to be some distance (say, at least roughly the diagonal of the screen) between the screen and the lens. Also, it would only work when you hold the screen-lens combination at a particular angle.
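To see why the geometry is awkward, here's a rough thin-lens sketch: given a screen some distance behind a Fresnel lens and a viewer whose far point (the farthest distance they see sharply without glasses) is known, you can solve for the focal length that parks the virtual image at that far point. All the numbers below are illustrative assumptions, not anything from the patent.

```python
# Thin-lens sketch (real-is-positive sign convention: 1/f = 1/d_o + 1/d_i,
# where a virtual image means d_i < 0). Illustrative numbers only.

def focal_length(d_object, d_image):
    """Solve 1/f = 1/d_o + 1/d_i for f (all distances in meters)."""
    return 1.0 / (1.0 / d_object + 1.0 / d_image)

screen_to_lens = 0.30   # screen sits 30 cm behind the Fresnel lens
eye_to_lens    = 0.40   # viewer's eye is 40 cm in front of the lens
far_point      = 0.50   # farthest distance the viewer sees sharply

# The virtual image must sit (far_point - eye_to_lens) behind the lens,
# i.e. d_i = -(0.50 - 0.40) = -0.10 m (negative: same side as the screen).
d_image = -(far_point - eye_to_lens)

f = focal_length(screen_to_lens, d_image)
print(f"required focal length: {f:.3f} m")
```

With these numbers you get roughly f = -0.15 m, i.e. a strong diverging lens, and the image also shrinks by a factor of about |d_i/d_o| = 1/3, which hints at why nobody ships this.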
The least impractical way to do this is if the device has a telescopic arm holding your prescription glasses about 2 cm in front of your eyes.
Without a lens in front of the screen, you'd need each pixel of the display to emit different intensities in different directions at fairly high angular resolution (essentially a light-field display). No current technology does that well enough.
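To put a rough number on "fairly high angular resolution": for refocusing to work, each pixel has to land different intensities on different sub-regions of the viewer's pupil, so the angular spacing between emitted beams must be a fraction of pupil-diameter divided by viewing distance. A back-of-envelope sketch, with all figures assumed for illustration:

```python
import math

# Back-of-envelope: angular resolution a lensless light-field display would
# need so one pixel can address different parts of the viewer's pupil.
# All numbers are illustrative assumptions.

pupil_diameter     = 0.004  # m, typical indoor pupil (~4 mm)
viewing_dist       = 0.40   # m, screen held at arm's length
views_across_pupil = 4      # distinct beams we want to land on the pupil

# Angle the pupil subtends as seen from a pixel (small-angle approximation):
pupil_angle  = pupil_diameter / viewing_dist      # radians
beam_spacing = pupil_angle / views_across_pupil   # angular pitch per beam

print(f"pupil subtends ~{math.degrees(pupil_angle):.2f} deg")
print(f"needed beam spacing ~{beam_spacing * 1e3:.1f} mrad "
      f"per pixel, across the whole panel")
```

That works out to about 2.5 mrad (roughly 0.14 degrees) of angular control per pixel, simultaneously for every pixel on the panel, which is well beyond what shipping display hardware can do.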