[* icon_note] Before we get started, let's get oriented with a high-level tour:
[* black] Content creation begins in the Lightpack. It provides power and handles the processing, sending image and sound data to the headset.
[* black] Images are projected like so: two sets of RGB LEDs (one for each focus plane) shine light, which reflects off of a [https://electronics.howstuffworks.com/lcos3.htm|liquid crystal on silicon (LCOS) display|new_window=true] and into a [https://uploadvr.com/waveguides-smartglasses/|waveguide|new_window=true], presenting a 3D image to the user's eyes.
[* icon_note] A waveguide (what Magic Leap calls a "photonic lightfield chip") allows light to [https://en.wikipedia.org/wiki/Total_internal_reflection|bounce inside|new_window=true] a thin piece of glass, without any mirrors, to form an image at a specific angle. (A quick worked example follows this step.)
[* black] Meanwhile, the headset communicates with the controller and tracks your surroundings to position the virtual elements.
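[* icon_note] To put a number on that "bounce inside" note, here is a quick worked example (ours, not from the teardown), assuming a typical glass refractive index of about 1.5 and air at about 1.0:
[code]
% Snell's law at the glass-air boundary (n_1 = glass, n_2 = air)
n_1 \sin\theta_1 = n_2 \sin\theta_2
% Total internal reflection kicks in once the incidence angle exceeds the critical angle:
\theta_c = \arcsin\!\left(\frac{n_2}{n_1}\right) = \arcsin\!\left(\frac{1.0}{1.5}\right) \approx 41.8^\circ
[/code]
[* icon_note] In other words, any light ray striking the inner surface of the glass at more than roughly 42 degrees from the normal is reflected back inside with essentially no loss, which is why the waveguide needs no mirror coating.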