Sam Goldheart

[* black] Next we pry out this hunk of ... stuff, which turns out to be an earpiece speaker, microphone, ambient light sensor—''and'' a Soli chip, for interpreting your gestures using the power of [https://youtu.be/rGvblGCD7qM?t=49|radar|new_window=true].
[* black] Google calls its implementation ''Motion Sense''. Although radar has been in use for a long time, and seems simple enough in principle, we're at a loss as to how Google stuffed the entire system into a tiny featureless rectangle with no moving parts.
[* icon_note] In principle, the system emits precisely tuned waves of electromagnetic energy (in Soli's case, in the 60 GHz band). When those waves bounce off something (like your hand), some of them reflect back to the antenna.
[* black] The Soli chip then studies the reflected waves and analyzes their time delay, frequency shift, and other data to learn the ''characteristics'' of the object that reflected them—how big it is, how fast it’s moving, in which direction, and so on. (There's a quick sketch of this math after the list.)
[* black] Soli then runs that data against its database of known gestures to determine what action, if any, needs to be performed in the OS. (A toy version of that matching step follows below.)
[* black] In short: magic rectangle knows your every move.
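To put rough numbers on the note above: range falls out of the round-trip time delay, and radial velocity falls out of the Doppler shift. The sketch below is back-of-the-envelope Python, not Google's signal chain; the ~60 GHz carrier is Soli's published operating band, but the function names and example values are ours.

```python
# Back-of-the-envelope radar math: recover range and radial velocity
# from a reflection's round-trip time delay and Doppler shift.
# Illustrative only -- Soli's real pipeline (chirped waveforms,
# on-chip DSP) is far more involved.

C = 299_792_458.0   # speed of light, m/s
CARRIER_HZ = 60e9   # Soli operates in the ~60 GHz band

def range_from_delay(round_trip_s: float) -> float:
    """Distance to the reflector; the wave travels out *and* back,
    hence the divide-by-two."""
    return C * round_trip_s / 2

def velocity_from_doppler(doppler_hz: float) -> float:
    """Radial velocity from the Doppler shift; positive means the
    target is closing on the antenna."""
    return doppler_hz * C / (2 * CARRIER_HZ)

# A hand 30 cm from the antenna reflects after about 2 nanoseconds:
delay = 2 * 0.30 / C
print(f"range: {range_from_delay(delay):.2f} m")                  # -> 0.30 m

# A hand moving toward the phone at gesture speed shifts a
# 60 GHz wave by about 200 Hz:
print(f"velocity: {velocity_from_doppler(200):.2f} m/s")          # -> 0.50 m/s
```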
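As for the matching step: Google hasn't published how Soli's classifier works (in practice it's a trained machine-learning model, not a lookup table). Purely to illustrate the idea of checking measured characteristics against a gesture database, here's a toy nearest-neighbor matcher; every gesture name, feature vector, and threshold in it is invented.

```python
import math

# Toy stand-in for "run the data against a known gesture database".
# Each gesture is summarized by a hand-picked feature vector:
# (peak speed in m/s, dominant direction in radians, extent in meters).
# All names and numbers below are made up for illustration.
GESTURE_DB = {
    "swipe_left":  (0.8, math.pi, 0.15),
    "swipe_right": (0.8, 0.0, 0.15),
    "tap":         (0.3, -math.pi / 2, 0.03),
}

def classify(features, threshold=0.5):
    """Return the nearest known gesture, or None if nothing is close
    enough -- so random motion doesn't trigger an action."""
    name, ref = min(GESTURE_DB.items(),
                    key=lambda kv: math.dist(features, kv[1]))
    return name if math.dist(features, ref) < threshold else None

print(classify((0.75, 0.1, 0.12)))  # -> "swipe_right"
print(classify((5.0, 2.0, 1.0)))    # -> None (no gesture fired)
```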