
Google Pixel 4 XL Teardown

  • Next we pry out this hunk of ... stuff, which turns out to be an earpiece speaker, mic, ambient light sensor (AMS TMD3702VC), and the Soli chip, for interpreting your gestures using the power of radar.

  • Google calls this implementation of its in-house Project Soli Motion Sense.

  • Although radar technology has been in use for a long time and seems simple enough on paper, we're at a loss as to how Google stuffed the entire system into a tiny featureless rectangle with no moving parts.

  • Motion Sense works by emitting precisely tuned waves of electromagnetic energy. When those waves bounce off of something (like your hand), some of them reflect back to the antenna.

  • The Soli chip then studies the reflected waves and analyzes their time delay, frequency shift, and other data to learn the characteristics of the object that reflected them—how big it is, how fast it's moving, in which direction, etc.

  • Soli then runs that data against its known gesture database to determine what action, if any, needs to be performed in the OS.

  • TL;DR: magic rectangle knows your every move.
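The pipeline above can be sketched in a few lines: an echo's round-trip time gives the reflector's distance, its Doppler frequency shift gives the radial speed, and a sequence of such measurements is matched against known gestures. This is a toy illustration, not Soli's actual implementation — the carrier frequency is an assumed 60 GHz band value, and the gesture names and thresholds are made up.

```python
SPEED_OF_LIGHT = 3.0e8  # m/s, free-space propagation assumed

# Assumed carrier frequency for illustration; Soli's exact waveform
# parameters are not public here.
CARRIER_HZ = 60e9

def range_from_delay(round_trip_delay_s: float) -> float:
    """Distance to the reflector from the echo's round-trip time.

    The wave travels out and back, hence the divide by two.
    """
    return SPEED_OF_LIGHT * round_trip_delay_s / 2.0

def radial_velocity_from_doppler(freq_shift_hz: float) -> float:
    """Radial speed of the reflector from the Doppler frequency shift.

    Positive means moving toward the antenna (f_d = 2 * v * f_c / c).
    """
    return freq_shift_hz * SPEED_OF_LIGHT / (2.0 * CARRIER_HZ)

def classify_gesture(velocities: list[float]) -> str:
    """Toy stand-in for Soli's gesture database: map a short sequence of
    radial-velocity samples (m/s) to an action. Names are hypothetical.
    """
    if all(abs(v) < 0.05 for v in velocities):
        return "no_gesture"
    if velocities[0] > 0 and velocities[-1] < 0:
        return "swipe"  # moved toward, then away from, the antenna
    return "presence"  # something nearby is moving

# Example: an echo arriving 2 ns after transmission puts the reflector
# about 0.3 m from the phone.
distance_m = range_from_delay(2e-9)
```

In a real system the time delays and frequency shifts come from processing the received waveform (e.g. FFTs over chirped pulses), and the classifier is a trained model rather than hand-written thresholds; the sketch only mirrors the physical reasoning in the text.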


