Inside the
Instrument

Seven physical phenomena hidden in the phone you're holding — from spinning silicon to the color your eye can't see.


Chapter 01 — Gyroscope

The spinning
mass that holds still

Conservation of angular momentum

Spin a top and try to push it over. Instead of falling, it precesses — the axis drifts sideways. This is angular momentum resisting change: a spinning mass wants to keep spinning in the same direction in space, regardless of what its housing does.

A mechanical gyroscope suspends its spinning rotor in gimbal rings — hinged frames that let the housing rotate freely around the rotor while the rotor stays pointed in its original direction. Mount one on a ship and it becomes a stable reference for navigation regardless of waves.

L = Iω — Angular momentum (L) equals moment of inertia (I) times angular velocity (ω). The larger I or ω, the harder it is to change L. Gyroscopes maximise I by placing mass at the rim.
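The rim-loading point can be checked with a quick sketch. The rotor mass and radius below are illustrative, not from any real instrument; the comparison is between a solid disc (I = ½mr²) and an idealized rim/hoop (I = mr²) of equal mass and radius:

```javascript
// Angular momentum L = I * omega for two rotor shapes of equal
// mass m (kg) and radius r (m), spinning at the same rate.
function solidDiscInertia(m, r) { return 0.5 * m * r * r; } // I = mr^2 / 2
function rimInertia(m, r)       { return m * r * r; }       // I = mr^2 (hoop)

const m = 0.1;                    // 100 g rotor (illustrative)
const r = 0.02;                   // 2 cm radius (illustrative)
const omega = 2 * Math.PI * 400;  // 400 rev/s, in rad/s

const L_disc = solidDiscInertia(m, r) * omega;
const L_rim  = rimInertia(m, r) * omega;

// Same mass, same speed -- but the rim-loaded rotor stores twice the
// angular momentum, so it resists tilting twice as hard.
console.log((L_rim / L_disc).toFixed(1)); // 2.0
```

That factor of two is free: it costs no extra mass and no extra speed, only geometry, which is why gyroscope rotors look like rings rather than discs.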

Precession: when you apply a torque trying to tilt the spin axis, the axis moves — but 90° away from where you pushed. The gyroscope "processes" the torque and outputs rotation perpendicular to both the input torque and the spin axis.

The problem with mechanical gyroscopes: bearing friction, air resistance, and manufacturing imperfections make them drift, and fighting that drift makes them bulky. A precision navigation-grade mechanical gyro drifts only ~0.01°/hour — fine for a ship or a missile, but the spinning rotor and gimbals are far too large, power-hungry, and expensive for a pocket device.

Chapter 02 — MEMS Gyroscope

The Coriolis
effect in silicon

No spinning parts. Just vibrating silicon.

Your phone's gyroscope has no spinning parts. The entire mechanism is etched into a silicon wafer thinner than a human hair and costs a fraction of a cent to manufacture.

A tiny proof mass is electrostatically driven to oscillate back and forth at its resonant frequency (~30,000 times per second). When the chip rotates, the Coriolis force deflects this oscillating mass perpendicular to both its motion and the rotation axis.

F = 2m(v × Ω) — Coriolis force. When a mass moving with velocity v is inside a rotating frame spinning at rate Ω, it experiences a force perpendicular to both. The cross-product (×) means the force is always 90° from the motion.
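The cross product does all the geometric work in that formula. A minimal sketch, with illustrative numbers (not from any datasheet): a proof mass driven along x, the chip rotating about z, and the resulting force landing on y — perpendicular to both:

```javascript
// Coriolis force F = 2 m (v x Omega) on a vibrating proof mass.
// Vectors are [x, y, z] arrays in SI units.
function cross(a, b) {
  return [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
}

function coriolisForce(m, v, Omega) {
  return cross(v, Omega).map(c => 2 * m * c);
}

// Assumed values: a 1e-9 kg proof mass driven along x at 1 m/s peak,
// while the chip rotates about z at 1 rad/s.
const F = coriolisForce(1e-9, [1, 0, 0], [0, 0, 1]);

// v is along x and Omega along z, so v x Omega points along -y:
// the deflection is perpendicular to both, as the formula promises.
console.log(F); // [0, -2e-9, 0]
```

Reverse the rotation and the force flips sign — which is exactly the property the sensor exploits to report a signed rotation rate.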

Capacitive plates spaced ~1 micron apart detect this tiny deflection. As the mass shifts, the gap between plates changes, and capacitance changes with it. An ADC reads this as a digital rotation rate in degrees per second.
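How tiny is the capacitance change? A back-of-the-envelope sketch using the parallel-plate formula C = ε₀A/d, with assumed geometry (100 µm plates, a 1 µm gap — plausible MEMS dimensions, not from any specific part):

```javascript
// Parallel-plate capacitance C = eps0 * A / d.
const EPS0 = 8.854e-12;        // vacuum permittivity, F/m
const A = 100e-6 * 100e-6;     // assumed plate area: 100 um x 100 um
const d0 = 1e-6;               // assumed rest gap: 1 micron

const cap = d => EPS0 * A / d;

// A 1 nm deflection of the proof mass: the gap closes on one side
// and opens on the other, so the readout is differential.
const dx = 1e-9;
const deltaC = cap(d0 - dx) - cap(d0 + dx);
console.log(deltaC.toExponential(2)); // 1.77e-16 F -- sub-femtofarad
```

A sub-femtofarad signal is why the sensing electronics sit on the same die: routing that charge through a wire of any length would bury it in parasitic capacitance.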

No bearings, no friction, no drift from mechanical wear. The phone measures pitch, roll, and yaw simultaneously with three vibrating structures on perpendicular axes, typically integrated into a single chip. The whole assembly fits in a package 3mm × 3mm × 0.9mm.

Chapter 03 — Orientation

Pitch, Yaw,
Roll & the Right Hand

Three axes of rotation, one rule to remember them

Any rigid body has exactly three independent ways to rotate. Aviation named them first; phones inherited the same convention.

Roll is rotation around the axis pointing forward (nose-to-tail). An airplane banks into a turn. On your phone, tilting it sideways like a book is roll (γ, gamma in the sensor API).

Pitch is rotation around the axis pointing to the side (wingtip-to-wingtip). A plane's nose rises or dips. On your phone, tilting the top toward or away from you is pitch (β, beta).

Yaw is rotation around the vertical axis. A plane turns left or right while staying level. On your phone, rotating flat on a table changes yaw. The compass gives yaw (α, alpha).

Right-hand rule: point your right thumb along the positive axis. Your fingers naturally curl in the direction of positive rotation. Thumb forward → roll. Thumb right → pitch. Thumb up → yaw. All rotation directions follow from this single rule.

In the phone's deviceorientation event: alpha (0–360°) is yaw from magnetic north, beta (−180° to 180°) is front-back pitch, gamma (−90° to 90°) is side tilt. These three numbers completely describe the phone's orientation in space.
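Reading those three angles in a browser is a one-liner event listener. A minimal sketch — the handler logic is factored into a pure function so the formatting can run anywhere:

```javascript
// Format the three deviceorientation angles into a readable string.
function describeOrientation({ alpha, beta, gamma }) {
  return `yaw ${alpha.toFixed(0)}° from north, ` +
         `pitch ${beta.toFixed(0)}°, roll ${gamma.toFixed(0)}°`;
}

// In a browser context, wire it to the real sensor event:
if (typeof window !== "undefined") {
  window.addEventListener("deviceorientation", e => {
    console.log(describeOrientation(e));
  });
}

// A phone lying flat, top edge pointing east:
console.log(describeOrientation({ alpha: 90, beta: 0, gamma: 0 }));
// "yaw 90° from north, pitch 0°, roll 0°"
```

Note that on iOS the page must first call `DeviceOrientationEvent.requestPermission()` from a user gesture before any events fire.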

Chapter 04 — Accelerometer

A mass on
a spring in silicon

Not the speaker. Not even close.

A common misconception: that phones sense acceleration by tracking the speaker cone's position. Not so. The accelerometer is an entirely separate MEMS device, and it predates the modern phone by decades.

The mechanism: a tiny proof mass is suspended inside a silicon frame by flexible beams that act as springs. Under acceleration, the whole frame moves but the mass lags behind — Newton's second law: F = ma means the mass needs a force to accelerate, and that force comes from the spring flexing.

The displacement of the mass is proportional to the acceleration. Capacitive plates on either side of the mass measure the gap — smaller on one side, larger on the other. The differential capacitance gives a signed acceleration reading precise to millionths of 1 g.
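At equilibrium the spring force kx balances ma, so the displacement is simply x = ma/k. A sketch with assumed values (a microgram-scale mass and a 1 N/m beam stiffness, chosen for round numbers rather than taken from any datasheet):

```javascript
// Spring-mass accelerometer at equilibrium: kx = ma, so x = m * a / k.
const m = 1e-9;       // assumed proof mass, kg (~1 microgram)
const k = 1;          // assumed beam stiffness, N/m
const g = 9.8;        // 1 g of acceleration, m/s^2

const x = m * g / k;  // displacement under 1 g
console.log(x);       // 9.8e-9 m -- about ten nanometres
```

A full g of acceleration moves the mass about ten nanometres — which is why the capacitive readout has to resolve such absurdly small gap changes.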

When the phone sits still, the accelerometer reads approximately 9.8 m/s² upward — Earth's gravity. The spring is deflected by gravitational force even with no motion. This is how the phone knows it's horizontal or vertical without moving: orientation changes which axis gravity pulls along.

The accelerationIncludingGravity value in the browser API includes this gravitational component. To get just motion acceleration, the operating system fuses gyroscope data to subtract gravity's component in each axis.
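Conceptually the subtraction is just a vector difference. A minimal sketch of that last step — the hard part, estimating gravity's direction in device axes via sensor fusion, is assumed done and handed in as an argument:

```javascript
// Given a gravity estimate in device axes (from sensor fusion),
// linear acceleration is the including-gravity reading minus it.
function linearAcceleration(includingGravity, gravityEstimate) {
  return {
    x: includingGravity.x - gravityEstimate.x,
    y: includingGravity.y - gravityEstimate.y,
    z: includingGravity.z - gravityEstimate.z,
  };
}

// Phone flat on a table: gravity is entirely on z and the phone is
// not moving, so linear acceleration comes out zero on every axis.
const a = linearAcceleration({ x: 0, y: 0, z: 9.8 }, { x: 0, y: 0, z: 9.8 });
console.log(a); // { x: 0, y: 0, z: 0 }
```

In the browser's `devicemotion` event, `acceleration` and `accelerationIncludingGravity` expose both sides of this subtraction, already computed by the OS.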

Chapter 05 — Propagation

How water
steals your signal

Molecular absorption and the frequency that rain kills

Water molecules are electric dipoles: oxygen is slightly negative, the two hydrogens slightly positive. When a radio wave's oscillating electric field passes through water vapor, these dipoles spin trying to align with the field — converting the wave's energy into rotational kinetic energy, then heat. The wave loses amplitude.

This matters very differently depending on frequency. At 700 MHz (4G low band), the wavelength is ~43 cm — vastly larger than a raindrop (~2 mm). The drop is essentially invisible to the wave. Rain has almost no effect.

At 28 GHz (5G mmWave), wavelength ≈ 10 mm — comparable to a raindrop. Geometric interaction kicks in: drops scatter and absorb the wave. Heavy rain causes 10–15 dB/km of attenuation. This is why mmWave 5G cells are 100–200 m apart instead of kilometers.
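The two wavelength comparisons above fall straight out of λ = c/f. A quick sketch, measuring each band against a typical ~2 mm raindrop:

```javascript
// Wavelength lambda = c / f, compared against a ~2 mm raindrop.
const C = 299_792_458;   // speed of light, m/s
const DROP = 2e-3;       // typical raindrop diameter, m

const wavelength = fHz => C / fHz;

for (const [label, f] of [["700 MHz (4G)", 700e6], ["28 GHz (mmWave)", 28e9]]) {
  const lam = wavelength(f);
  console.log(`${label}: lambda = ${(lam * 1000).toFixed(1)} mm, ` +
              `${(lam / DROP).toFixed(0)}x the raindrop`);
}
// 700 MHz (4G): lambda = 428.3 mm, 214x the raindrop
// 28 GHz (mmWave): lambda = 10.7 mm, 5x the raindrop
```

Two hundred times larger than the drop versus five times larger: that ratio, not the rain itself, is what decides whether a band shrugs off a storm.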

At 60 GHz, oxygen molecules themselves (O₂) resonate and absorb, limiting range to ~100 m regardless of weather — useful for short-range unlicensed point-to-point links precisely because it can't travel far enough to interfere with neighbors.

Water vapor in the atmosphere (humidity, not rain) absorbs at 22.235 GHz and 183 GHz. Even a clear day with high humidity measurably degrades frequencies near these bands. Rain is visible; humidity is invisible and omnipresent.

Chapter 06 — Acoustics

Bass from
something the size of a coin

Physical limits, DSP tricks, and your brain doing the work

A 40 Hz bass note has a wavelength of 8.6 meters. Your phone's speaker is ~3 cm across — roughly 1/280th of that wavelength. By conventional acoustics, it should produce essentially zero bass. And yet you hear it. Several things are happening at once.

Excursion and DSP control: the voice coil travels much farther than any speaker of this size would have historically attempted. An onboard DSP monitors back-EMF (the voltage the moving coil generates) to track the cone's exact position and velocity, preventing over-excursion while maximizing the air moved per watt.

Acoustic tuning: the phone body is a sealed or ported enclosure. The volume and port dimensions are tuned to shift the system's resonant frequency lower, extending bass response.

The missing fundamental: if you play 80 Hz + 120 Hz + 160 Hz simultaneously (harmonics 2, 3, and 4 of a 40 Hz note), the human auditory system reconstructs a strong perception of 40 Hz — even though no 40 Hz signal was produced. Modern phones deliberately generate these harmonics for bass notes using real-time harmonic synthesis. Your brain supplies the bass.
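The reason the brain hears 40 Hz is that the composite waveform genuinely repeats at 40 Hz: every component is an integer multiple of it. A sketch verifying that periodicity numerically:

```javascript
// 80, 120 and 160 Hz are all integer multiples of 40 Hz, so their sum
// repeats every 1/40 s -- the period of the absent fundamental, which
// is the cue the auditory system latches onto.
const harmonics = [80, 120, 160]; // Hz
const signal = t =>
  harmonics.reduce((s, f) => s + Math.sin(2 * Math.PI * f * t), 0);

const T = 1 / 40; // period of the missing 40 Hz fundamental

// Compare the signal against itself one period later at many points:
const maxErr = Math.max(
  ...Array.from({ length: 100 }, (_, i) => {
    const t = i / 4000;
    return Math.abs(signal(t) - signal(t + T));
  })
);
console.log(maxErr < 1e-9); // true -- periodic at exactly 40 Hz
```

No 40 Hz energy exists anywhere in that signal, yet its repetition rate is 40 Hz — the pitch and the spectrum have come apart, and the ear follows the pitch.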

This psychoacoustic trick is not a bug or a cheat — it exploits the same pitch perception mechanism that lets you identify the fundamental of a piano note even when the lowest string on the instrument doesn't go that low.

Chapter 07 — Photoreception

How a camera
sees color differently

The Bayer filter, demosaicing, and accidental colorblindness aid

Silicon sensors are colorblind: they count photons but can't distinguish wavelength. Color sensing requires a physical filter. Modern cameras use a Bayer pattern — a 2×2 repeating grid of tiny color filters: one red (R), two green (G), one blue (B). More green than red or blue because human luminance sensitivity peaks in the green range.

Each pixel only measures one color. A process called demosaicing (an interpolation step) fills in the missing two channels for each pixel by borrowing from neighbors. This is an estimate, not a measurement — and the estimate introduces color artifacts at edges.
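Bilinear demosaicing, the simplest form of that borrowing, is just neighbor averaging. A sketch on a hand-made 4×4 RGGB mosaic (the raw counts are illustrative numbers, not real sensor data):

```javascript
// 4x4 RGGB Bayer mosaic of raw sensor counts (illustrative values).
// Each pixel holds only the one channel its filter let through.
const raw = [
  [200, 100, 210, 105],   // R  G  R  G
  [ 90,  50,  95,  55],   // G  B  G  B
  [205, 110, 215, 115],   // R  G  R  G
  [ 95,  60, 100,  65],   // G  B  G  B
];

// A red-site pixel has no green measurement, so estimate green by
// averaging its four green neighbours (up, down, left, right).
function greenAtRedSite(img, r, c) {
  return (img[r - 1][c] + img[r + 1][c] + img[r][c - 1] + img[r][c + 1]) / 4;
}

console.log(greenAtRedSite(raw, 2, 2)); // (95 + 100 + 110 + 115) / 4 = 105
```

Real pipelines use edge-aware variants of this average precisely because a plain mean smears color across boundaries — the artifact the paragraph above describes.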

The camera's color matrix: sensor RGB does not match human RGB. The silicon's red-filter response peaks at a slightly different wavelength than the L cones in your eye. A 3×3 color correction matrix (the "color science" of the camera) transforms sensor values toward human-visible color — but the transformation is an approximation.
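Applying that matrix is a single 3×3 multiply per pixel. A sketch with made-up matrix values (not any real camera's calibration); the one realistic property kept is that each row sums to 1, so neutral gray stays neutral:

```javascript
// Illustrative 3x3 color correction matrix. Each row sums to 1.0 so
// a neutral (equal-RGB) sensor reading maps to a neutral output.
const CCM = [
  [ 1.6, -0.4, -0.2],
  [-0.3,  1.5, -0.2],
  [ 0.0, -0.5,  1.5],
];

// Transform one sensor RGB triple toward display RGB.
function correct(rgb) {
  return CCM.map(row => row[0] * rgb[0] + row[1] * rgb[1] + row[2] * rgb[2]);
}

// Neutral in, neutral out (up to floating-point rounding):
console.log(correct([0.5, 0.5, 0.5])); // ~[0.5, 0.5, 0.5]
```

The large diagonal and negative off-diagonal terms are typical in shape: the matrix has to *un-mix* the overlap between the sensor's filter responses, which is exactly where the approximation to human cone responses leaks through.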

Color blindness and the camera error: Ishihara test plates are designed so their number and background are isoluminant to red-green dichromats — same perceived brightness, different hue. The dichromat sees no contrast and can't read the number.

A camera's R and G channel spectral responses differ from human L and M cones. Colors that appear equal-luminance to a human dichromat may appear different-luminance to a camera. Photographing an Ishihara plate and then viewing the photo can reveal numbers invisible to the naked eye — the camera's systematic "error" creates luminance contrast where the human visual system has none. HDR processing, AI scene enhancement, and JPEG chroma subsampling can all amplify this effect.