Seven physical phenomena hidden in the phone you're holding — from spinning silicon to the color your eye can't see.
Spin a top and try to push it over. Instead of falling, it precesses — the axis drifts sideways. This is angular momentum resisting change: a spinning mass wants to keep spinning in the same direction in space, regardless of what its housing does.
A mechanical gyroscope suspends its spinning rotor in gimbal rings — hinged frames that let the housing rotate freely around the rotor while the rotor stays pointed in its original direction. Mount one on a ship and it becomes a stable reference for navigation regardless of waves.
Precession: when you apply a torque trying to tilt the spin axis, the axis moves — but 90° away from where you pushed. The gyroscope "processes" the torque and outputs rotation perpendicular to both the input torque and the spin axis.
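The precession rate follows from one line of algebra: the torque changes the angular momentum vector, so the axis swings at Ω = τ / (I·ω). A sketch with made-up toy-top numbers (mass, radius, spin rate are all illustrative, not from any real device):

```javascript
// Precession rate of a spinning disc: Omega = torque / (I * omega_spin).
// All values are illustrative, chosen for a hypothetical toy top.
const mass = 0.1;                   // kg
const radius = 0.03;                // m
const spinRate = 2 * Math.PI * 50;  // rad/s (50 revolutions per second)

const I = 0.5 * mass * radius ** 2; // moment of inertia of a solid disc
const L = I * spinRate;             // spin angular momentum, kg·m²/s

// Gravity acting at the centre of mass, lever arm 2 cm from the pivot:
const leverArm = 0.02;                  // m
const torque = mass * 9.81 * leverArm;  // N·m, trying to tip the axis over

// Instead of falling over, the axis sweeps sideways at:
const precession = torque / L;          // rad/s
console.log(precession.toFixed(2), "rad/s");
```

Note the spin rate (≈314 rad/s) dwarfs the precession rate (≈1.4 rad/s), which is exactly the regime where the simple formula holds: the faster the spin, the slower the drift.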
The problem with mechanical gyroscopes: they drift. Bearing friction, air resistance, manufacturing imperfections. A precision navigation-grade mechanical gyro drifts ~0.01°/hour. Acceptable for a missile; terrible for a pocket device.
Your phone's gyroscope has no spinning parts. The entire mechanism is etched into silicon, with moving parts thinner than a human hair, and costs well under a dollar to manufacture.
A tiny proof mass is electrostatically driven to oscillate back and forth at its resonant frequency (~30,000 times per second). When the chip rotates, the Coriolis force deflects this oscillating mass perpendicular to both its motion and the rotation axis.
Capacitive plates spaced ~1 micron apart detect this tiny deflection. As the mass shifts, the gap between plates changes, and capacitance changes with it. An ADC reads this as a digital rotation rate in degrees per second.
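The numbers involved are startling, and worth running through once. A sketch of the chain from rotation to capacitance change, using F = 2m·v·Ω for the Coriolis force and C = ε₀·A/d for the parallel-plate pickoff; every device parameter here is an order-of-magnitude guess, not a real part's datasheet:

```javascript
// How a MEMS gyro turns rotation into a capacitance change.
// All device parameters are illustrative order-of-magnitude guesses.
const EPS0 = 8.854e-12;   // vacuum permittivity, F/m

const m = 1e-9;           // proof mass, kg (~1 microgram)
const fDrive = 30e3;      // drive resonance, Hz
const driveAmp = 5e-6;    // drive oscillation amplitude, m

const omegaDrive = 2 * Math.PI * fDrive;
const vPeak = driveAmp * omegaDrive;      // peak drive velocity, m/s

// Rotate the chip at 1 rad/s (~57°/s):
const rotation = 1.0;                     // rad/s
const aCoriolis = 2 * vPeak * rotation;   // Coriolis acceleration, m/s²

// Quasi-static sense deflection x = F/k, with stiffness k chosen so the
// sense mode also resonates near fDrive (k = m·ω²):
const k = m * omegaDrive ** 2;
const deflection = (m * aCoriolis) / k;   // metres: sub-nanometre!

// Parallel-plate pickoff: C = ε0·A/d with a ~1 µm gap
const area = 1e-7;                        // plate area, m² (0.1 mm²)
const gap = 1e-6;                         // m
const C0 = (EPS0 * area) / gap;
const dC = C0 * (deflection / gap);       // first-order capacitance change

console.log(`deflection ≈ ${(deflection * 1e9).toFixed(3)} nm, ΔC ≈ ${(dC * 1e18).toFixed(0)} aF`);
```

With these numbers the plates move about a twentieth of a nanometre, smaller than a silicon atom, and the capacitance shifts by tens of attofarads. Detecting that reliably is the real engineering feat.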
No bearings, no friction, no drift from mechanical wear. The sensor chip contains three sensing structures oriented along perpendicular axes to measure pitch, roll, and yaw simultaneously. The whole assembly fits in a package 3 mm × 3 mm × 0.9 mm.
Any rigid body has exactly three independent ways to rotate. Aviation named them first; phones inherited the same convention.
Roll is rotation around the axis pointing forward (nose-to-tail). An airplane banks into a turn. On your phone, tilting it sideways like a book is roll (γ, gamma in the sensor API).
Pitch is rotation around the axis pointing to the side (wingtip-to-wingtip). A plane's nose rises or dips. On your phone, tilting the top toward or away from you is pitch (β, beta).
Yaw is rotation around the vertical axis. A plane turns left or right while staying level. On your phone, rotating flat on a table changes yaw. The compass gives yaw (α, alpha).
In the browser's deviceorientation event: alpha (0–360°) is yaw from magnetic north, beta (−180° to 180°) is front-back pitch, gamma (−90° to 90°) is side tilt. These three numbers completely describe the phone's orientation in space.
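One useful consequence of knowing β and γ: you can predict exactly where gravity lands on the phone's own axes. A sketch assuming the W3C Z-X'-Y'' intrinsic rotation order (the hypothetical helper `gravityInDeviceFrame` is mine, not a browser API; real sensor fusion is more involved):

```javascript
// Where gravity points in the phone's own axes, given deviceorientation
// pitch (beta) and roll (gamma). Alpha (yaw) drops out, because gravity
// lies along the yaw axis. A sketch, not production sensor fusion.
const G = 9.81;
const rad = (deg) => (deg * Math.PI) / 180;

function gravityInDeviceFrame(betaDeg, gammaDeg) {
  const b = rad(betaDeg), g = rad(gammaDeg);
  // Derived by rotating the world "up" vector into the device frame
  // under the Z-X'-Y'' convention:
  return {
    x: -G * Math.sin(g) * Math.cos(b),
    y:  G * Math.sin(b),
    z:  G * Math.cos(g) * Math.cos(b),
  };
}

console.log(gravityInDeviceFrame(0, 0));   // flat on a table: all of g on z
console.log(gravityInDeviceFrame(90, 0));  // upright portrait: all of g on y
```

This is also why a phone lying flat can't tell which way it is facing from the accelerometer alone: rotate it on the table and all three gravity components stay the same, so yaw needs the compass.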
A common misconception: that phones sense acceleration by tracking the speaker cone's position. Not so. The accelerometer is an entirely separate MEMS device, and it predates the modern phone by decades.
The mechanism: a tiny proof mass is suspended inside a silicon frame by flexible beams that act as springs. Under acceleration, the whole frame moves but the mass lags behind — Newton's second law: F = ma means the mass needs a force to accelerate, and that force comes from the spring flexing.
When the phone sits still, the accelerometer reads approximately 9.8 m/s² upward — Earth's gravity. The spring is deflected by gravitational force even with no motion. This is how the phone knows it's horizontal or vertical without moving: orientation changes which axis gravity pulls along.
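The spring-mass picture reduces to one formula: deflection x = m·a/k. A sketch with illustrative values for a hypothetical device (real MEMS proof masses and stiffnesses vary widely):

```javascript
// Spring-mass accelerometer: the proof mass lags, the spring flexes,
// and deflection is proportional to acceleration: x = m·a/k.
// Illustrative values only, for a hypothetical device.
const m = 1e-9;   // proof mass, kg (~1 microgram)
const k = 100;    // spring stiffness, N/m

const deflection = (a) => (m * a) / k;   // metres

// At rest, the spring is already deflected by gravity's 9.81 m/s²:
console.log(deflection(9.81));           // about a tenth of a nanometre
// Shaking the phone at 2 g doubles it:
console.log(deflection(2 * 9.81));
```

The readout problem is the same as the gyroscope's: turn a sub-nanometre deflection into a capacitance change and digitize it.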
The accelerationIncludingGravity value in the browser API includes this gravitational component. To report motion acceleration alone, the operating system fuses accelerometer and gyroscope data to estimate the gravity vector and subtract its component along each axis.
Water molecules are electric dipoles: oxygen is slightly negative, the two hydrogens slightly positive. When a radio wave's oscillating electric field passes through water vapor, these dipoles spin trying to align with the field — converting the wave's energy into rotational kinetic energy, then heat. The wave loses amplitude.
This matters very differently depending on frequency. At 700 MHz (4G low band), the wavelength is ~43 cm — vastly larger than a raindrop (~2 mm). The drop is essentially invisible to the wave. Rain has almost no effect.
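The size comparison can be made quantitative. For particles much smaller than the wavelength, Rayleigh scattering falls off as (d/λ)⁴, so the drop-to-wavelength ratio is everything. A scaling sketch, not a link-budget calculation (28 GHz, a common 5G mmWave band, is added here for contrast):

```javascript
// Why rain barely touches 700 MHz: a 2 mm drop is tiny compared with the
// wavelength, and Rayleigh scattering scales as (d/λ)⁴ for fixed drop size.
const C = 3e8;      // speed of light, m/s
const drop = 2e-3;  // typical raindrop diameter, m

const wavelength = (fGHz) => C / (fGHz * 1e9);             // metres
const relScatter = (fGHz) => (drop / wavelength(fGHz)) ** 4;

console.log(wavelength(0.7));                    // ≈ 0.43 m: the drop is invisible
console.log(relScatter(28) / relScatter(0.7));   // 28 GHz scatters ~2.6 million × more
```

Going from 700 MHz to 28 GHz multiplies the frequency by 40, and 40⁴ is about 2.6 million: the same drop goes from invisible to a serious obstacle.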
At 60 GHz, oxygen molecules themselves (O₂) resonate and absorb, limiting range to ~100 m regardless of weather — useful for short-range unlicensed point-to-point links precisely because it can't travel far enough to interfere with neighbors.
Water vapor in the atmosphere (humidity, not rain) absorbs at 22.235 GHz and 183 GHz. Even a clear day with high humidity measurably degrades frequencies near these bands. Rain is visible; humidity is invisible and omnipresent.
A 40 Hz bass note has a wavelength of 8.6 meters. Your phone's speaker is ~3 cm across — roughly 1/280th of that wavelength. By conventional acoustics, it should produce essentially zero bass. And yet you hear it. Several things are happening at once.
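The mismatch above is just λ = v/f against the speaker's diameter. A quick check of the arithmetic (speed of sound taken as 343 m/s at room temperature, 3 cm as a rough phone-speaker size):

```javascript
// Wavelength of a bass note versus the size of a phone speaker.
const speedOfSound = 343;   // m/s in air at ~20 °C
const freq = 40;            // Hz, a low bass note
const speakerSize = 0.03;   // m, roughly a phone speaker

const lambda = speedOfSound / freq;
console.log(lambda.toFixed(2), "m; speaker is 1/" +
  Math.round(lambda / speakerSize) + " of a wavelength");
```

A radiator that small is a hopelessly inefficient source at that wavelength, which is why the tricks below are needed at all.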
Excursion and DSP control: the voice coil travels much farther than any speaker of this size would have historically attempted. An onboard DSP monitors back-EMF (the voltage the moving coil generates) to track the cone's exact position and velocity, preventing over-excursion while maximizing the air moved per watt.
Acoustic tuning: the phone body is a sealed or ported enclosure. The volume and port dimensions are tuned to shift the system's resonant frequency lower, extending bass response.
Psychoacoustic bass: the DSP synthesizes harmonics of notes the driver can't reproduce (80, 120, 160 Hz for a 40 Hz fundamental), and your auditory system infers the missing fundamental from their common spacing. This trick is not a bug or a cheat. It exploits the same pitch-perception mechanism that lets you identify the fundamental of a low piano note even when the string radiates almost no energy at that frequency.
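The missing-fundamental effect in numbers. A sketch, not a real DSP chain: generate the harmonics the small driver can play, and note that their spacing alone pins down the pitch:

```javascript
// Psychoacoustic bass sketch: replace an unreproducible 40 Hz fundamental
// with harmonics the tiny speaker CAN play. The ear infers the missing
// fundamental from the common 40 Hz spacing between partials.
const f0 = 40;                                      // fundamental, too low for the driver
const harmonics = [2, 3, 4, 5].map((n) => n * f0);  // 80, 120, 160, 200 Hz

// The perceived pitch matches the common spacing of the partials:
const spacing = harmonics[1] - harmonics[0];
console.log(harmonics, "→ perceived pitch ≈", spacing, "Hz");
```

Play 80/120/160/200 Hz together and most listeners report hearing a 40 Hz note, even though no energy exists at 40 Hz.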
Silicon sensors are colorblind: they count photons but can't distinguish wavelength. Color sensing requires a physical filter. Modern cameras use a Bayer pattern — a 2×2 repeating grid of tiny color filters: one red (R), two green (G), one blue (B). More green than red or blue because human luminance sensitivity peaks in the green range.
Each pixel only measures one color. A process called demosaicing (also called interpolation) fills in the missing two channels for each pixel by borrowing from neighbors. This is an estimate, not a measurement — and the estimate introduces color artifacts at edges.
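Demosaicing in miniature. This sketch shows the simplest scheme, bilinear interpolation: estimate the missing green value at a red site by averaging its green neighbours. The 4×4 mosaic and its sensor counts are made up for illustration:

```javascript
// Bilinear demosaic sketch on a tiny RGGB mosaic.
// Even rows: R G R G…  Odd rows: G B G B…  Values are invented raw counts.
const mosaic = [
  [200, 120, 200, 120],   // R  G  R  G
  [110,  40, 110,  40],   // G  B  G  B
  [200, 120, 200, 120],
  [110,  40, 110,  40],
];

function greenAt(x, y) {
  // Orthogonal neighbours that fall on green sites (in RGGB, a site is
  // green exactly when row + col is odd):
  const neighbours = [[x - 1, y], [x + 1, y], [x, y - 1], [x, y + 1]]
    .filter(([i, j]) => i >= 0 && i < 4 && j >= 0 && j < 4)
    .filter(([i, j]) => (i + j) % 2 === 1)
    .map(([i, j]) => mosaic[j][i]);
  return neighbours.reduce((a, b) => a + b, 0) / neighbours.length;
}

// Pixel (2,2) is a red site; its green value is an estimate, not a measurement:
console.log(greenAt(2, 2));
```

Real camera pipelines use far smarter, edge-aware interpolation, but the principle is the same: two of every pixel's three channels are educated guesses.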
Color blindness and the camera error: Ishihara test plates are designed so that the number and its background are isoluminant to red-green dichromats — same perceived brightness, different hue. The dichromat sees no contrast and can't read the number.
A camera's R and G channel spectral responses differ from human L and M cones. Colors that appear equal-luminance to a human dichromat may appear different-luminance to a camera. Photographing an Ishihara plate and then viewing the photo can reveal numbers invisible to the naked eye — the camera's systematic "error" creates luminance contrast where the human visual system has none. HDR processing, AI scene enhancement, and JPEG chroma subsampling can all amplify this effect.