p5-phone library illustration by Angela Torchio

Overview

p5-phone bridges the gap between the functions already built into p5.js and the realities of contemporary mobile browsers, so that phones can be used for experimental interactions. It works through both addition and subtraction: first, it streamlines access to phone sensors so their data can be used inside p5; second, it disables the browser's default gestures so that you can create your own.

  • Simplifies accessing phone hardware from the browser (accelerometers, gyroscopes, microphone, vibration motor)
  • Simplifies disabling default phone gestures (zoom, refresh, back, etc.)
  • Simplifies enabling audio output
  • Simplifies using an on-screen console to display errors and debug info

Linking to the Library

Minified version (recommended)

<script src="https://cdn.jsdelivr.net/npm/p5-phone@1.6.4/dist/p5-phone.min.js"></script>

Development version (larger, with comments)

<script src="https://cdn.jsdelivr.net/npm/p5-phone@1.6.4/dist/p5-phone.js"></script>

Browser Compatibility

  • iOS 13+ (Safari)
  • Android 7+ (Chrome)
  • Chrome 80+
  • Safari 13+
  • Firefox 75+

Accessing Phone Hardware in p5.js

p5.js already includes several commands that are specific to phone hardware, but they take extra steps to use effectively. To use the touch-based commands, you need to disable the default gestures in your phone's browser. For motion and microphone data, you need to grant specific permissions before the browser can read the data. These commands fall into three groups, illustrated in the sketch below:

  • Touch Events
  • Device Motion & Orientation
  • Audio Input (requires p5.sound)
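
For reference, here is a minimal sketch that uses only these built-in p5 hooks, with no p5-phone calls yet. On a phone it only behaves reliably once the gesture and permission steps described above are handled, which is exactly what the library functions below take care of.

let mic;

function setup() {
  createCanvas(windowWidth, windowHeight);
  // p5's built-in audio input (requires the p5.sound library);
  // on mobile, start() still needs a user gesture and mic permission
  mic = new p5.AudioIn();
  mic.start();
}

function draw() {
  background(220);
  // rotationX / rotationY come from the device orientation sensors
  text('rotationX: ' + nf(rotationX, 1, 1), 20, 40);
  text('rotationY: ' + nf(rotationY, 1, 1), 20, 60);
  // microphone level, 0.0 to 1.0
  text('mic level: ' + nf(mic.getLevel(), 1, 3), 20, 80);
}

// built-in touch event; returning false suppresses some default behavior
function touchStarted() {
  background(255, 0, 0);
  return false;
}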

Basic Setup

index.html

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Mobile p5.js App</title>
  
  <style>
    body {
      margin: 0;
      padding: 0;
      overflow: hidden;
    }
  </style>
  
  <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.11.10/p5.min.js"></script>
  <script src="https://cdn.jsdelivr.net/npm/p5-phone@1.6.4/dist/p5-phone.min.js"></script>
</head>
<body>
  <script src="sketch.js"></script>
</body>
</html>

sketch.js

let mic;

function setup() {
  // Show debug panel FIRST to catch setup errors
  showDebug();
  
  createCanvas(windowWidth, windowHeight);
  
  // Lock mobile gestures to prevent browser interference
  lockGestures();
  
  // Enable motion sensors with tap-to-start
  enableGyroTap('Tap to enable motion sensors');
  
  // Enable microphone with tap-to-start
  mic = new p5.AudioIn();
  enableMicTap('Tap to enable microphone');
}

function draw() {
  background(220);
  
  // Always check status before using hardware features
  if (window.sensorsEnabled) {
    // Use device rotation and acceleration
    fill(255, 0, 0);
    circle(width/2 + rotationY * 5, height/2 + rotationX * 5, 50);
  }
  
  if (window.micEnabled) {
    // Use microphone input
    let level = mic.getLevel();
    fill(0, 255, 0);
    rect(10, 10, level * 200, 20);
  }
}

// Prevent default touch behavior
function touchStarted() {
  return false;
}

API Reference

Core Functions

Block Default Gestures

Function Description
lockGestures() Prevent browser gestures (call in setup()). Blocks: pinch-to-zoom, pull-to-refresh, swipe navigation, long-press context menus, text selection, double-tap zoom

Motion Sensor Activation

Function Description
enableGyroTap(message) Tap anywhere to enable motion sensors. Once enabled (when window.sensorsEnabled is true), provides access to: rotationX, rotationY, rotationZ, accelerationX, accelerationY, accelerationZ, deviceShaken, deviceMoved
enableGyroButton(text) Button-based sensor activation. Once enabled (when window.sensorsEnabled is true), provides access to: rotationX, rotationY, rotationZ, accelerationX, accelerationY, accelerationZ, deviceShaken, deviceMoved
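
As a sketch of the button-based flow: only lockGestures(), enableGyroButton(), and window.sensorsEnabled below come from p5-phone; the rest is plain p5.js, and the flash variable is just for illustration. Tilting moves a circle and shaking briefly tints the background.

let flash = 0;

function setup() {
  createCanvas(windowWidth, windowHeight);
  lockGestures();
  // shows a button; window.sensorsEnabled becomes true once it is tapped
  enableGyroButton('Enable motion');
}

function draw() {
  // briefly tint the background after a shake
  background(flash > 0 ? 255 : 220, 220, 220);
  flash = max(0, flash - 1);

  if (window.sensorsEnabled) {
    // tilt moves the circle around the center of the canvas
    circle(width / 2 + rotationY * 4, height / 2 + rotationX * 4, 60);
  }
}

// p5's built-in shake event; only fires once sensors are enabled
function deviceShaken() {
  flash = 30;
}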

Microphone Activation

Function Description
enableMicTap(message) Tap anywhere to enable microphone (requires p5.sound library). Once enabled (when window.micEnabled is true), use with mic.getLevel() and other p5.AudioIn methods
enableMicButton(text) Button-based microphone activation (requires p5.sound library). Once enabled (when window.micEnabled is true), use with mic.getLevel() and other p5.AudioIn methods

Sound Output Activation

Function Description
enableSoundTap(message) Tap anywhere to enable sound playback (no microphone input)
enableSoundButton(text) Button-based sound activation (no microphone input)
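
A minimal playback sketch might look like the following; it assumes the p5.sound library is loaded and that an audio file exists at assets/loop.mp3 (a placeholder path).

let song;

function preload() {
  // placeholder path; any browser-supported audio file works
  song = loadSound('assets/loop.mp3');
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  lockGestures();
  // the first tap unlocks audio output; window.soundEnabled becomes true
  enableSoundTap('Tap to enable sound');
}

function draw() {
  background(song.isPlaying() ? 0 : 220);
}

function touchStarted() {
  // once sound is enabled, further touches toggle playback
  if (window.soundEnabled) {
    if (song.isPlaying()) {
      song.pause();
    } else {
      song.play();
    }
  }
  return false;
}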

Speech Recognition

Function Description
enableSpeechTap(message) Enable speech recognition support (requires p5.js-speech library). Activates audio context without creating p5.AudioIn to avoid microphone hardware conflicts on mobile devices
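
A touch-to-talk sketch might be structured as follows. Note that p5.SpeechRec and its resultString / resultValue properties come from the separate p5.speech library, not from p5-phone, so treat that part as an assumption about its API.

let rec;

function setup() {
  createCanvas(windowWidth, windowHeight);
  showDebug();
  lockGestures();
  // unlock the audio context without creating a p5.AudioIn
  enableSpeechTap('Tap to enable speech');

  // p5.SpeechRec is provided by the p5.speech library
  rec = new p5.SpeechRec('en-US', gotSpeech);
}

function gotSpeech() {
  if (rec.resultValue) {
    debug('Heard: ' + rec.resultString);
  }
}

function touchEnded() {
  // start listening on each touch (touch-to-talk)
  rec.start();
  return false;
}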

Vibration Motor (Android Only)

Function Description
enableVibrationTap(message) Tap anywhere to enable vibration motor (Android only - not supported on iOS)
enableVibrationButton(text) Button-based vibration activation (Android only - not supported on iOS)
vibrate(pattern) Trigger vibration with duration (ms) or pattern array. Example: vibrate(50) or vibrate([100, 50, 100])
stopVibration() Stop any ongoing vibration
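
A minimal haptic sketch using only the functions above (Android only):

function setup() {
  createCanvas(windowWidth, windowHeight);
  lockGestures();
  // window.vibrationEnabled becomes true after the first tap (Android only)
  enableVibrationTap('Tap to enable vibration');
}

function touchStarted() {
  if (window.vibrationEnabled) {
    // short buzz, pause, short buzz (values in milliseconds)
    vibrate([100, 50, 100]);
  }
  return false;
}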

Camera/Video Functions

Function Description
createPhoneCamera(mode, mirror, displayMode) Create a camera optimized for phone use. Returns a PhoneCamera instance with automatic coordinate mapping for ML5 models (BodyPose, FaceMesh, HandPose). Parameters: mode ('user' for front camera, 'environment' for back camera), mirror (boolean, flip horizontally), displayMode ('fitHeight', 'fitWidth', 'cover', 'contain', or 'fixed'). Use with ML5 models for automatic coordinate mapping between video and canvas space.
enableCameraTap(message) Tap anywhere to enable camera permissions. Automatically initializes all PhoneCamera instances. Required for iOS camera access.
enableCameraButton(text) Button-based camera activation. Creates a button that enables camera permissions when clicked.

PhoneCamera ML5 Integration Methods

Method Description
cam.mapPoint(x, y) Map a simple point from video coordinates to canvas display coordinates. Handles scaling and mirroring automatically. Returns {x, y} object. Use for drawing custom points on top of scaled video.
cam.mapKeypoint(keypoint) Map an ML5 keypoint object to display coordinates. Handles scaling and mirroring automatically. Preserves all keypoint properties (z, confidence, etc.). Returns mapped keypoint object. Use with ML5 BodyPose, FaceMesh, or HandPose keypoints.
cam.mapKeypoints(keypoints) Map an array of ML5 keypoints to display coordinates. Handles scaling and mirroring automatically. Returns array of mapped keypoints. Use when processing multiple keypoints at once.
cam.onReady(callback) Set a callback function to run when the video is fully ready for ML5 detection. Use this before initializing ML5 models to ensure the video element has loaded. Example: cam.onReady(() => { /* create ML5 model here */ })
cam.getDimensions() Get dimension information for the current display mode. Returns {x, y, width, height, scaleX, scaleY} object. Use for custom coordinate calculations or understanding the video layout.
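
Putting the camera functions and mapping methods together, a body-tracking sketch might be structured as follows. The p5-phone calls are the ones documented above; the ml5.bodyPose() / detectStart() calls assume the ml5.js v1 BodyPose API and an ml5 script tag in index.html, and drawing of the video itself is left out here.

let cam;
let bodyPose;
let poses = [];

function preload() {
  // load the model up front (ml5.js v1 BodyPose API assumed)
  bodyPose = ml5.bodyPose();
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  lockGestures();

  // front camera, mirrored, scaled to fit the canvas height
  cam = createPhoneCamera('user', true, 'fitHeight');
  enableCameraTap('Tap to enable the camera');

  // wait for the video element before starting detection
  cam.onReady(() => {
    bodyPose.detectStart(cam.videoElement, (results) => {
      poses = results;
    });
  });
}

function draw() {
  background(220);
  // convert video-space keypoints to canvas space, including mirroring
  for (let pose of poses) {
    let mapped = cam.mapKeypoints(pose.keypoints);
    for (let k of mapped) {
      if (k.confidence > 0.3) {
        circle(k.x, k.y, 10);
      }
    }
  }
}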

PhoneCamera Properties

Property Description
cam.videoElement Read-only. Returns the native HTML video element for use with ML5 libraries. Pass this to ML5 model constructors.
cam.ready Read-only. Boolean indicating if camera is ready for use.
cam.width Read-only. Current display width of the video on canvas.
cam.height Read-only. Current display height of the video on canvas.
cam.active Read-write. Camera facing mode: 'user' (front) or 'environment' (back). Changing this switches the camera.
cam.mirror Read-write. Boolean to control horizontal mirroring of the video.
cam.mode Read-write. Display mode: 'fitWidth', 'fitHeight', 'cover', 'contain', or 'fixed'.
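
For instance, a small helper (hypothetical, using only the properties above) could flip between the front and back cameras:

function flipCamera() {
  // switch between the front ('user') and back ('environment') cameras
  cam.active = (cam.active === 'user') ? 'environment' : 'user';
  // mirroring usually only makes sense for the front camera
  cam.mirror = (cam.active === 'user');
}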

Status Variables

Variable Description
window.sensorsEnabled Boolean: true when motion sensors are active
window.micEnabled Boolean: true when microphone is active
window.soundEnabled Boolean: true when sound output is active
window.vibrationEnabled Boolean: true when vibration is available (Android only)

Debug System

Function Description
showDebug() Show on-screen debug panel
hideDebug() Hide debug panel
toggleDebug() Toggle panel visibility
debug(...args) Console.log with on-screen display
debugError(...args) Display errors with red styling
debugWarn(...args) Display warnings with yellow styling
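
In practice the debug panel is most useful when it is shown before any code that might fail; riskySetupStep() below is just a placeholder for your own setup code.

function setup() {
  // show the on-screen panel before anything that might throw
  showDebug();

  createCanvas(windowWidth, windowHeight);
  debug('canvas size:', width, height);

  try {
    riskySetupStep(); // placeholder for your own setup code
  } catch (err) {
    debugError('setup failed:', err.message);
  }
}

function touchStarted() {
  // show or hide the panel with a three-finger touch
  if (touches.length === 3) {
    toggleDebug();
  }
  return false;
}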

Getting Started

Blank Template

Basic setup from the README - a starting point for your own projects

Web Editor → View Code

Touch Interactions

01. Basic Touch

Detect touch events and track duration

Launch Example → Web Editor →
02. Touch Zones

Screen divided into 4 interactive zones

Launch Example → Web Editor →
03. Touch Count

Count simultaneous touch points

Launch Example → Web Editor →
04. Touch Distance

Measure distance between two touches

Launch Example → Web Editor →
05. Touch Angle

Calculate angle between two touch points

Launch Example → Web Editor →

Device Motion

01. Orientation

Basic device orientation detection

Launch Example → Web Editor →
02. Rotational Velocity

Calculate rotation speed using sensors

Launch Example → Web Editor →
03. Acceleration

Display device acceleration data

Launch Example → Web Editor →

Audio Input

01. Microphone Level

Visualize audio input with threshold

Launch Example → Web Editor →
02. Speech Recognition

Touch-to-talk speech recognition using Web Speech API

Launch Example →

Audio Playback

01. Sound Basic

Play/pause audio with touch control

Launch Example →
02. Volume by Touches

Control audio volume with multiple touches

Launch Example →

Vibration Motor (Android Only)

01. Haptic Feedback

Touch zones with different vibration patterns (Android only)

Launch Example →

ML5 Machine Learning Examples

Gaze Detector Class

Eye gaze tracking with ml5.js FaceMesh

Launch Example → View Code
PHONE BodyPose Two Points

Body tracking with phone sensors and ml5.js BodyPose

Launch Example → View Code
PHONE FaceMesh Two Points

Face tracking with phone sensors and ml5.js FaceMesh

Launch Example → View Code
PHONE HandPose Two Points

Hand tracking with phone sensors and ml5.js HandPose

Launch Example → View Code
THREE BodyPose Two Points

3D body tracking visualization with Three.js and ml5.js

Launch Example → View Code
THREE FaceMesh Two Points

3D face tracking visualization with Three.js and ml5.js

Launch Example → View Code
THREE HandPose Two Points

3D hand tracking visualization with Three.js and ml5.js

Launch Example → View Code

Interactive GIF Examples

Fetch

Touch-controlled corgi fetching animation

Launch Example → View Code
Roll

Tilt-controlled pencil rolling animation

Launch Example → View Code
Fly

Acceleration-controlled airplane window

Launch Example → View Code

UX Comparison Examples

Button vs Shake Detection

Compare clicking a button vs shaking your device to trigger actions

Launch Example → View Code
Button vs Movement Detection

Compare button clicks vs device movement for interaction control

Launch Example → View Code
Button vs Orientation

Compare button interface vs device orientation for control

Launch Example → View Code
Sliders vs Device Rotation

RGB color control: traditional sliders vs device rotation

Launch Example → View Code
Sliders vs Device Acceleration

RGB color control: traditional sliders vs device acceleration

Launch Example → View Code
Slider vs Microphone

Volume control: traditional slider vs microphone input level

Launch Example → View Code
Slider vs Multi-Touch

Value control: single slider vs multiple finger touches

Launch Example → View Code
Slider vs Distance

Value control: traditional slider vs finger distance measurement

Launch Example → View Code
Slider vs Angle Control

Value control: linear slider vs multi-touch angle detection

Launch Example → View Code