Hosting sketches and limitations when using mic/camera

I’ve been coding in Processing.org for 18 months and am now exploring a move to p5.js, trying to get my head around lots of things, including where/how to host my sketches.

I watched Dan Shiffman’s video on how to use GitHub Pages to host sketches, which I thought would be ideal, and it worked ok for simple sketches.

But when I tried a sketch that uses the mic or camera, it worked fine in the p5.js web editor but not on GitHub Pages.

Does anyone know if this is a limitation of GitHub Pages, or is there a trick?

Otherwise, where/how would you advise a newbie to easily host their sketches? Thanks.

*** update - I’ve just discovered the “share” feature in the p5.js editor (e.g. https://editor.p5js.org/p5/full/mxvdk2yswx) and realised that it seems to have the same limitations, so maybe it’s nothing to do with GitHub ***

Interesting issue. Looking at the way GitHub Pages hosts content, I don’t see any reason things shouldn’t work, assuming you use HTTPS for your script tag src URLs (p5.sound must be loaded over HTTPS unless you use a polyfill for AudioWorklet; see this issue). There is also an error that sometimes shows up but doesn’t seem to have any effect on the sketch’s function:

Uncaught TypeError: Cannot read property 'length' of undefined
    at RingBuffer.push (0e7a7213-1cb4-4381-9e93-d0034267500e:75)
    at AudioWorkletProcessor.process (0e7a7213-1cb4-4381-9e93-d0034267500e:170)

The only strange thing I’m seeing from GitHub Pages is that my external script dependencies are for some reason getting mapped to strange URLs like blob:https://sflanker.github.io/0e7a7213-1cb4-4381-9e93-d0034267500e.
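Incidentally, to illustrate the HTTPS point above, a minimal index.html might look something like this (the CDN and version number here are just an example, not necessarily what you’re using):

<!DOCTYPE html>
<html>
  <head>
    <!-- Both p5 and p5.sound loaded over HTTPS; adjust the version to match your project -->
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.0/p5.js"></script>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.0/addons/p5.sound.min.js"></script>
    <script src="sketch.js"></script>
  </head>
  <body>
    <main></main>
  </body>
</html>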

However, when I tested out a simple microphone sketch I did run into some strange issues where, even after I allowed the page access to the microphone, the sketch wouldn’t work. (I didn’t debug in depth exactly how it “wasn’t working”, but the sketch is supposed to display a graph based on an FFT connected to the AudioIn, and suffice it to say the graph was just a flatline.) What’s more, it would work sometimes: if I refreshed the page and started the audio components very quickly it worked, but if I waited a few seconds it would fail (flatline graph). I was able to resolve the issue by explicitly calling userStartAudio(). Here is my example sketch:


let fft;
let input;
let startButton;

function setup() {
  let cnv = createCanvas(windowWidth, windowHeight);
  startButton = createButton('Start!');
  startButton.position(width / 2, height / 2);
  startButton.mousePressed(async () => {
    startButton.remove();
    await userStartAudio();
    // Oddly enough the await above does not block until the user clicks "Allow".
    // However, this does still seem to consistently work.
    fft = new p5.FFT();
    input = new p5.AudioIn();
    input.start();
    fft.setInput(input);
  });
}

function draw() {
  background('black');
  if (fft && input) {
    let w = width / 2;
    drawSpectrumGraph(0, 0, w, height);
    drawWaveformGraph(w, 0, w, height);
  }
}

// Graphing code adapted from https://jankozeluh.g6.cz/index.html by Jan Koželuh
function drawSpectrumGraph(left, top, w, h) {
  let spectrum = fft.analyze();

  stroke('limegreen');
  fill('darkgreen');
  strokeWeight(1);

  beginShape();
  vertex(left, top + h);

  for (let i = 0; i < spectrum.length; i++) {
    vertex(
      left + map(log(i), 0, log(spectrum.length), 0, w),
      top + map(spectrum[i], 0, 255, h, 0)
    );
  }

  vertex(left + w, top + h);
  endShape(CLOSE);
}

function drawWaveformGraph(left, top, w, h) {
  let waveform = fft.waveform();

  stroke('limegreen');
  noFill();
  strokeWeight(1);

  beginShape();

  for (let i = 0; i < waveform.length; i++) {
    let x = map(i * 5, 0, waveform.length, 0, w);
    let y = map(waveform[i], -1, 2, h / 10 * 8, 0);
    vertex(left + x, top + y);
  }

  endShape();
}

The full code is here: https://github.com/sflanker/sflanker.github.io/tree/master/p5js-test
And you can see it live here: https://sflanker.github.io/p5js-test/

If you’re still having trouble, you should share your code; there might be something specific you are doing that is causing the problems.

Lastly, if nothing works, you could try hosting your code on Glitch.com. I’ve had lots of success with this for static p5.js hosting.

Just to add: one problem with glitch.com is that it doesn’t redirect HTTP to HTTPS, so you always have to make sure you access the site over HTTPS.


Thanks for the quick and amazingly helpful response; I’ve only just found time to have a look…

The link you posted to your sketch (Test P5.js Sketch) works absolutely fine for me (on 3 different devices, all running Chrome), so it’s odd that you say you are seeing issues with it.

I then tried copying your code into my repo, and it now works fine for me from there too:
https://merkluuv.github.io/MicListener/index.html

I’m still unsure why the original didn’t work (I was just using a standard FFT visualiser example), but now that it’s working for me with your code I’ve got a basis to adapt.

thanks again

Sorry I didn’t make it clear, but this was the secret sauce for me:


  startButton.mousePressed(async () => {
    startButton.remove();
    await userStartAudio();
    // Oddly enough the await above does not block until the user clicks "Allow".
    // However, this does still seem to consistently work.
    fft = new p5.FFT();
    input = new p5.AudioIn();
    input.start();
    fft.setInput(input);
  });

Especially the call to await userStartAudio(); before trying to create and start p5.AudioIn. Without that, if I just created the AudioIn and FFT when the button was clicked, I saw intermittent issues with my sketch not working on GitHub Pages (it worked fine in other contexts).
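If you want an extra safeguard along the same lines (just a variation on the idea above, not something I ended up needing), p5.AudioIn.start() also accepts success/error callbacks, so you could defer wiring up the FFT until the mic stream has actually started, and log the AudioContext state after userStartAudio() resolves:

  startButton.mousePressed(async () => {
    startButton.remove();
    await userStartAudio();
    // getAudioContext() is p5's handle on the underlying Web Audio context;
    // after userStartAudio() it should report 'running'.
    console.log('AudioContext state:', getAudioContext().state);

    input = new p5.AudioIn();
    // Only connect the FFT once the mic stream has actually started.
    input.start(() => {
      fft = new p5.FFT();
      fft.setInput(input);
    }, (err) => {
      console.error('Microphone access failed:', err);
    });
  });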


Thanks for the clarification, and for the help. It’s a big change stepping into p5.js after only ever having used Processing.org, but I’m excited by the possibilities.

thanks again.