How to Add Audio to a React Three Fiber App
Published on 11 Jan, 2025 | ~6 min read
In this tutorial, we'll explore three different ways to add audio to a React Three Fiber scene.
Audio
First things first, set up a React Three Fiber project. To save time, you can use my React Three Fiber boilerplate.
Here's the codebase in case you decide to create the project yourself.
App.jsx:
import { Canvas, useThree } from '@react-three/fiber';
import { OrbitControls, GizmoHelper, GizmoViewport } from '@react-three/drei';
import { Color } from 'three';
import { useControls } from 'leva';

function UpdateSceneBackground() {
  const { scene } = useThree();
  const { color } = useControls({
    color: '#ffffff',
  });
  scene.background = new Color(color);
  return null;
}

function App() {
  return (
    <div id='canvas-container'>
      <Canvas camera={{ position: [0.5, 5, 6] }}>
        <UpdateSceneBackground />
        <axesHelper args={[10]} />
        <gridHelper args={[20, 20, 0xff22aa, 0x55ccff]} />
        <GizmoHelper alignment='bottom-right' margin={[80, 80]}>
          <GizmoViewport />
        </GizmoHelper>
        <OrbitControls />
      </Canvas>
    </div>
  );
}

export default App;
Also, ensure your audio file is in the public folder of your project directory.
Now, to add background audio to the scene, we'll use the Audio class from the core of the Three.js library, since there isn't a dedicated React component for this purpose (at least at the time of writing). Let's import the required classes.
import { Color, AudioListener, AudioLoader, Audio } from 'three';
We'll also need the useEffect() hook.
import { useEffect } from 'react';
Next, we'll create a new custom component where we'll access the camera and use the useEffect() hook.
function AudioComponent() {
  const { camera } = useThree();
  useEffect(() => {
  }, []);
  return null;
}
<Canvas camera={{ position: [0.5, 5, 6] }}>
  {/* ... helpers and gizmo */}
  <AudioComponent />
</Canvas>
Once that's done, in the useEffect() hook, we'll create an AudioListener instance and add it to the camera.
Before we dive into the code: what exactly is an AudioListener, and why does it need to be added to the camera? The best explanation I've found comes from the Unity docs:
"The Audio Listener acts as a microphone-like device. It receives input from any given Audio Source in the scene and plays sounds through the computer speakers. For most applications it makes the most sense to attach the listener to the Main Camera."
const listener = new AudioListener();
camera.add(listener);
The lines above create the AudioListener and add it to the camera, but we still haven't created an audio source and connected it to the listener. Let's do that now.
const sound = new Audio(listener);
At this point, our source is empty, so we'll need to use the AudioLoader to load the audio file and assign the buffered audio data to the source.
const audioLoader = new AudioLoader();
audioLoader.load('/sound.mp3', (buffer) => {
  sound.setBuffer(buffer);
  sound.setLoop(true);
  sound.setVolume(0.5);
});
So far, we've prepared the audio source. The final step is to play it. Browsers block audio until the user interacts with the page, so we'll start playback on the first click.
const handleClick = () => {
  sound.play();
};
window.addEventListener('click', handleClick);
Full snippet:
function AudioComponent() {
  const { camera } = useThree();
  useEffect(() => {
    const listener = new AudioListener();
    camera.add(listener);
    const sound = new Audio(listener);
    const audioLoader = new AudioLoader();
    audioLoader.load('/sound.mp3', (buffer) => {
      sound.setBuffer(buffer);
      sound.setLoop(true);
      sound.setVolume(0.5);
    });
    const handleClick = () => {
      // Only play once the buffer has loaded
      if (sound.buffer && !sound.isPlaying) sound.play();
    };
    window.addEventListener('click', handleClick);
    // Clean up on unmount so the handler and listener don't leak
    return () => {
      window.removeEventListener('click', handleClick);
      if (sound.isPlaying) sound.stop();
      camera.remove(listener);
    };
  }, [camera]);
  return null;
}
Positional Audio
Positional audio is audio whose volume depends on the distance between the source and the listener (which we attached to the camera). The closer the camera gets to the audio source, the louder the sound, and vice versa.

There are two ways to integrate this type of audio into the scene.
Method 1
The first method is to use the PositionalAudio class from Three.js.
So we'll import the PositionalAudio class from the core of the library, along with the useRef() hook.
import {
  Color,
  AudioListener,
  AudioLoader,
  Audio,
  PositionalAudio,
} from 'three';
import { useEffect, useRef } from 'react';
Next, we'll create a new component, just like we did earlier, except this time the component will return a box that serves as the source of the sound.
Basically, when we zoom in closer to the box, the volume will increase, and it will decrease as we zoom out.
function AudioComponent2() {
  const { camera } = useThree();
  useEffect(() => {
  }, []);
  return (
    <mesh>
      <boxGeometry />
      <meshNormalMaterial />
    </mesh>
  );
}
{/* <AudioComponent /> */}
<AudioComponent2 />
Since we'll attach the sound to the box, we need a reference to the mesh, so we'll create one outside the useEffect() hook.
const { camera } = useThree();
const audioRef = useRef();
Next, within useEffect(), we'll use the same code as in the earlier example, with a few minor adjustments.
useEffect(() => {
  const listener = new AudioListener();
  camera.add(listener);
  const sound = new PositionalAudio(listener);
  const audioLoader = new AudioLoader();
  audioLoader.load('/sound.mp3', (buffer) => {
    sound.setBuffer(buffer);
    sound.setLoop(true);
    sound.setVolume(0.5);
    sound.setRefDistance(5);
    // Attach the sound to the box so it plays from the mesh's position
    audioRef.current.add(sound);
  });
  const handleClick = () => {
    if (sound.buffer && !sound.isPlaying) sound.play();
  };
  window.addEventListener('click', handleClick);
  return () => {
    window.removeEventListener('click', handleClick);
    if (sound.isPlaying) sound.stop();
    camera.remove(listener);
  };
}, [camera]);
return (
  <mesh ref={audioRef}>
    <boxGeometry />
    <meshNormalMaterial />
  </mesh>
);
The first change is using a PositionalAudio instance instead of Audio.
The second change, or rather addition, is setRefDistance(). This method sets the distance from the audio source within which the audio plays at its original volume; past that distance, the volume falls off.
Finally, instead of overwriting the ref, we add the sound to the mesh through the ref, so the sound is actually emitted from the box's position and follows it.
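To get a feel for what setRefDistance() does, here's a small sketch of the "inverse" distance model that the underlying Web Audio panner uses by default. The inverseGain helper and its default parameters are illustrative, not an actual three.js API:

```javascript
// Gain from the Web Audio "inverse" distance model (the panner default).
// Inside refDistance the source plays at full volume; beyond it, the
// gain falls off, and faster for larger rolloffFactor values.
const inverseGain = (distance, refDistance = 5, rolloffFactor = 1) =>
  refDistance /
  (refDistance +
    rolloffFactor * (Math.max(distance, refDistance) - refDistance));

console.log(inverseGain(3)); // 1 (within refDistance: full volume)
console.log(inverseGain(10)); // 0.5 (5 / (5 + 5))
```

With setRefDistance(5), zooming anywhere within 5 units of the box keeps the volume at maximum, which matches what you hear in the scene.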
Method 2
Earlier in the article, I mentioned there's no audio component, but that's not entirely true. There's actually a Drei component for positional audio.
So once again, we'll import the component and create a custom component that returns a mesh.
import {
  OrbitControls,
  GizmoHelper,
  GizmoViewport,
  PositionalAudio as PositionalAudio2,
} from '@react-three/drei';
Note: I imported PositionalAudio as PositionalAudio2 because the name was already taken by the Three.js class we imported earlier.
function AudioComponent3() {
  const audioRef = useRef();
  const handleClick = () => {
  };
  return (
    <mesh onClick={handleClick}>
      <boxGeometry />
      <meshBasicMaterial color={0x00ff00} />
    </mesh>
  );
}
{/* <AudioComponent /> */}
{/* <AudioComponent2 /> */}
<AudioComponent3 />
As you can see here, I added a bit of variety by making the audio play when the mesh is clicked, instead of playing for the entire scene.
Now, we'll add the <PositionalAudio2 /> component to the mesh, with some self-explanatory props.
<mesh onClick={handleClick}>
  <boxGeometry />
  <meshBasicMaterial color={0x00ff00} />
  <PositionalAudio2
    url='/sound.mp3'
    distance={1}
    loop
    autoplay={false}
    ref={audioRef}
  />
</mesh>
Now that we've set up the component, we need to play the audio when the box is clicked and the audio file is loaded.
const handleClick = () => {
  if (audioRef.current) {
    audioRef.current.play();
  }
};
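As a small extension, you could make the same click pause the audio when it's already playing. This helper is a sketch; it only assumes the ref holds a three.js Audio-like object exposing isPlaying, play(), and pause(), which the ref forwarded by drei's PositionalAudio does:

```javascript
// Toggle playback on a three.js Audio-like object (isPlaying, play, pause).
const togglePlayback = (sound) => {
  if (!sound) return; // ref not populated yet (audio still loading)
  if (sound.isPlaying) {
    sound.pause();
  } else {
    sound.play();
  }
};

// Usage in the component: onClick={() => togglePlayback(audioRef.current)}
```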
Wrap Up
And that's it for this tutorial!
Remember, audio is a powerful tool that can take your interactive app to a whole new level, especially positional audio when used mindfully.
See you!