Here is my frontend code:
let audioFile = require("assets/hello.wav");
let blob = new Blob([audioFile], { type: "audio/wav" });
try {
  await customFetch(`${API_URL}/new-audio-message`, {
    method: "POST",
    body: JSON.stringify({
      audio: blob,
      cloneId: cloneId,
    }),
  });
} catch (error) {
  console.log(error);
}
Here is how I upload the file to S3:
const { audio } = JSON.parse(event.body);
const fileKey = `${sub}/${cloneId}/audio/${uuidv4()}.wav`;
const buffer = Buffer.from(JSON.stringify(audio));
try {
  await s3
    .putObject({
      Bucket: PUBLIC_BUCKET,
      Key: fileKey,
      Body: buffer,
    })
    .promise();
} catch (err) {
  console.error(err);
}
The file uploads to S3, but the file size is 155 B for every audio file, irrespective of the length of the audio.
CodePudding user response:
The issue is that the audio data is never actually converted to a buffer before being sent to S3. A Blob cannot be serialized with JSON.stringify (it has no enumerable properties, so it arrives at the backend as an empty object), and the line const buffer = Buffer.from(JSON.stringify(audio)) then just buffers that tiny JSON string. That is not the correct way to turn a Blob into a buffer, which is why every upload ends up the same small size regardless of the audio's length.
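You can see this directly in the browser console (the "hello" content is just an illustration):

const blob = new Blob(["hello"], { type: "audio/wav" });
// A Blob has no enumerable own properties, so it serializes to "{}"
console.log(JSON.stringify(blob)); // "{}"
// The request body therefore carries no audio bytes at all
console.log(JSON.stringify({ audio: blob, cloneId: "abc" })); // {"audio":{},"cloneId":"abc"}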
Updated frontend code
let audioFile = require("assets/hello.wav");
// Assumes the bundler is configured to import .wav files as binary data rather than a URL
let blob = new Blob([audioFile], { type: "audio/wav" });

const reader = new FileReader();
// Fires once the blob's bytes have been read into reader.result (an ArrayBuffer)
reader.onloadend = async () => {
  // Buffer in the browser relies on a polyfill from your bundler
  // (webpack 4 provides one automatically; webpack 5 does not)
  const buffer = Buffer.from(reader.result);
  try {
    await customFetch(`${API_URL}/new-audio-message`, {
      method: "POST",
      // JSON.stringify serializes a Buffer as { type: "Buffer", data: [...] }
      body: JSON.stringify({
        audio: buffer,
        cloneId: cloneId,
      }),
    });
  } catch (error) {
    console.log(error);
  }
};
reader.readAsArrayBuffer(blob);
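Because JSON.stringify uses Buffer's toJSON() method, the body that reaches the Lambda looks roughly like this (byte values are illustrative):

{
  "audio": { "type": "Buffer", "data": [82, 73, 70, 70, 36, 8, 0, 0] },
  "cloneId": "..."
}

so the handler needs to rebuild a real Buffer from the data array before passing it to putObject.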
Updated backend code
const { audio } = JSON.parse(event.body);
const fileKey = `${sub}/${cloneId}/audio/${uuidv4()}.wav`;
// JSON.parse yields { type: "Buffer", data: [...] }, so rebuild the Buffer
// from the byte array before uploading
const buffer = Buffer.from(audio.data);
try {
  await s3
    .putObject({
      Bucket: PUBLIC_BUCKET,
      Key: fileKey,
      Body: buffer,
    })
    .promise();
} catch (err) {
  console.error(err);
}
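To confirm the fix, you can check the stored object's size right after the upload; a minimal sketch using the same AWS SDK v2 client (the log line is only for illustration):

const head = await s3
  .headObject({ Bucket: PUBLIC_BUCKET, Key: fileKey })
  .promise();
// ContentLength should now match the real WAV size instead of ~155 B
console.log(`Uploaded ${fileKey}: ${head.ContentLength} bytes`);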