A class for creating and extracting tar archives with optional gzip compression.
Bun.Archive provides a fast, native implementation for working with tar archives. It supports creating archives from in-memory data or extracting existing archives to disk or memory.
Create an archive from an object:
const archive = Bun.Archive.from({
"hello.txt": "Hello, World!",
"data.json": JSON.stringify({ foo: "bar" }),
"binary.bin": new Uint8Array([1, 2, 3, 4]),
});
Optional compression: "gzip", true for gzip, or false/undefined for none
A promise that resolves with the archive data as a Blob
Get uncompressed tarball:
const blob = await archive.blob();
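Get a gzip-compressed tarball (a sketch, assuming the compression option above is passed directly as the argument):
// "gzip" per the compression parameter documented above
const gzipped = await archive.blob("gzip");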
Get the archive contents as a Uint8Array.
Optional compression: "gzip", true for gzip, or false/undefined for none
A promise that resolves with the archive data as a Uint8Array
Get uncompressed tarball bytes:
const bytes = await archive.bytes();
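Get gzip-compressed bytes and save them yourself (a sketch; assumes the same compression argument, and the output path is illustrative):
// "gzip" per the compression parameter documented above
const gzipped = await archive.bytes("gzip");
// Bun.write accepts a Uint8Array
await Bun.write("output.tar.gz", gzipped);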
Extract the archive contents to a directory on disk.
Creates the target directory and any necessary parent directories if they don't exist. Existing files will be overwritten.
The directory path to extract to
Optional extraction options
A promise that resolves with the number of entries extracted (files, directories, and symlinks)
Extract all entries:
const archive = Bun.Archive.from(tarballBytes);
const count = await archive.extract("./extracted");
console.log(`Extracted ${count} entries`);
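A common workflow is extracting a tarball read from disk (a sketch; the path is illustrative, and it assumes from() accepts raw tarball bytes as in the example above):
// Hypothetical path; Bun.file(...).bytes() reads the file into a Uint8Array
const data = await Bun.file("release.tar").bytes();
const count = await Bun.Archive.from(data).extract("./release");
console.log(`Extracted ${count} entries`);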
Get the archive contents as a Map of File objects.
Each file in the archive is returned as a File object with:
name: The file path within the archive
lastModified: The file's modification time from the archive
The standard Blob reading methods (text(), arrayBuffer(), stream(), etc.)
Only regular files are included; directories are not returned. File contents are loaded into memory, so for large archives consider using extract() instead.
Optional glob pattern(s) to filter files. Supports the same syntax as Bun.Glob, including negation patterns (prefixed with !). Patterns are matched against paths normalized to use forward slashes (/).
A promise that resolves with a Map where keys are file paths (always using forward slashes / as separators) and values are File objects
Get all files:
const entries = await archive.files();
for (const [path, file] of entries) {
console.log(`${path}: ${file.size} bytes`);
}
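Filter with glob patterns (a sketch; assumes the pattern(s) are passed as the argument to files(), and the patterns themselves are illustrative):
// Only JSON files, excluding anything under test/ (negation syntax per Bun.Glob, as noted above)
const jsonFiles = await archive.files(["**/*.json", "!test/**"]);
for (const [path, file] of jsonFiles) {
  console.log(`${path}: ${await file.text()}`);
}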
The input data for the archive: an object mapping archive paths to file contents (to create a new tarball), or the bytes of an existing tarball
A new Archive instance
From an object (creates new tarball):
const archive = Bun.Archive.from({
"hello.txt": "Hello, World!",
"nested/file.txt": "Nested content",
});
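You can then serialize the archive yourself, for example with Bun.write (a sketch; assumes the bytes() compression argument documented above, and the output path is illustrative):
// Uses the archive created above; "gzip" per the compression parameter
await Bun.write("hello.tar.gz", await archive.bytes("gzip"));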
Create and write an archive directly to disk in one operation.
This is more efficient than creating an archive and then writing it separately, as it streams the data directly to disk.
The file path to write the archive to
The input data for the archive (same as Archive.from())
Optional compression: "gzip", true for gzip, or false/undefined for none
A promise that resolves when the write is complete
Write uncompressed tarball:
await Bun.Archive.write("output.tar", {
"file1.txt": "content1",
"file2.txt": "content2",
});
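Write a gzip-compressed tarball (a sketch; assumes compression is passed as the third argument, matching the parameters listed above):
await Bun.Archive.write(
  "output.tar.gz", // illustrative path
  {
    "file1.txt": "content1",
    "file2.txt": "content2",
  },
  "gzip", // per the compression parameter; true would behave the same
);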
Input data for creating an archive. Can be an object mapping archive paths to file contents (to create a new tarball), or the bytes of an existing tarball.
Compression format for archive output.
"gzip" - Compress with gziptrue - Same as "gzip"false - Explicitly disable compression (no compression)undefined - No compression (default behavior when omitted)Both false and undefined result in no compression; false can be used to explicitly indicate "no compression" in code where the intent should be clear.