No native dependencies. No JVM. No DLLs. PDFluent runs on Lambda, Cloudflare Workers, and Fly.io straight from a single Rust binary.
```rust
use base64::{engine::general_purpose::STANDARD, Engine as _};
use lambda_runtime::{run, service_fn, Error, LambdaEvent};
use pdfluent::Document;
use serde::{Deserialize, Serialize};

#[derive(Deserialize)]
struct Request {
    /// Base64-encoded PDF bytes
    pdf_b64: String,
}

#[derive(Serialize)]
struct Response {
    page_count: usize,
}

async fn handler(event: LambdaEvent<Request>) -> Result<Response, Error> {
    // Decode the payload and count pages entirely in memory.
    let bytes = STANDARD.decode(&event.payload.pdf_b64)?;
    let doc = Document::open_bytes(&bytes)?;
    Ok(Response { page_count: doc.page_count() })
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    run(service_fn(handler)).await
}
```

Run `cargo add pdfluent` to get started.
PDFluent is written entirely in Rust. There are no C extensions, no JNI bridges, no shared libraries to bundle. The compiled binary is all you need. Zip it and deploy.
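As a sketch of that workflow (assuming a custom-runtime function whose release binary is named `my-function` — the name is illustrative), packaging is two commands. Lambda's `provided.*` runtimes look for an executable named `bootstrap` at the root of the archive:

```shell
# Rename the release binary to `bootstrap` and zip it for upload.
cp target/release/my-function bootstrap
zip function.zip bootstrap
```

The resulting `function.zip` can be uploaded directly via the console, the AWS CLI, or any IaC tool.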
Because there is no JVM or .NET CLR to initialize, Lambda cold starts are under 50 ms on arm64 with the provided.al2 runtime. Subsequent invocations are effectively instant.
The complete PDFluent binary and its dependencies compile to under 8 MB when stripped. Your entire Lambda deployment package stays well inside the 250 MB uncompressed limit.
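A size-focused release profile along these lines helps stay in that range (a sketch — exact sizes vary by toolchain version and enabled features):

```toml
# Cargo.toml — size-focused release profile
[profile.release]
strip = true        # strip debug symbols from the binary
lto = true          # enable link-time optimization
codegen-units = 1   # better optimization at the cost of compile time
opt-level = "z"     # optimize for size rather than speed
```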
PDFluent compiles to WASM for Cloudflare Workers. Process PDFs at the edge in 130+ locations worldwide without provisioning any server infrastructure.
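A minimal Worker sketch, assuming the community `worker` crate for the fetch handler and that PDFluent's `Document::open_bytes` is available on the `wasm32` target, might look like this:

```rust
use pdfluent::Document;
use worker::*;

// Respond to a POSTed PDF with its page count, computed at the edge.
#[event(fetch)]
async fn fetch(mut req: Request, _env: Env, _ctx: Context) -> Result<Response> {
    let bytes = req.bytes().await?;
    match Document::open_bytes(&bytes) {
        Ok(doc) => Response::ok(doc.page_count().to_string()),
        Err(_) => Response::error("not a valid PDF", 400),
    }
}
```

Deployment then follows the usual `wrangler` flow for Rust Workers.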
Build for aarch64-unknown-linux-musl and deploy to AWS Lambda arm64. Arm64 (Graviton2) functions are priced roughly 20% lower than their x86_64 equivalents at comparable performance.
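Cross-compiling for that target is a two-step sketch:

```shell
# Add the static arm64 target, then build a release binary for Lambda arm64.
rustup target add aarch64-unknown-linux-musl
cargo build --release --target aarch64-unknown-linux-musl
```

On an x86 host you may also need an aarch64 musl cross-linker (or a tool such as `cargo-zigbuild`) configured for the target; the details depend on your build environment.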
Any environment that runs a Linux binary works. No sidecar processes, no dependency installers. Drop the binary into a scratch container and ship it.
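One way to sketch that scratch container, assuming a service binary named `my-service` (the name is illustrative) built against the musl target for full static linking:

```dockerfile
# Two-stage build: compile a static binary, then ship only the binary.
FROM rust:1 AS build
WORKDIR /src
COPY . .
RUN rustup target add x86_64-unknown-linux-musl \
 && cargo build --release --target x86_64-unknown-linux-musl

FROM scratch
COPY --from=build /src/target/x86_64-unknown-linux-musl/release/my-service /my-service
ENTRYPOINT ["/my-service"]
```

The final image contains nothing but the binary itself — no shell, no libc, no package manager.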