Export files to Amazon S3 in Lua with LuaSocket

Exporting files to Amazon S3 using Lua might seem challenging since there is no official AWS SDK for Lua. However, you can integrate with S3 by leveraging Lua libraries to make direct REST API calls. In this post, we detail two approaches: the robust lua-resty-aws library (recommended for OpenResty users) and a pure Lua solution built on LuaSocket with a hand-rolled AWS Signature Version 4 implementation.
Installation and setup
Install the required dependencies based on your environment:
# For OpenResty users (recommended):
luarocks install lua-resty-aws
# For pure Lua environments:
luarocks install luasocket
luarocks install luasec # Provides HTTPS support (ssl.https)
luarocks install luaossl # Provides cryptographic primitives (openssl.digest, openssl.hmac)
AWS credential setup
- Create an IAM user with the appropriate S3 permissions.
- Generate and securely store your access keys.
- Create an S3 bucket with the needed configuration.
- Export your credentials as environment variables and read them in Lua:
local AWS_ACCESS_KEY = os.getenv("AWS_ACCESS_KEY_ID")
local AWS_SECRET_KEY = os.getenv("AWS_SECRET_ACCESS_KEY")
local AWS_REGION = "us-west-2"
local BUCKET_NAME = "your-bucket-name"
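Reading the keys at startup makes it easy to fail fast when they are missing. The guard below is a minimal sketch you can adapt to your own configuration strategy:
-- Hypothetical startup guard: abort early if credentials are not set
if not AWS_ACCESS_KEY or not AWS_SECRET_KEY then
  error("Missing AWS credentials: set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY")
end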
Using lua-resty-aws (recommended)
The lua-resty-aws library provides built-in AWS Signature Version 4 support, making it easier to interact with S3. The example below demonstrates how to upload a file using this library.
local AWS = require "resty.aws"
-- Initialize the AWS client with credentials
local aws, err = AWS.new({
aws_access_key = AWS_ACCESS_KEY,
aws_secret_key = AWS_SECRET_KEY,
aws_region = AWS_REGION
})
if not aws then
error("Failed to initialize AWS: " .. tostring(err))
end
local s3 = aws:S3()
local function upload_file(file_path, content_type)
local file, err = io.open(file_path, "rb")
if not file then
return nil, "Failed to open file: " .. tostring(err)
end
local content = file:read("*a")
file:close()
local key = file_path:match("([^/]+)$") -- Extract filename from path
local res, err = s3:putObject({
Bucket = BUCKET_NAME,
Key = key,
Body = content,
ContentType = content_type
})
if not res then
return nil, "Upload error: " .. tostring(err)
end
return res
end
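With the client configured, an upload is a single call. The snippet below is a usage sketch; the path and content type are placeholder values:
local res, err = upload_file("/tmp/report.pdf", "application/pdf")
if not res then
  error("Upload failed: " .. tostring(err))
end
print("Uploaded report.pdf to " .. BUCKET_NAME)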
Pure Lua implementation using LuaSocket
For environments without OpenResty, you can implement AWS Signature Version 4 signing manually. The following example demonstrates a complete upload function using LuaSocket (with LuaSec providing the HTTPS transport) and luaossl for the cryptographic operations.
-- LuaSec provides HTTPS, ltn12 handles request/response streaming, and luaossl supplies the crypto primitives
local https = require("ssl.https")
local ltn12 = require("ltn12")
local openssl_digest = require("openssl.digest")
local openssl_hmac = require("openssl.hmac")
-- Helper function to convert binary data to hexadecimal
local function tohex(str)
return (str:gsub('.', function(c) return string.format("%02x", string.byte(c)) end))
end
-- HMAC-SHA256 function
local function hmac_sha256(key, msg)
return openssl_hmac.new(key, "sha256"):final(msg)
end
-- Compute SHA256 hash and return hexadecimal
local function sha256_hex(data)
local digest = openssl_digest.new("sha256")
digest:update(data)
return tohex(digest:final())
end
-- Get timestamps in the required AWS format
local function get_amz_date()
return os.date("!%Y%m%dT%H%M%SZ")
end
local function get_datestamp()
return os.date("!%Y%m%d")
end
local function upload_file_alt(file_path, content_type)
local file = io.open(file_path, "rb")
if not file then
return nil, "Failed to open file"
end
local content = file:read("*a")
file:close()
local key = file_path:match("([^/]+)$")
local amz_date = get_amz_date()
local datestamp = get_datestamp()
local host = BUCKET_NAME .. ".s3." .. AWS_REGION .. ".amazonaws.com"
local payload_hash = sha256_hex(content)
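-- Note: object keys with characters outside A-Za-z0-9, '-', '.', '_', '~' and '/' must be URI-encoded for a valid SigV4 canonical URI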
local canonical_uri = "/" .. key
local canonical_querystring = ""
-- Construct canonical headers and signed headers
local canonical_headers =
"content-type:" .. content_type .. "\n" ..
"host:" .. host .. "\n" ..
"x-amz-content-sha256:" .. payload_hash .. "\n" ..
"x-amz-date:" .. amz_date .. "\n"
local signed_headers = "content-type;host;x-amz-content-sha256;x-amz-date"
-- Create canonical request
local canonical_request = table.concat({
"PUT",
canonical_uri,
canonical_querystring,
canonical_headers,
signed_headers,
payload_hash
}, "\n")
local request_hash = sha256_hex(canonical_request)
local credential_scope = datestamp .. "/" .. AWS_REGION .. "/s3/aws4_request"
local string_to_sign = table.concat({
"AWS4-HMAC-SHA256",
amz_date,
credential_scope,
request_hash
}, "\n")
-- Generate the Signing Key
local kDate = hmac_sha256("AWS4" .. AWS_SECRET_KEY, datestamp)
local kRegion = hmac_sha256(kDate, AWS_REGION)
local kService = hmac_sha256(kRegion, "s3")
local kSigning = hmac_sha256(kService, "aws4_request")
-- Compute signature
local signature = tohex(hmac_sha256(kSigning, string_to_sign))
local authorization = "AWS4-HMAC-SHA256 " ..
"Credential=" .. AWS_ACCESS_KEY .. "/" .. credential_scope .. ", " ..
"SignedHeaders=" .. signed_headers .. ", " ..
"Signature=" .. signature
local url = string.format("https://%s%s", host, canonical_uri)
local response_body = {}
local res, code = https.request{
url = url,
method = "PUT",
headers = {
["Host"] = host,
["Content-Type"] = content_type,
["Content-Length"] = tostring(#content),
["x-amz-date"] = amz_date,
["x-amz-content-sha256"] = payload_hash,
["Authorization"] = authorization,
},
source = ltn12.source.string(content),
sink = ltn12.sink.table(response_body)
}
if code ~= 200 and code ~= 204 then
return nil, "Upload failed (HTTP " .. tostring(code) .. "): " .. table.concat(response_body)
end
return true
end
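The signing logic is self-contained, so the function can be called directly. A usage sketch with a placeholder path:
local ok, err = upload_file_alt("/tmp/data.json", "application/json")
if not ok then
  print("Upload failed: " .. tostring(err))
end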
Error handling and retries
Implementing error handling with exponential backoff can improve the robustness of file uploads. The example below demonstrates a retry mechanism:
local function upload_with_retry(upload_func, file_path, content_type, max_retries)
local retries = 0
local wait_time = 1
while retries < max_retries do
local success, err = upload_func(file_path, content_type)
if success then
return true
end
retries = retries + 1
if retries == max_retries then
return nil, string.format("Failed after %d retries: %s", retries, err)
end
-- Exponential backoff with jitter
wait_time = wait_time * 2 * (0.5 + math.random())
os.execute(string.format("sleep %d", math.floor(wait_time))) -- floor avoids a format error for non-integer wait times (Lua 5.3+)
end
end
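The wrapper accepts either upload function defined above. For example, retrying the pure Lua uploader up to three times (the retry count here is an arbitrary choice):
local ok, err = upload_with_retry(upload_file_alt, "/tmp/data.json", "application/json", 3)
if not ok then
  print(err)
end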
Content type handling
Determining the correct MIME type for files is important when uploading to S3. Use the function below to map file extensions to content types:
local mime_types = {
["txt"] = "text/plain",
["html"] = "text/html",
["css"] = "text/css",
["js"] = "application/javascript",
["json"] = "application/json",
["png"] = "image/png",
["jpg"] = "image/jpeg",
["jpeg"] = "image/jpeg",
["gif"] = "image/gif",
["webp"] = "image/webp",
["pdf"] = "application/pdf",
["zip"] = "application/zip"
}
local function get_content_type(filename)
local ext = filename:match("%.([^%.]+)$")
return ext and mime_types[ext:lower()] or "application/octet-stream"
end
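For example:
print(get_content_type("report.pdf"))  --> application/pdf
print(get_content_type("archive.tar")) --> application/octet-stream (unknown extensions fall back to the default)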
Conclusion
Both approaches provide practical ways to export files to Amazon S3 using Lua. The lua-resty-aws method is recommended for its built-in AWS Signature Version 4 support and streamlined integration. The pure Lua implementation offers more insight into the signing process but requires managing AWS authentication manually. For a fully managed solution that simplifies file exports, you might consider exploring comprehensive services like Transloadit's file exporting service.