Fix: AWS Lambda Layer Not Working — Module Not Found or Layer Not Applied

FixDevs

Quick Answer

How to fix AWS Lambda Layer issues — directory structure, runtime compatibility, layer ARN configuration, dependency conflicts, size limits, and container image alternatives.

The Problem

A Lambda function using a Layer throws a module not found error:

Error: Cannot find module 'sharp'
Runtime.ImportModuleError: Unable to import module 'lambda_function': No module named 'pandas'

Despite the Layer being attached in the console, the module isn’t available.

Or the Layer is attached but Lambda uses the wrong version:

Error: Runtime.ImportModuleError: Unable to import module 'handler':
  /opt/python/lib/python3.9/site-packages/numpy/core/_multiarray_umath.cpython-39...
  ...incompatible with python3.11

Or a Layer created for one architecture (x86) fails on another (arm64):

Error: /opt/nodejs/node_modules/sharp/build/Release/sharp-linux-x64.node:
  invalid ELF header

Why This Happens

Lambda Layers have strict requirements that are easy to get wrong:

  • Wrong directory structure — dependencies must be in a specific path inside the layer zip. For Node.js: nodejs/node_modules/. For Python: python/ or python/lib/pythonX.Y/site-packages/. The wrong path means Lambda can’t find the packages.
  • Runtime mismatch — a Layer compiled for Python 3.9 won’t work on Python 3.11. Native extensions (C bindings like numpy, pandas, sharp) are compiled for a specific Python/Node version and CPU architecture.
  • Architecture mismatch — packages with native binaries compiled for x86_64 fail on arm64 (Graviton) Lambda functions and vice versa.
  • Layer not listed as compatible with the runtime — when creating a Layer, you declare its compatible runtimes. This is filtering metadata: the console hides the Layer from functions whose runtime isn’t listed, and while the CLI can still attach it, the packages inside may still fail at import time.
  • Layer order matters — Lambda extracts layers into /opt in the order they’re listed, with later layers overwriting earlier ones. If two layers contain the same package at the same path, the layer listed last wins.

Fix 1: Use the Correct Directory Structure

The directory structure inside the layer zip is mandatory:

Node.js layer zip structure:
nodejs/
  node_modules/
    sharp/
    axios/
    ...

Python layer zip structure:
python/
  lib/
    python3.11/
      site-packages/
        pandas/
        numpy/
        ...
  OR (shorter path, also works):
python/
  pandas/
  numpy/
  ...

Ruby layer:
ruby/
  gems/
    3.2.0/
      gems/
        ...

Java layer:
java/
  lib/
    my-library.jar

Build a Node.js Layer correctly:

# Create the correct directory structure
mkdir -p layer/nodejs

# Install packages INTO the layer directory
cd layer/nodejs
npm init -y
npm install sharp axios lodash

# Go up to layer/ and zip so the archive root is nodejs/
cd ..
zip -r ../layer.zip nodejs/

# WRONG — zipping node_modules/ directly drops the required nodejs/ prefix
# zip -r layer.zip node_modules/

Build a Python Layer correctly:

# Method 1: Install directly to the layer path
mkdir -p layer/python
pip install pandas numpy -t layer/python/

# Zip preserving the 'python/' directory
cd layer
zip -r ../layer.zip python/

# Method 2: Using pip with --target
pip install \
  --platform manylinux2014_x86_64 \
  --target=layer/python \
  --implementation cp \
  --python-version 3.11 \
  --only-binary=:all: \
  pandas numpy

cd layer
zip -r ../layer.zip python/
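
Whichever method you use, it’s worth confirming the archive actually carries the runtime prefix before publishing. A minimal sketch using only the standard library — the zip path and the accepted prefixes are assumptions based on the structures above:

```python
import zipfile

# Top-level prefixes Lambda expects for each runtime family
EXPECTED_PREFIXES = ("nodejs/", "python/", "ruby/", "java/")

def validate_layer_zip(path):
    """Return the layer's top-level prefix if every file sits under an
    expected runtime directory, otherwise raise ValueError."""
    with zipfile.ZipFile(path) as zf:
        names = [n for n in zf.namelist() if not n.endswith("/")]
        if not names:
            raise ValueError("zip is empty")
        for name in names:
            if not name.startswith(EXPECTED_PREFIXES):
                raise ValueError(
                    f"{name!r} is not under {EXPECTED_PREFIXES} — "
                    "Lambda will not find these packages"
                )
        return names[0].split("/", 1)[0]

if __name__ == "__main__":
    print(validate_layer_zip("layer.zip"))
```

If this raises on your zip, the fix is almost always re-zipping from one level higher so the `nodejs/` or `python/` directory is the archive root.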

Fix 2: Build for the Correct Architecture

Native packages (numpy, pandas, psycopg2, sharp) must be compiled for the Lambda runtime OS and architecture:

# Build for Linux x86_64 (default Lambda architecture)
# Use Docker to match Lambda's OS environment

# Node.js — build inside the Lambda Docker image
docker run --rm \
  --entrypoint /bin/sh \
  -v "$(pwd)":/workspace \
  -w /workspace/layer/nodejs \
  public.ecr.aws/lambda/nodejs:20 \
  -c "npm install sharp"

# Python — use pip's --platform flag for cross-compilation
pip install \
  --platform manylinux2014_x86_64 \
  --target=./layer/python \
  --implementation cp \
  --python-version 311 \
  --only-binary=:all: \
  pandas numpy psycopg2-binary

# For arm64 (Graviton Lambda functions)
pip install \
  --platform manylinux2014_aarch64 \
  --target=./layer/python \
  --implementation cp \
  --python-version 311 \
  --only-binary=:all: \
  pandas numpy
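
An architecture mismatch can also be caught offline: an ELF binary (a compiled .so or .node file) records its target CPU in the e_machine field at byte offset 18 of its header. A small sketch, with the file path left for you to fill in:

```python
import struct

# ELF e_machine values for the two Lambda architectures
E_MACHINE = {0x3E: "x86_64", 0xB7: "arm64"}

def elf_architecture(path):
    """Read the ELF header of a native binary and report its architecture."""
    with open(path, "rb") as f:
        header = f.read(20)
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF binary")
    # e_machine is a little-endian uint16 at offset 18
    (machine,) = struct.unpack_from("<H", header, 18)
    return E_MACHINE.get(machine, f"unknown (0x{machine:x})")
```

Run it against the unpacked layer’s native binaries; if it reports x86_64 but the function runs on arm64 (or vice versa), rebuild the layer with the matching --platform flag.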

Using AWS SAM or CDK to build layers:

# template.yaml (SAM)
Resources:
  DependenciesLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      ContentUri: layer/
      CompatibleRuntimes:
        - python3.11
      CompatibleArchitectures:
        - x86_64
    Metadata:
      BuildMethod: python3.11  # SAM builds this layer in a Lambda-compatible environment

# SAM builds the layer in a Lambda-compatible build container
sam build
sam deploy

Fix 3: Configure the Layer in Your Function

Attaching a layer in the console vs code vs CLI:

# AWS CLI — attach layer to existing function
aws lambda update-function-configuration \
  --function-name my-function \
  --layers arn:aws:lambda:us-east-1:123456789:layer:my-layer:3

# Attach multiple layers (order matters — later layers override earlier)
aws lambda update-function-configuration \
  --function-name my-function \
  --layers \
    arn:aws:lambda:us-east-1:123456789:layer:base-layer:1 \
    arn:aws:lambda:us-east-1:123456789:layer:my-overrides:2
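
Note that --layers (and the equivalent boto3 Layers parameter) replaces the function’s entire layer list, so attaching one layer can silently detach others. A hedged boto3 sketch that fetches the current list first and swaps in the new version — merged_layers and attach_layer are illustrative names, not AWS APIs:

```python
def merged_layers(current_arns, new_arn):
    """Append new_arn, dropping any older version of the same layer.
    A layer version ARN ends in ':<version>'; the part before it
    identifies the layer itself."""
    base = new_arn.rsplit(":", 1)[0]
    kept = [arn for arn in current_arns if arn.rsplit(":", 1)[0] != base]
    return kept + [new_arn]

def attach_layer(function_name, new_arn):
    # boto3 imported here so merged_layers stays usable without the AWS SDK
    import boto3
    lam = boto3.client("lambda")
    cfg = lam.get_function_configuration(FunctionName=function_name)
    current = [layer["Arn"] for layer in cfg.get("Layers", [])]
    lam.update_function_configuration(
        FunctionName=function_name,
        Layers=merged_layers(current, new_arn),
    )
```

Keeping the merge logic separate also preserves layer order, which matters when layers overlap.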

Terraform:

resource "aws_lambda_function" "my_function" {
  function_name = "my-function"
  handler       = "index.handler"
  runtime       = "nodejs20.x"
  filename      = "function.zip"
  role          = aws_iam_role.lambda_role.arn

  layers = [
    aws_lambda_layer_version.my_layer.arn,
  ]

  architectures = ["x86_64"]  # Must match layer architecture
}

resource "aws_lambda_layer_version" "my_layer" {
  filename            = "layer.zip"
  layer_name          = "my-dependencies"
  compatible_runtimes = ["nodejs20.x"]
  compatible_architectures = ["x86_64"]
}

CDK:

import * as lambda from 'aws-cdk-lib/aws-lambda';

const dependenciesLayer = new lambda.LayerVersion(this, 'DependenciesLayer', {
  code: lambda.Code.fromAsset('layer'),
  compatibleRuntimes: [lambda.Runtime.NODEJS_20_X],
  compatibleArchitectures: [lambda.Architecture.X86_64],
  description: 'Node.js dependencies',
});

const myFunction = new lambda.Function(this, 'MyFunction', {
  runtime: lambda.Runtime.NODEJS_20_X,
  architecture: lambda.Architecture.X86_64,
  handler: 'index.handler',
  code: lambda.Code.fromAsset('src'),
  layers: [dependenciesLayer],
});

Fix 4: Use Public Layers for Common Libraries

AWS and the community publish ready-made layers for common libraries:

# AWS SDK for JavaScript — a layer only matters for Node.js 16 and below
# (which bundled SDK v2); Node.js 18+ runtimes bundle AWS SDK v3 already
# Check available public layers:
aws lambda list-layers --compatible-runtime nodejs18.x

# Klayers (community Python layers by Keith Rozario)
# Pre-built for each Python version and region
# https://github.com/keithrozario/Klayers

# Get the ARN for pandas in us-east-1, Python 3.11
aws lambda list-layer-versions \
  --layer-name arn:aws:lambda:us-east-1:770693421928:layer:Klayers-p311-pandas \
  --query 'LayerVersions[0].LayerVersionArn'

# Use in your function
aws lambda update-function-configuration \
  --function-name my-function \
  --layers arn:aws:lambda:us-east-1:770693421928:layer:Klayers-p311-pandas:13

AWS Parameters and Secrets Lambda Extension (official layer):

# Add the Parameters and Secrets Extension layer
aws lambda update-function-configuration \
  --function-name my-function \
  --layers arn:aws:lambda:us-east-1:177933569100:layer:AWS-Parameters-and-Secrets-Lambda-Extension:11

Fix 5: Debug Layer Contents

Verify the layer is mounted correctly inside the Lambda environment:

# Python — debug handler to inspect the layer
import os
import sys

def handler(event, context):
    # List /opt directory (where layers are mounted)
    opt_contents = os.listdir('/opt')
    print("Contents of /opt:", opt_contents)

    # Check if python packages are available
    python_path = '/opt/python'
    if os.path.exists(python_path):
        print("Python layer contents:", os.listdir(python_path))

    # Check sys.path to see where Python looks for modules
    print("sys.path:", sys.path)

    # Try importing the module
    try:
        import pandas
        print("pandas version:", pandas.__version__)
    except ImportError as e:
        print("Import error:", e)

    return {"statusCode": 200}

// Node.js — debug handler
exports.handler = async (event) => {
  const fs = require('fs');
  const path = require('path');

  // /opt is where layers are mounted
  const optContents = fs.readdirSync('/opt');
  console.log('Contents of /opt:', optContents);

  // Node.js layer path
  const nodeModulesPath = '/opt/nodejs/node_modules';
  if (fs.existsSync(nodeModulesPath)) {
    console.log('Layer modules:', fs.readdirSync(nodeModulesPath));
  }

  // Try requiring the module
  try {
    const sharp = require('sharp');
    console.log('sharp version:', sharp.versions);
  } catch (e) {
    console.error('Require error:', e.message);
  }

  return { statusCode: 200 };
};

Fix 6: Switch to Container Images for Complex Dependencies

For very large dependencies (machine learning libraries, headless browsers) that exceed the 250MB layer limit, use container images:

# Dockerfile for Lambda container image
FROM public.ecr.aws/lambda/python:3.11

# Install large dependencies directly in the image (no size limit)
RUN pip install \
    pandas \
    numpy \
    scikit-learn \
    torch \
    --no-cache-dir

# Copy function code
COPY lambda_function.py ${LAMBDA_TASK_ROOT}/

CMD ["lambda_function.handler"]

# Build and push to ECR
aws ecr create-repository --repository-name my-lambda
docker build -t my-lambda .
docker tag my-lambda:latest 123456789.dkr.ecr.us-east-1.amazonaws.com/my-lambda:latest
docker push 123456789.dkr.ecr.us-east-1.amazonaws.com/my-lambda:latest

# Deploy Lambda from container image
aws lambda create-function \
  --function-name my-function \
  --package-type Image \
  --code ImageUri=123456789.dkr.ecr.us-east-1.amazonaws.com/my-lambda:latest \
  --role arn:aws:iam::123456789:role/lambda-role

Lambda container image limits: 10GB (vs 250MB unzipped for layers). Container images also support local testing with docker run.

Still Not Working?

Layer size limits — the quota is shared, not per layer: your function code plus all attached layers must stay under 250MB unzipped. Zip files uploaded directly are also capped at 50MB (upload via S3 for anything larger). For dependencies that can’t fit, use container images.
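
The combined limit can be checked locally before deploying, since a zip archive records each entry’s uncompressed size. A minimal sketch — the 250MB figure is the documented quota, and the file paths are placeholders:

```python
import zipfile

LIMIT_BYTES = 250 * 1024 * 1024  # 250MB unzipped: function code + all layers

def unzipped_size(zip_path):
    """Sum of the uncompressed sizes recorded in a zip archive."""
    with zipfile.ZipFile(zip_path) as zf:
        return sum(info.file_size for info in zf.infolist())

def within_limit(function_zip, layer_zips):
    """Return (total_bytes, fits) for a function zip plus its layer zips."""
    total = unzipped_size(function_zip) + sum(unzipped_size(z) for z in layer_zips)
    return total, total <= LIMIT_BYTES
```

Running this in CI catches the quota before a failed deploy does.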

/opt/python vs /opt/python/lib/python3.11/site-packages — both paths end up on sys.path, so packages in either location are importable. If conflicting copies of a package exist in both, path order decides: Lambda’s Python runtimes list /opt/python/lib/pythonX.Y/site-packages before /opt/python.

Layer permissions — layers can be private (your account only), shared with specific accounts, or public. If using a layer from another account, the layer owner must grant permission via aws lambda add-layer-version-permission.
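
Granting that permission can be scripted as well. A hedged boto3 sketch (share_layer and layer_permission_request are illustrative names; the underlying call is boto3’s add_layer_version_permission):

```python
def layer_permission_request(layer_name, version, account_id):
    """Build the arguments for sharing a layer version with another
    account; Principal '*' makes the layer public."""
    safe_id = account_id.replace("*", "everyone")
    return {
        "LayerName": layer_name,
        "VersionNumber": version,
        "StatementId": f"share-with-{safe_id}",
        "Action": "lambda:GetLayerVersion",
        "Principal": account_id,
    }

def share_layer(layer_name, version, account_id):
    # boto3 imported lazily so the builder above needs no AWS SDK
    import boto3
    lam = boto3.client("lambda")
    return lam.add_layer_version_permission(
        **layer_permission_request(layer_name, version, account_id)
    )
```

The consuming account still has to reference the layer by its full version ARN; the permission only makes that ARN resolvable.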

Updating a layer doesn’t update functions automatically — layers are versioned. When you publish a new layer version, you must update each Lambda function to use the new ARN (new version number). Functions keep using the old layer version until explicitly updated.

For related AWS Lambda issues, see Fix: AWS Lambda Import Module Error and Fix: AWS Lambda Timeout.

FixDevs

Solo developer based in Japan. Every solution is cross-referenced with official documentation and tested before publishing.
