Building a serverless application with Laravel, DynamoDB and React
Architecture & Technology Stack
The application consists of a React frontend that calls a REST API built with Laravel, deployed on AWS Lambda via Bref and the Serverless Framework, and backed by a DynamoDB table.

Requirements
Here is a list of tools that should be installed and configured on your development workstation:
- AWS CLI
- Node.js v12.x
- Serverless Framework
- PHP 7.4
- Laravel 8.x
I recommend downloading DynamoDB Local, so you can test your application locally before deploying to the cloud.
Finally, I suggest downloading NoSQL Workbench, which provides a simple graphical client for DynamoDB. It can connect to both the cloud and local instances of DynamoDB.
Source Code
The source code for both the backend API and the React frontend is available here, and you can clone it to your local computer:
git clone https://github.com/code-runner-2017/larvel-react-lambda.git
The bookstore/ directory contains the Laravel backend, and the bookstore-ui directory contains the React frontend.
Summary
What I’m doing in the next sections:
- creating a Laravel application, enriched with a package to support DynamoDB in Eloquent and Bref.sh for serverless deployment;
- creating a Books table in DynamoDB, mapping it with a Laravel model, adding some sample records to the table;
- creating a simple REST endpoint that returns all books in the DynamoDB table in JSON format;
- creating a simple React UI that consumes that API to show my book list;
- deploying everything to the AWS cloud.
Setting up the Laravel Backend
Let’s create a Laravel application named bookstore, add Bref, and add the “baopham/dynamodb” package, which enables Eloquent ORM mapping for DynamoDB:
laravel new bookstore
cd bookstore
composer require bref/bref bref/laravel-bridge
php artisan vendor:publish --tag=serverless-config
composer require baopham/dynamodb
Register the service provider:
// config/app.php
'providers' => [
...
BaoPham\DynamoDb\DynamoDbServiceProvider::class,
...
];
and finally run:
php artisan vendor:publish \
--provider=BaoPham\DynamoDb\DynamoDbServiceProvider
Now, let’s create a model that represents a book in our database:
php artisan make:model Book
and change the generated file to map a DynamoDB table by extending DynamoDbModel:
// File app/Models/Book.php:
namespace App\Models;

use Illuminate\Database\Eloquent\Factories\HasFactory;

class Book extends \BaoPham\DynamoDb\DynamoDbModel
{
    use HasFactory;
}
Let’s add a simple REST endpoint that returns all the available books:
// File routes/api.php:
Route::get('/books', function (Request $request) {
return App\Models\Book::all();
});
Creating and Populating a DynamoDB Table
Creating a new migration
Several options are available here to create the DynamoDB table. You might use the AWS console or use a CloudFormation script, for example. However, since I’m using Laravel, I decided to go with migrations. In this way, I can easily keep my local DynamoDB in sync with the cloud environments.
Unfortunately, the baopham package doesn’t include migration support, but I adapted the code found in this thread. First, I created a new migration:
php artisan make:migration DynamoTable
and then added the code that you can find here.
Preparing the Environment for Migration
I wanted to be able to run the same migration against both the local and cloud versions of DynamoDB. This is why I configured .env and .env.dev to point to two SQLite databases that track the migration status. This is how they’re configured:
# file: .env
APP_NAME=Laravel
APP_ENV=local
APP_KEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXX
APP_DEBUG=true
APP_URL=http://localhost
LOG_CHANNEL=slack
LOG_LEVEL=debug
DB_CONNECTION=sqlite
DB_DATABASE=database/database.local.sqlite
DYNAMODB_CONNECTION=local
DYNAMODB_LOCAL_ENDPOINT=http://localhost:8000
SESSION_DRIVER=array
and:
# file: .env.dev
APP_NAME=Laravel
APP_ENV=cloud
APP_KEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXXX
APP_DEBUG=true
APP_URL=http://localhost
LOG_CHANNEL=stderr
LOG_LEVEL=debug
DB_CONNECTION=sqlite
DB_DATABASE=database/database.cloud.sqlite
DYNAMODB_CONNECTION=aws_iam_role
SESSION_DRIVER=array
You won’t find the .env files in my GitHub repository, as they might contain sensitive information and should never be committed.
Finally, let’s create the two migration databases:
touch database/database.local.sqlite
touch database/database.cloud.sqlite
On Windows, you can use TYPE NUL > database\database... instead.
With this configuration, when I run migration commands, they run against the local DynamoDB instance. When I add --env dev, they run against the AWS account.
Using the migration
You can create and delete the table in your local DynamoDB instance at any time with:
php artisan migrate
php artisan migrate:rollback ## WARNING: all data will be lost!
If you want to create/remove the table in your AWS account, simply add --env dev to the above commands, e.g.:
php artisan --env dev migrate
Creating a database seeder
Quoting the Laravel documentation: “Laravel includes a simple method of seeding your database with test data using seed classes. All seed classes are stored in the database/seeders directory.”
Let’s modify the default DatabaseSeeder class to insert a few book entries into DynamoDB, replacing the body of the run() method as follows:
...
use BaoPham\DynamoDb\Facades\DynamoDb;
use Ramsey\Uuid\Uuid;
use App\Models\Book;
...

public function run() {
$records = [
[
'title' => 'The Grapes of Wrath',
'author_name' => 'John Steinbeck',
'cover_url' => 'https://i.gr-assets.com/images/S/compressed.photo.goodreads.com/books/1375670575l/18114322.jpg'
],
[
'title' => 'Love in the Time of Cholera',
'author_name' => 'Gabriel Garcia Marquez',
'cover_url' => 'https://i.gr-assets.com/images/S/compressed.photo.goodreads.com/books/1348243057l/9714.jpg'
],
[
'title' => 'The Post-American World',
'author_name' => 'Fareed Zakaria',
'cover_url' => 'https://i.gr-assets.com/images/S/compressed.photo.goodreads.com/books/1347716469l/2120783.jpg'
],
[
'title' => 'Leonardo da Vinci',
'author_name' => 'Walter Isaacson',
'cover_url' => 'https://i.gr-assets.com/images/S/compressed.photo.goodreads.com/books/1523543570l/34684622._SY475_.jpg'
],
];

foreach ($records as $record) {
$record['id'] = Uuid::uuid4()->toString();
$book = Book::create($record);
$book->save();
}
}
For each record, we generate a UUID v4, which will be the primary key, right before inserting it into the database. The full code is available here.
Let’s populate the DynamoDB table as follows:
php artisan db:seed # for the local DynamoDB
php artisan --env dev db:seed # for DynamoDB in AWS
If you need more data for your tests, you can invoke the seeder multiple times, creating duplicate entries, or you can write a better seeder that generates random data.
Local Testing
Since DynamoDB Local is already listening on port 8000, I’m running my local web server on port 8001 instead:
php -S localhost:8001 -t public
Let’s open ‘http://localhost:8001/api/books’ in the browser and view the results:
[
{
"author_name": "Gabriel Garcia Marquez",
"created_at": "2020-10-11T15:58:22.000000Z",
"cover_url": "https://i.gr-assets.com/images/S/compressed.photo.goodreads.com/books/1348243057l/9714.jpg",
"id": "b306753f-7e8e-4357-80f9-4cf833baf8b5",
"title": "Love in the Time of Cholera",
"updated_at": "2020-10-11T15:58:22.000000Z"
},
...
]
Since .env points to the local DynamoDB, the API is querying the local database.
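Each record carries the attributes set by the seeder plus the generated id and Eloquent’s created_at/updated_at timestamps. A quick shape check of one record (a plain JavaScript sketch, using a record shaped like the output above):

```javascript
// Sketch: verify that a record from /api/books has the expected attributes.
const record = {
  author_name: 'Gabriel Garcia Marquez',
  created_at: '2020-10-11T15:58:22.000000Z',
  cover_url: 'https://i.gr-assets.com/images/S/compressed.photo.goodreads.com/books/1348243057l/9714.jpg',
  id: 'b306753f-7e8e-4357-80f9-4cf833baf8b5',
  title: 'Love in the Time of Cholera',
  updated_at: '2020-10-11T15:58:22.000000Z',
};

const required = ['id', 'title', 'author_name', 'cover_url', 'created_at', 'updated_at'];
const ok = required.every((key) => key in record);
console.log(ok); // true
```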
Preparing for Deployment
Before deploying, I change the provider section in the serverless.yml file as follows:
provider:
  ...
  environment:
    APP_ENV: ${opt:stage, self:provider.stage}
    DYNAMODB_REGION: ${opt:region, self:provider.region, 'us-east-1'}
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
      Resource: "*"
In this way, the AWS Lambda function:
- is going to use .env.dev;
- is authorized to access DynamoDB.
Deploying to AWS Lambda
Now we’re ready to deploy to AWS Lambda:
composer install --prefer-dist --optimize-autoloader --no-dev
php artisan config:clear
sls deploy
Because Bref, under the hood, relies on CloudFormation, it might take a while. At the end, you get an output like the following:
service: laravel
stage: dev
...
endpoints:
ANY - https://XXXXXXX.execute-api.us-east-1.amazonaws.com/dev
ANY - https://XXXXXXX.execute-api.us-east-1.amazonaws.com/dev/{proxy+}
functions:
web: laravel-dev-web
artisan: laravel-dev-artisan
...
Save the first URL (this is the way you invoke your Lambda). If you append /api/books and open it in your browser, you should get the same JSON output we got locally. Since the deployed Lambda uses .env.dev, the query is run against the AWS DynamoDB service.
If you get an error or an empty output, make sure that you’ve run artisan migrate and artisan db:seed with the --env dev option.
Connecting the React App
You can now play around with the React frontend. If you cloned my repository, you’ll find it in the bookstore-ui subdirectory. Here are the steps:
cd bookstore-ui
npm install
cp .env.example .env
Edit the .env file to enter the URL printed on the console. The URL should end with /dev, without any trailing ‘/’. It should look like:
REACT_APP_API_URL=https://abcdef.execute-api.eu-west-1.amazonaws.com/dev
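A trailing slash would produce a double slash when the app appends /api/books. A small helper (hypothetical, not part of the repository) could normalize the configured value defensively:

```javascript
// Hypothetical helper: strip trailing slashes from the configured base URL
// so that appending '/api/books' never produces a double slash.
function apiUrl(base, path) {
  return base.replace(/\/+$/, '') + path;
}

console.log(apiUrl('https://abcdef.execute-api.eu-west-1.amazonaws.com/dev/', '/api/books'));
// https://abcdef.execute-api.eu-west-1.amazonaws.com/dev/api/books
```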
Now you can run the app with:
npm start
I’m using one of the many React patterns for fetching data from a REST API. In particular, App.js uses the higher-order component WithListLoading, which handles the state change while the list of books is loading and after it has loaded. Remember that the AJAX call is asynchronous and might take a while to complete; that’s why we need the WithListLoading component. Feel free to use any other data-fetching pattern that you like. By the way, this article was my source; I just adapted the code found there.
The List component, which is responsible for the actual rendering of the book list, is invoked only once the data is available.
App.js:
...
function App() {
  const ListLoading = withListLoading(List);
  const [appState, setAppState] = useState({
    loading: false,
    books: null,
  });

  useEffect(() => {
    setAppState({ loading: true });
    const apiUrl = process.env.REACT_APP_API_URL + '/api/books';
    axios.get(apiUrl).then((books) => {
      const allBooks = books.data;
      setAppState({ loading: false, books: allBooks });
    });
  }, [setAppState]);

  return (
    ...
    <div className='repo-container'>
      <ListLoading isLoading={appState.loading} books={appState.books} />
    </div>
    ...
  );
}
withListLoading.js:
function WithListLoading(Component) {
return function WithLoadingComponent({ isLoading, ...props }) {
if (!isLoading) return <Component {...props} />;
return (
<p style={{ textAlign: 'center', fontSize: '30px' }}>
Loading...
</p>
);
};
}
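The higher-order pattern can be illustrated without React at all: a wrapper that short-circuits to a loading message until the wrapped render function has data to work with (a simplified sketch of the idea, not the actual component):

```javascript
// Simplified, React-free sketch of the withListLoading idea: wrap a render
// function so it returns a loading message while data is still being fetched.
function withLoading(render) {
  return function ({ isLoading, ...props }) {
    if (isLoading) return 'Loading...';
    return render(props);
  };
}

// A render function analogous to the List component below.
const renderBooks = ({ books }) =>
  books.length === 0 ? 'No books, sorry' : books.map((b) => b.title).join(', ');

const listWithLoading = withLoading(renderBooks);
console.log(listWithLoading({ isLoading: true, books: [] }));  // Loading...
console.log(listWithLoading({ isLoading: false, books: [{ title: 'Leonardo da Vinci' }] })); // Leonardo da Vinci
```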
List.js:
const List = (props) => {
const { books } = props;
if (!books || books.length === 0) return <p>No books, sorry</p>;
return (
<ul>
<h2 className='list-head'>Available Books</h2>
{books.map((book) => {
return (
<li key={book.id} className='list'>
<div>
<span className='repo-title'>{book.title} </span>
(<span className='repo-author'>{book.author_name}</span>)
</div>
<img className='repo-img' src={book.cover_url} alt={book.title} />
</li>
);
})}
</ul>
);
};
Cleaning up
If you want to remove everything from AWS, including the database, you can run:
php artisan --env dev migrate:rollback ## WARNING: all data will be lost!
sls remove
Final Thoughts
If you’re considering building a real application with the same stack, you’ll need to take into account several aspects that I’ve omitted here, such as caching and authentication. Also, make sure you have a solid understanding of DynamoDB concepts, design patterns, and the pricing model; I know of projects that underestimated one or more of these.
One limitation that I found when mapping DynamoDB tables to Eloquent is that it’s not easy to set the table name at runtime. Since all DynamoDB tables share the same namespace within an AWS account, a common practice is to use a suffix or prefix, such as ‘_dev’, ‘_test’, or ‘_production’, to distinguish environments. With Eloquent, the only practical option is to use separate AWS accounts for the different environments.
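The naming convention itself is simple. As an illustration only (a hypothetical sketch, precisely the kind of runtime table selection that the Eloquent mapping makes difficult):

```javascript
// Hypothetical sketch: derive a per-environment DynamoDB table name using
// the suffix convention described above ('_dev', '_test', '_production').
function tableName(base, env) {
  return env ? `${base}_${env}` : base;
}

console.log(tableName('books', 'dev'));        // books_dev
console.log(tableName('books', 'production')); // books_production
```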
If you’re building a commercial application based on serverless Laravel, I recommend Laravel Vapor.