Java URL Encoder/Decoder Example – In this tutorial we will see how to URL encode/decode attributes in Java.
Overview
URL encoding is a mechanism for encoding information in a Uniform Resource Identifier (URI) under certain circumstances.
When we submit an HTML form using a GET or POST request, the data/fields in the form are encoded using the application/x-www-form-urlencoded type. There is an optional attribute on the <form> element called enctype. Its default value is application/x-www-form-urlencoded. This specification defines how the values are encoded (e.g. replacing a space with +) and decoded on the server.
When HTML form data is sent in an HTTP GET request, it is included in the query component of the request URI. When sent in an HTTP POST request, the data is placed in the request body, and application/x-www-form-urlencoded is included in the message’s Content-Type header.
Encode URL in Java
To encode the query parameters or attributes, Java provides a utility class URLEncoder with a couple of encode() methods.
The URLEncoder.encode() method escapes all characters except the following:
The letters a–z and A–Z and the digits 0–9 remain the same
The special characters ., -, * and _ remain the same
The space character is converted into a plus sign (+)
All other characters are converted into one or more %xy hex sequences
Java also provides the utility class URLDecoder to decode the value. Usually on the server side, when we receive the form attributes, we can use URLDecoder to decode them back to the original values.
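Here is a minimal, self-contained sketch of both classes in action (the string values are arbitrary):

import java.net.URLDecoder;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class UrlEncodeExample {
    public static void main(String[] args) {
        // Encode a query parameter value containing a space and an ampersand
        String encoded = URLEncoder.encode("hello world & more", StandardCharsets.UTF_8);
        System.out.println(encoded); // hello+world+%26+more

        // Decode it back to the original value, as a server would
        String decoded = URLDecoder.decode(encoded, StandardCharsets.UTF_8);
        System.out.println(decoded); // hello world & more
    }
}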
Java provides the utility classes URLEncoder and URLDecoder to encode/decode text using URL encoding for transferring data over HTTP. In this tutorial we saw how to use the Java URL encoder and decoder.
Show Multiple Examples in OpenAPI – The OpenAPI (aka Swagger) Specification has become a de facto standard for documenting and sharing REST APIs. When using OpenAPI it is always best practice to add as much detail as we can. The API specification should be built from the API consumer's perspective. The DX, or developer experience, is important when developing the API.
To improve the API experience we must define attributes with descriptions and examples. It is also possible to define multiple examples to show different ways the API can be consumed/requested.
First, let us see how the swagger editor (editor.swagger.io) shows multiple examples. The examples are shown in a dropdown where the user can choose and see the appropriate request payload. sample1 and sample2 are two examples for the Pet store API.
Adding Multiple Examples in OpenAPI
To add multiple examples in OpenAPI, we can define the examples attribute as shown below. Notice how we defined sample1 and sample2. You can give any meaningful name relevant to your API.
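The original snippet is not reproduced here; below is a minimal sketch of a request-level examples block (the Pet schema reference and payloads mirror the response example later in this post):

requestBody:
  content:
    application/json:
      schema:
        $ref: '#/components/schemas/Pet'
      examples:
        sample1:
          value:
            name: Cupcake
            tag: Chihuahua
        sample2:
          value:
            name: Prince
            tag: Poodle

In OpenAPI, we can also provide an example at the attribute level. It is good to define an attribute example (e.g. petType) so the consumer of the API knows what to pass or what to expect from the attribute. However, it is also a good idea to provide examples at the broader request/response level.
The request/response level example provides much broader context to the API consumer and also helps document the API better. Furthermore, many mock tools can generate mock responses from the examples provided in the Swagger file.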
Multiple Examples in API Response
Multiple examples work with both the API request and response. Similar to what we did above, the same can be specified for the API response. In the screenshot below we can see how the swagger editor shows multiple response examples.
openapi.yaml with examples in response
paths:
  /pets:
    get:
      description: Returns all pets
      operationId: findPets
      parameters:
        - name: tags
          in: query
          description: tags to filter by
          required: false
          style: form
          schema:
            type: array
            items:
              type: string
        - name: limit
          in: query
          description: maximum number of results to return
          required: false
          schema:
            type: integer
            format: int32
      responses:
        '200':
          description: pet response
          content:
            application/json:
              schema:
                type: array
                items:
                  $ref: '#/components/schemas/Pet'
              examples:
                sample1:
                  value:
                    name: Cupcake
                    tag: Chihuahua
                sample2:
                  value:
                    name: Prince
                    tag: Poodle
        default:
          description: unexpected error
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Error'
Hope this little trick will make your API documentation awesome :-)
Local WordPress using Docker – Running a local WordPress development environment is crucial for testing themes and plugins before we push them to a staging or production environment. To run WordPress locally, we need to install and set up PHP, MySQL (or MariaDB) and WordPress, which is not straightforward.
Docker provides an ideal way of setting up a local WordPress development environment (on Windows you might prefer WAMP). Using simple docker commands we can quickly spin up a new environment where we can test WordPress themes and plugins. Assuming you have already set up Docker on your machine, starting WordPress is quite rapid.
Since we are running WordPress in Docker, the same setup will work on Windows, Mac and Linux.
Local WordPress Setup with Docker
Let us see how to run a local WordPress setup for development using Docker.
Set up Docker Compose for WordPress. Start up a command line terminal and create a wp-local folder.
$ mkdir wp-local && cd wp-local
$ touch docker-compose.yml
To set up the WordPress + MySQL + phpMyAdmin images in Docker Compose, copy the following content into docker-compose.yml.
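The compose file from the original post is not reproduced here; below is a minimal sketch consistent with the description that follows (service names db, wordpress and phpmyadmin; wordpress/wordpress credentials; ports 80 and 8080; the upload.ini mapping is explained in the bonus section):

version: '3.8'

services:
  db:
    image: mysql:5.7
    environment:
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: wordpress
      MYSQL_RANDOM_ROOT_PASSWORD: '1'
    volumes:
      - db_data:/var/lib/mysql

  wordpress:
    image: wordpress:latest
    depends_on:
      - db
    ports:
      - "80:80"
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: wordpress
      WORDPRESS_DB_NAME: wordpress
    volumes:
      - ./upload.ini:/usr/local/etc/php/conf.d/uploads.ini

  phpmyadmin:
    image: phpmyadmin/phpmyadmin
    depends_on:
      - db
    ports:
      - "8080:80"
    environment:
      PMA_HOST: db

volumes:
  db_data: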
In the above docker-compose.yml file we are creating 3 containers: mysql, wordpress and phpmyadmin. The wordpress container exposes WordPress at port 80. Similarly, phpmyadmin is exposed at port 8080. Both wordpress and phpmyadmin depend on the db container, which runs the MySQL image.
Docker Compose UP
Save the docker-compose.yml file and run the docker-compose up command to create and start the Docker containers with WordPress, MySQL and phpMyAdmin.
$ docker-compose up -d
When run for the first time, Docker will build the stack and download all the images, so it might take a while. Subsequent invocations will be almost instant.
Setup WordPress
Once docker-compose has finished, open up the browser and go to http://localhost
We can now start the local WordPress setup. Enter the Site Title, Username and Password and press Install WordPress.
Once the WordPress setup is completed, log in using the username/password provided in the previous step and you will be greeted with the WordPress Dashboard.
phpMyAdmin Setup
Since we also set up phpMyAdmin in our Docker compose file, we can log in to phpMyAdmin (at http://localhost:8080) to view/update the WordPress database.
The username/password for phpMyAdmin is the value of the WORDPRESS_DB_USER and WORDPRESS_DB_PASSWORD environment variables used in the Docker compose file.
Username: wordpress
Password: wordpress
Bonus: Increase File Upload Size in Local WordPress Docker
If you try to import settings from an existing WordPress site once you start your local WordPress Docker container, you will realise the default max upload size is 2 MB.
To increase the upload file size, we can specify a custom php.ini file (in our case upload.ini) and set up the Docker compose file to copy it into the container.
Create a file upload.ini in the same folder as docker-compose.yml
$ touch upload.ini
Add the following in upload.ini to change the upload_max_filesize.
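The original file contents are not shown; typical values look like the following (the exact limits are assumptions, adjust to taste):

file_uploads = On
memory_limit = 256M
upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 600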
JSON Web Token (JWT) is an open standard that defines a compact and self-contained way of securely transmitting information between parties as a JSON object. This information can be verified and trusted because it is digitally signed. JWTs can be signed using a secret (with the HMAC algorithm) or a public/private key pair using RSA or ECDSA.
Although JWTs can be encrypted to also provide secrecy between parties, we will focus on signed tokens. Signed tokens can verify the integrity of the claims contained within them, while encrypted tokens hide those claims from other parties. When tokens are signed using public/private key pairs, the signature also certifies that only the party holding the private key could have signed them.
1.1 What is the JSON Web Token (JWT) Structure?
JWT tokens consist of 3 parts separated by a period ( . ).
These parts are:
Header – metadata about the token, such as the signing algorithm (alg) and the token type (typ)
Payload – the claims, i.e. the data being transmitted
Signature – used to verify that the token was not tampered with
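The listing that generates the (unsigned) JWT is not reproduced above; the following is a minimal sketch using the JJWT library that matches the steps described next:

import io.jsonwebtoken.Jwts;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Date;
import java.util.UUID;

Instant now = Instant.now();

// Build an unsigned JWT (the alg header will be "none")
String jwtToken = Jwts.builder()
        .claim("name", "Jane Doe")
        .claim("email", "jane@example.com")
        .setSubject("jane")
        .setId(UUID.randomUUID().toString())
        .setIssuedAt(Date.from(now))
        .setExpiration(Date.from(now.plus(5L, ChronoUnit.MINUTES)))
        .compact();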
The above code to generate the JWT is pretty self-explanatory; however, let's check step by step how we are generating the JWT token:
Add claims name and email with values Jane Doe and jane@example.com respectively
Add the subject in the JWT token with value jane
Set the Id for the JWT token using a randomly generated GUID
Set issued at to the current time
Set expiration to the current time plus 5 minutes, so the JWT is valid for only 5 minutes
The JWT generated above is not signed (check the algorithm alg attribute in the header). We have just encoded the claims in JSON format. If using JWT for authentication or authorization, it is advisable to sign the JWT so it can be verified.
4. Validate/Parse JWT Token
To validate or parse the JWT token, the Jwts.parserBuilder() method is used.
While parsing the JWT token we need to pass the signing key to verify the JWT signature. Let us see how to sign the JWT token using different algorithms.
5. Create and Validate JWT Token Signed using HMAC Secret
The simplest way of creating a signed JWT token is by using an HMAC secret. HMAC stands for hash-based message authentication code; it is a message authentication code built on a cryptographic hash function. It is used to simultaneously verify both the data integrity and the authenticity of a token.
5.1 Create JWT Token signed with HMAC
To create a JWT token signed with an HMAC shared secret, we need to specify the signing key using the .signWith() method.
import io.jsonwebtoken.Jwts;
import io.jsonwebtoken.SignatureAlgorithm;
import javax.crypto.spec.SecretKeySpec;
import java.security.Key;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Base64;
import java.util.Date;
import java.util.UUID;

// ...
// Key is hardcoded here for simplicity.
// Ideally this will be loaded from env configuration or a secret vault.
String secret = "asdfSFS34wfsdfsdfSDSD32dfsddDDerQSNCK34SOWEK5354fdgdf4";
Key hmacKey = new SecretKeySpec(Base64.getDecoder().decode(secret),
SignatureAlgorithm.HS256.getJcaName());
Instant now = Instant.now();
String jwtToken = Jwts.builder()
.claim("name", "Jane Doe")
.claim("email", "jane@example.com")
.setSubject("jane")
.setId(UUID.randomUUID().toString())
.setIssuedAt(Date.from(now))
.setExpiration(Date.from(now.plus(5L, ChronoUnit.MINUTES)))
.signWith(hmacKey)
.compact();
To validate/parse a JWT token generated using the HMAC shared secret, the same steps can be applied. We need to use the setSigningKey() method to set the key before we parse the JWT token.
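A minimal sketch, reusing the hmacKey and jwtToken from above:

import io.jsonwebtoken.Claims;
import io.jsonwebtoken.Jws;

// Verifies the signature and parses the claims in one step
Jws<Claims> jws = Jwts.parserBuilder()
        .setSigningKey(hmacKey)
        .build()
        .parseClaimsJws(jwtToken);
System.out.println(jws.getBody().getSubject()); // jane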
If the JWT signature does not match (for example, the token was tampered with or signed with a different key), the parseClaimsJws() method will throw a SignatureException, as shown below. An expired token (an exp claim value less than the current system time) instead results in an ExpiredJwtException.
Exception in thread "main" io.jsonwebtoken.security.SignatureException: JWT signature does not match locally computed signature. JWT validity cannot be asserted and should not be trusted.
at io.jsonwebtoken.impl.DefaultJwtParser.parse(DefaultJwtParser.java:411)
at io.jsonwebtoken.impl.DefaultJwtParser.parse(DefaultJwtParser.java:541)
at io.jsonwebtoken.impl.DefaultJwtParser.parseClaimsJws(DefaultJwtParser.java:601)
at io.jsonwebtoken.impl.ImmutableJwtParser.parseClaimsJws(ImmutableJwtParser.java:173)
at net.viralpatel.jwt.JWTGenerator.parseJwt(JWTGenerator.java:32)
at net.viralpatel.jwt.JWTGenerator.main(JWTGenerator.java:20)
6. Create and Validate JWT Token Signed using RSA Private Key
When using JWT tokens for microservice authentication/authorization, it is advisable to sign with RSA private/public keys instead of using a shared HMAC secret. The token is generated and signed by a central authority (usually an Authorization Server) and each microservice can validate the JWT token using the public key exposed by the Authorization Server.
Before we see how to generate a JWT token with a private/public key, let us see how to generate a private and public RSA key pair.
6.1 Generate Private and Public RSA Key
Generate an RSA private key, of size 2048, and output it to a file named key.pem:
$ openssl genrsa -out key.pem 2048
Generating RSA private key, 2048 bit long modulus
..........+++
..........................................................................+++
e is 65537 (0x10001)
Extract the public key from the key pair, which can be used in a certificate:
$ openssl rsa -in key.pem -outform PEM -pubout -out public.pem
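The original listing for creating the RSA-signed JWT is not reproduced here; below is a minimal sketch consistent with the surrounding description. The getPrivateKey() helper and the PEM handling are assumptions; note that Java's KeyFactory expects PKCS#8, so a key from openssl genrsa must first be converted (e.g. openssl pkcs8 -topk8 -inform PEM -in key.pem -outform PEM -nocrypt):

import io.jsonwebtoken.Jwts;
import java.security.KeyFactory;
import java.security.PrivateKey;
import java.security.spec.PKCS8EncodedKeySpec;
import java.util.Base64;

// Hypothetical helper; in production load the key from an env variable or a secret vault.
private static PrivateKey getPrivateKey() throws Exception {
    // Base64 body of the PKCS#8 PEM file, without the BEGIN/END lines (hardcoded for simplicity)
    String pem = "...";
    byte[] der = Base64.getMimeDecoder().decode(pem);
    return KeyFactory.getInstance("RSA").generatePrivate(new PKCS8EncodedKeySpec(der));
}

String jwtToken = Jwts.builder()
        .setSubject("jane")
        .signWith(getPrivateKey()) // JJWT infers an RSA signature algorithm from the key
        .compact();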
For simplicity the private key is hardcoded in the above example. However, in a real production system it would be loaded from an environment variable or a secret vault (HashiCorp Vault or AWS Parameter Store).
In the above example the method getPrivateKey() gets the java.security.PrivateKey, which is then used in Jwts.builder to sign the JWT token with the private key.
6.3 Validate/Parse JWT Token signed with RSA Private/Public Keys
Next, let us validate and parse the JWT signed using RSA. For that we will need the public key as a java.security.PublicKey instance, which Jwts.parserBuilder will use to validate the JWT.
In the above example, we have hardcoded the public key. In a production system, it is usually configured through an environment variable or service configuration.
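A minimal sketch, mirroring the private-key helper above (the getPublicKey() helper is an assumption; public.pem produced by openssl rsa -pubout is an X.509/SubjectPublicKeyInfo PEM):

import io.jsonwebtoken.Claims;
import io.jsonwebtoken.Jws;
import io.jsonwebtoken.Jwts;
import java.security.KeyFactory;
import java.security.PublicKey;
import java.security.spec.X509EncodedKeySpec;
import java.util.Base64;

// Hypothetical helper; hardcoded for simplicity.
private static PublicKey getPublicKey() throws Exception {
    // Base64 body of public.pem, without the BEGIN/END lines
    String pem = "...";
    byte[] der = Base64.getMimeDecoder().decode(pem);
    return KeyFactory.getInstance("RSA").generatePublic(new X509EncodedKeySpec(der));
}

Jws<Claims> jws = Jwts.parserBuilder()
        .setSigningKey(getPublicKey())
        .build()
        .parseClaimsJws(jwtToken);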
7. Source Code – Generate and Validate JWT Tokens using Java & JJWT
Source code for creating and validating JWT tokens in Java.
GraphQL Subscription provides a great way of building real-time APIs. In this tutorial we will build a realtime GraphQL subscription API using Spring Boot.
1. Overview
Spring Boot GraphQL Subscription – GraphQL provides an alternate way of building APIs. Similar to REST, with GraphQL we can query data from the server/backend and also update data. These are called Query and Mutation in GraphQL. Check out the tutorial on Getting Started with GraphQL with Spring Boot, in which both Query and Mutation are discussed in detail.
Oftentimes clients want to get pushed updates from the server when data they care about changes. GraphQL Subscription provides this capability: a long-lived stream of source events that the client can subscribe to.
2. GraphQL Subscription
GraphQL subscriptions are a way to push data from the server to the clients that choose to listen to real time messages from the server. Subscriptions are similar to queries in that they specify a set of fields to be delivered to the client, but instead of immediately returning a single answer, a result is sent every time a particular event happens on the server.
When using Subscription, the client connects to the GraphQL server using a websocket. A persistent websocket connection is established at the beginning and used for as long as the client wants to consume messages and the server is available. Each time a new message is available, the server pushes it to the client over the websocket connection.
Designing a realtime API using GraphQL subscription requires careful consideration. Unlike Query and Mutation, which are stateless operation types, Subscription is a stateful operation. A stateful connection using WebSocket is established with the server. The server should be designed to handle multiple connections and scale as new connections arrive. Usually, to achieve this, the server can use a messaging service such as RabbitMQ, Redis Pub/Sub or Kafka.
3. Spring Boot GraphQL Subscription API
For this tutorial, we are going to mock the messaging server and send mock data to the client through Subscription. Let us start with initializing the Spring Boot app.
The next step would be to create a Spring Boot app with GraphQL support. Let us scaffold the app with start.spring.io using the following options:
The GraphQL schema can be written using GraphQL SDL (Schema Definition Language). Following is our app’s GraphQL schema.
The schema definition file is under the src/main/resources directory. The GraphQL Spring Boot starter will read it and configure it using graphql-java-tools.
src/main/resources/schema.graphqls
type Query {
stockDetail(symbol: String): StockDetail
}
type Subscription {
stockPrice(symbol: String): StockPrice
}
type StockDetail {
symbol: String,
name: String,
marketCap: String
}
type StockPrice {
symbol: String
price: String
timestamp: String
}
The schema consists of the GraphQL Query and Subscription default operation types. It also contains the user-defined types StockDetail and StockPrice.
The Query stockDetail returns the type StockDetail and is used like a GET API to fetch the stock detail once when called. The Subscription method stockPrice returns the type StockPrice. Since it is a Subscription method, instead of returning the result once, the client subscribes to it and listens for changes from the server. The server will push changes in realtime whenever new data is available.
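For illustration, a client would subscribe with a query like the following (the symbol value is arbitrary):

subscription {
  stockPrice(symbol: "AAPL") {
    symbol
    price
    timestamp
  }
}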
3.3 Query Resolver
The Spring Boot GraphQL starter expects an instance of GraphQLQueryResolver on the classpath when the app boots.
The custom QueryResolver class defines the method stockDetail, which matches the GraphQL SDL we defined above. This method will be invoked whenever the client calls the Query method stockDetail.
The response is just a hardcoded value of stock symbol, name and market cap; see the sketch below. Instead of hardcoding, these values could come from a database or any backend API.
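The original QueryResolver listing is not shown above; a minimal sketch consistent with the schema (the StockDetail model class, its constructor and the hardcoded values are assumptions):

package net.viralpatel.springbootgraphqlsubscription;

import graphql.kickstart.tools.GraphQLQueryResolver;
import net.viralpatel.springbootgraphqlsubscription.model.StockDetail;
import org.springframework.stereotype.Component;

@Component
public class QueryResolver implements GraphQLQueryResolver {

    // Invoked when a client calls the stockDetail query.
    // Values are hardcoded for the demo; in a real app they would
    // come from a database or a backend API.
    public StockDetail stockDetail(String symbol) {
        return new StockDetail(symbol, "Example Corp", "2.5T");
    }
}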
3.4 Subscription Resolver
Similar to the QueryResolver, a custom SubscriptionResolver class is required on the classpath when using the Subscription type in GraphQL SDL.
SubscriptionResolver.java
package net.viralpatel.springbootgraphqlsubscription;
import graphql.kickstart.tools.GraphQLSubscriptionResolver;
import net.viralpatel.springbootgraphqlsubscription.model.StockPrice;
import org.reactivestreams.Publisher;
import org.springframework.stereotype.Component;
import reactor.core.publisher.Flux;
import java.time.Duration;
import java.time.LocalDateTime;
import java.util.Random;
@Component
public class SubscriptionResolver implements GraphQLSubscriptionResolver {

    public Publisher<StockPrice> stockPrice(String symbol) {
        Random random = new Random();
        return Flux.interval(Duration.ofSeconds(1))
                .map(num -> new StockPrice(symbol, random.nextDouble(), LocalDateTime.now()));
    }
}
The custom SubscriptionResolver class contains the stockPrice method, which matches the GraphQL SDL we defined earlier. This method will be invoked whenever a client subscribes to it. The client establishes a websocket connection with the server, and the stockPrice() method returns a Project Reactor Publisher instance. The Publisher instance does not return the result immediately; rather, it sends a result whenever the server pushes one. This is the same non-blocking way Spring Boot Webflux pushes results.
For simplicity, we have hardcoded the result. The stockPrice() method returns a random value every second using the Flux.interval method. Instead of hardcoding, these results could also come from a Kafka topic or a RabbitMQ queue.
Spring Boot Webflux DynamoDB Integration tests – In this tutorial we will see how to set up integration tests for a Spring Boot Webflux project with DynamoDB using Testcontainers. This post uses the example from the previous Spring Boot Webflux DynamoDB tutorial. Let us add integration tests using Testcontainers and integrate DynamoDB tests with our Spring Boot project.
2. What is Testcontainers?
Testcontainers is a Java library that supports JUnit tests, providing lightweight, throwaway instances of common databases, Selenium web browsers, or anything else that can run in a Docker container.
The key here is Docker containers. It allows us to use any Docker image and run it as a container within our integration tests. There are a number of standard Testcontainers modules available to test integration with technologies such as Elasticsearch, Kafka, RabbitMQ etc. We can also run any Docker image using the Testcontainers GenericContainer API.
One key thing to remember while using Testcontainers is that a local Docker installation is required to run these test cases. If you are running your integration tests in continuous integration (CI) pipelines, remember to use a machine instance instead of a Docker executor. As Testcontainers needs Docker to run, we cannot run it inside a Docker container in CI. Usually all the standard CI platforms such as CircleCI do provide machine instances where you can run your integration tests.
3. Spring Boot DynamoDB Integration Tests using Testcontainers
3.1 Gradle test dependencies
Add the following dependencies for Testcontainers and its JUnit support.
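The dependency block from the original post is not shown; it typically looks like the following (the version number is an assumption):

testImplementation 'org.testcontainers:testcontainers:1.15.3'
testImplementation 'org.testcontainers:junit-jupiter:1.15.3'

Once the dependencies are defined, we can start using Testcontainers in JUnit. The first thing is to define a GenericContainer object with the official Docker image of DynamoDB, amazon/dynamodb-local. The withExposedPorts() method defines the port where DynamoDB will be listening inside the Docker container. The GenericContainer object is annotated with the @Container annotation. This annotation works in conjunction with the @Testcontainers annotation to mark containers that should be managed by the Testcontainers extension.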
private static final int DYNAMODB_PORT = 8000;

@Container
public static GenericContainer dynamodb =
        new GenericContainer<>("amazon/dynamodb-local")
                .withExposedPorts(DYNAMODB_PORT);
Testcontainers will start the Docker container with DynamoDB on the given DYNAMODB_PORT 8000; however, that is the internal port, which gets mapped to a random host port that the AWS DynamoDB client from the Spring Boot app can connect to.
Since our Spring Boot app connects to DynamoDB on the host and port defined under the application.dynamodb.endpoint config in the application.yaml file, for the test case we need to override this config at runtime to point to the actual host and port where the Docker container is running.
To achieve this we are using an ApplicationContextInitializer to override the application.dynamodb.endpoint property. We use GenericContainer's getContainerIpAddress and getMappedPort methods to get the IP address and actual port number of DynamoDB running inside Docker, as sketched below.
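A minimal sketch of such an initializer, consistent with the DynamoDBInitializer class mentioned below (it references the static dynamodb container field declared above):

import org.springframework.boot.test.util.TestPropertyValues;
import org.springframework.context.ApplicationContextInitializer;
import org.springframework.context.ConfigurableApplicationContext;

static class DynamoDBInitializer
        implements ApplicationContextInitializer<ConfigurableApplicationContext> {

    @Override
    public void initialize(ConfigurableApplicationContext ctx) {
        // Point the app at the random host port Testcontainers mapped for DynamoDB
        TestPropertyValues.of(
                "application.dynamodb.endpoint: http://"
                        + dynamodb.getContainerIpAddress() + ":"
                        + dynamodb.getMappedPort(DYNAMODB_PORT))
                .applyTo(ctx);
    }
}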
In the following RoutesTests, we have one test case that creates a blank customers table and then uses Spring's WebTestClient to call the POST /customers API. The test case checks the response from the API. Since this is an integration test, the whole service is booted and the customer record is created in the DynamoDB database managed by Testcontainers.
Let us go through the above code step by step and see what is going on.
@SpringBootTest – Specifies that the test case is an integration test. Spring should load the full application context and make all beans available to the test case.
@Testcontainers – It is a JUnit Jupiter extension to activate automatic startup and stop of containers used in a test case. The test containers extension finds all fields that are annotated with Container and calls their container lifecycle methods. Containers declared as static fields will be shared between test methods. They will be started only once before any test method is executed and stopped after the last test method has executed. Containers declared as instance fields will be started and stopped for every test method.
@ContextConfiguration – Overrides the Spring properties. We use it to override the host and port of DynamoDB where our DynamoDB client connects. The DynamoDBInitializer static class defined within the test case overrides this property. We can move this logic into an abstract class and reuse it across multiple tests.
4. Gotchas / Errors in Integration Test
If you haven't set up your local AWS CLI, you might get the following error when running the test case. This is because when the DynamoDB client initializes, it tries to find credentials to connect to DynamoDB. In our DynamoDBConfig class we are using DefaultCredentialsProvider, which tries to find the credentials in a number of places.
Unable to load credentials from service endpoint
software.amazon.awssdk.core.exception.SdkClientException: Unable to load credentials from any of the providers in the chain AwsCredentialsProviderChain(credentialsProviders=[SystemPropertyCredentialsProvider(), EnvironmentVariableCredentialsProvider(), ProfileCredentialsProvider(), WebIdentityTokenCredentialsProvider(), ContainerCredentialsProvider(), InstanceProfileCredentialsProvider()]) : [SystemPropertyCredentialsProvider(): Unable to load credentials from system settings. Access key must be specified either via environment variable (AWS_ACCESS_KEY_ID) or system property (aws.accessKeyId)., EnvironmentVariableCredentialsProvider(): Unable to load credentials from system settings. Access key must be specified either via environment variable (AWS_ACCESS_KEY_ID) or system property (aws.accessKeyId)., ProfileCredentialsProvider(): Profile file contained no credentials for profile 'default': ProfileFile(profiles=[]), WebIdentityTokenCredentialsProvider(): Either the environment variable AWS_WEB_IDENTITY_TOKEN_FILE or the javaproperty aws.webIdentityTokenFile must be set., ContainerCredentialsProvider(): Cannot fetch credentials from container - neither AWS_CONTAINER_CREDENTIALS_FULL_URI or AWS_CONTAINER_CREDENTIALS_RELATIVE_URI environment variables are set., InstanceProfileCredentialsProvider(): Unable to load credentials from service endpoint.]
at software.amazon.awssdk.core.exception.SdkClientException$BuilderImpl.build(SdkClientException.java:97) ~[sdk-core-2.10.40.jar:na]
Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Error has been observed at the following site(s):
|_ checkpoint ⇢ HTTP POST "/customers" [ExceptionHandlingWebHandler]
Stack trace:
at software.amazon.awssdk.core.exception.SdkClientException$BuilderImpl.build(SdkClientException.java:97) ~[sdk-core-2.10.40.jar:na]
To resolve this error, either set up your AWS CLI so that the client can find the credentials, or define the environment variables:
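For example (DynamoDB Local accepts any non-empty credentials, so dummy values are fine):

$ export AWS_ACCESS_KEY_ID=test
$ export AWS_SECRET_ACCESS_KEY=test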
Spring Boot Webflux DynamoDB Tutorial – Let us integrate AWS DynamoDB with Spring Boot Webflux. In this tutorial we will integrate DynamoDB with Webflux in Spring Boot. Instead of using the default AWS sync client, which blocks the thread, we will use the async client with Webflux.
We are going to create a REST API with basic CRUD operations on a Customer entity. The customer records will be stored in AWS DynamoDB. We will use Webflux to connect with DynamoDB.
2. Project Structure
Let's start by bootstrapping a new Spring Boot project using start.spring.io. The following project settings were selected in start.spring.io.
Project: Gradle Project
Language: Java
Spring Boot: 2.2.2
Dependencies: Spring Reactive Web
Once generated, import the project in your favorite IDE.
Update the build.gradle file as follows. We have added awssdk dynamodb 2.10.40 through dependencyManagement. Also note that spring-boot-starter-webflux is on the classpath.
build.gradle
plugins {
id 'org.springframework.boot' version '2.2.2.RELEASE'
id 'io.spring.dependency-management' version '1.0.8.RELEASE'
id 'java'
}
group = 'net.viralpatel'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = '11'
repositories {
mavenCentral()
maven {
url 'https://s3-us-west-2.amazonaws.com/dynamodb-local/release'
}
}
dependencies {
implementation 'org.springframework.boot:spring-boot-starter-webflux'
implementation 'software.amazon.awssdk:dynamodb'
testImplementation('org.springframework.boot:spring-boot-starter-test') {
exclude group: 'org.junit.vintage', module: 'junit-vintage-engine'
}
testImplementation 'io.projectreactor:reactor-test'
}
dependencyManagement {
imports {
mavenBom 'software.amazon.awssdk:bom:2.10.40'
}
}
test {
useJUnitPlatform()
}
Add the following configuration in the application.yaml file. We have defined a couple of properties for the DynamoDB endpoint and table name. Note the endpoint is pointing to our local DynamoDB. In production you might want to change this to point to your AWS region. We can also use Spring profiles to switch the values of these properties in different environments.
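The original file is not reproduced here; a minimal sketch (application.dynamodb.endpoint matches the property overridden in the integration-test post above; the table-name key is an assumption):

application:
  dynamodb:
    endpoint: http://localhost:8000
    table-name: customers

Following is the directory structure for our REST API project. Note that the customer-related classes are segregated in a customer package.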
3. Setting up DynamoDB Locally without Docker
Before we proceed with the rest of the tutorial, we will set up a local DynamoDB instance where we can test our changes. Instead of relying on the AWS environment for DynamoDB, this speeds up the development process.
We will create a customers table in local DynamoDB. Follow the steps at https://ift.tt/375nb2N
Download the AWS DynamoDB Local JAR from the above link and unzip it.
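Once unzipped, DynamoDB Local can be started with a command along these lines (per the AWS docs):

$ java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb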
Let us start by defining the repository and DynamoDB configuration to access the data in DynamoDB. As noted earlier, we will use DynamoDbAsyncClient to access DynamoDB.
4.1 Repository with AWS Async Client
The repository class contains basic CRUD methods to maintain the Customer entity. Note how we use dynamoDbAsyncClient to access DynamoDB through the GetItemRequest, DeleteItemRequest and ScanRequest APIs. Also, we map the return type of DynamoDbAsyncClient, which is CompletableFuture, to Reactor's Mono class using Mono.fromCompletionStage.
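The full repository listing is not reproduced here; below is a minimal sketch of one method showing the async-to-reactive mapping (the Customer class, the customerId key and attribute names are assumptions):

import java.util.Map;
import reactor.core.publisher.Mono;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
import software.amazon.awssdk.services.dynamodb.model.GetItemRequest;

public class CustomerRepository {

    private final DynamoDbAsyncClient dynamoDbAsyncClient;
    private final String tableName;

    public CustomerRepository(DynamoDbAsyncClient dynamoDbAsyncClient, String tableName) {
        this.dynamoDbAsyncClient = dynamoDbAsyncClient;
        this.tableName = tableName;
    }

    // Fetch one customer by id. The async client returns a CompletableFuture,
    // which Mono.fromCompletionStage adapts to a Reactor Mono.
    public Mono<Customer> getCustomer(String id) {
        GetItemRequest request = GetItemRequest.builder()
                .tableName(tableName)
                .key(Map.of("customerId", AttributeValue.builder().s(id).build()))
                .build();
        return Mono.fromCompletionStage(dynamoDbAsyncClient.getItem(request))
                .filter(response -> response.hasItem())
                .map(response -> new Customer(
                        response.item().get("customerId").s(),
                        response.item().get("name").s()));
    }
}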
In the following configuration class we create an instance of DynamoDbAsyncClient. Note how we mapped the endpoint from application.yaml using Spring's @Value annotation.
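A minimal sketch of such a configuration class, using the DefaultCredentialsProvider mentioned in the integration-test post (the region is an assumption):

import java.net.URI;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;

@Configuration
public class DynamoDBConfig {

    @Value("${application.dynamodb.endpoint}")
    private String endpoint;

    // Async DynamoDB client pointed at the endpoint from application.yaml
    @Bean
    public DynamoDbAsyncClient dynamoDbAsyncClient() {
        return DynamoDbAsyncClient.builder()
                .endpointOverride(URI.create(endpoint))
                .region(Region.US_EAST_1) // region is an assumption
                .credentialsProvider(DefaultCredentialsProvider.create())
                .build();
    }
}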
Next we define Spring's service class to abstract the repository from our routes layer (the controller in the old world). Note that we call CustomerRepository from the service class and map the response into a ServerResponse with the appropriate HTTP status.
Finally we glue everything together using Routes.java. In this class we utilize Spring Webflux RouterFunctions to define the route endpoints for the Customer REST API.
We defined a set of routes using the GET, PUT, POST and DELETE methods in Spring Webflux RouterFunctions and invoked the appropriate CustomerService methods.
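The Routes.java listing is not shown; a minimal sketch using the RouterFunctions builder (the handler method names on CustomerService are assumptions):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.reactive.function.server.RouterFunction;
import org.springframework.web.reactive.function.server.RouterFunctions;
import org.springframework.web.reactive.function.server.ServerResponse;

@Configuration
public class Routes {

    // Each handler method on CustomerService is assumed to take a
    // ServerRequest and return Mono<ServerResponse>.
    @Bean
    public RouterFunction<ServerResponse> customerRoutes(CustomerService customerService) {
        return RouterFunctions.route()
                .GET("/customers", customerService::findAll)
                .GET("/customers/{id}", customerService::find)
                .POST("/customers", customerService::create)
                .PUT("/customers/{id}", customerService::update)
                .DELETE("/customers/{id}", customerService::delete)
                .build();
    }
}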
Make sure your local instance of DynamoDB is up and running and the customers table is created before starting the project. If everything is set up correctly, the customer API should be able to communicate with DynamoDB.
Open Postman and fire up the APIs.
Create new customer
List all customers
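Alternatively, from the command line (the payload shape is an assumption based on the Customer entity):

$ curl -X POST http://localhost:8080/customers -H 'Content-Type: application/json' -d '{"name": "John Doe"}'
$ curl http://localhost:8080/customers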
Download – Spring Boot Webflux DynamoDB example
Source code of this project is available on Github.