
Java URL Encoder/Decoder Example

Java URL Encoder Decoder Example

Java URL Encoder/Decoder Example – In this tutorial we will see how to URL encode/decode attributes in Java.

Overview

URL encoding is a mechanism for encoding information in a Uniform Resource Identifier (URI) under certain circumstances.

When we submit an HTML form using a GET or POST request, the data/fields in the form are encoded using the application/x-www-form-urlencoded type. There is an optional attribute on the <form> element called enctype. Its default value is application/x-www-form-urlencoded. This specification defines how the values are encoded (e.g. replacing a space with +) and decoded on the server.

When HTML form data is sent in an HTTP GET request, it is included in the query component of the request URI. When sent in an HTTP POST request, the data is placed in the request body, and application/x-www-form-urlencoded is included in the message’s Content-Type header.
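
For example, when a form with a name and an email field is submitted via GET, the browser produces a query string of encoded name/value pairs joined with &. Below is a small sketch of building such a query string by hand; the parameter names and URL are illustrative, not part of the original example.

import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
//..
// Build the query component of a GET request by encoding each value
String query = "name=" + URLEncoder.encode("John Doe", StandardCharsets.UTF_8.toString())
        + "&email=" + URLEncoder.encode("john+doe@example.com", StandardCharsets.UTF_8.toString());
// query is now: name=John+Doe&email=john%2Bdoe%40example.com
String url = "https://example.com/pets?" + query;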

Encode URL in Java

To encode the query parameters or attributes, Java provides a utility class URLEncoder with a couple of encode() methods.

The URLEncoder.encode() method escapes all characters except:


A-Z a-z 0-9 - _ . ! ~ * ' ( )

import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
//..
URLEncoder.encode(plainText, StandardCharsets.UTF_8.toString());

In the following example, we encode a few plain text strings using URLEncoder.encode().

URLEncoderTest.java


package net.viralpatel;

import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class URLEncoderTest {

    private static final Logger LOGGER = LoggerFactory.getLogger(URLEncoderTest.class);

    public static void encode(String plain) throws UnsupportedEncodingException {
        String encoded = URLEncoder.encode(plain, StandardCharsets.UTF_8.toString());
        LOGGER.info("Plain text: {}, Encoded text: {}", plain, encoded);
    }

    @Test
    public void encodeTests() throws UnsupportedEncodingException {
        encode("john+doe@example.com");
        encode("sample text");
        encode("+1653-124-23");
    }
}

Run the above test in your favourite IDE and check the output. The plain text is encoded using Java URL Encoder.

Output:


Plain text: john+doe@example.com, Encoded text: john%2Bdoe%40example.com
Plain text: sample text, Encoded text: sample+text
Plain text: +1653-124-23, Encoded text: %2B1653-124-23
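
As a side note, the encode(String, String) overload used here throws the checked UnsupportedEncodingException. On Java 10 and newer, URLEncoder also offers an encode(String, Charset) overload, so the call can be simplified as in the sketch below.

import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
//..
// Java 10+ overload: pass the Charset directly, no checked exception to handle
String encoded = URLEncoder.encode("john+doe@example.com", StandardCharsets.UTF_8);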

Decode URL in Java

Java also provides the utility class URLDecoder to decode values. Usually on the server side, when we receive the form attributes, we can decode them back to their original values using URLDecoder.


import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
//..
URLDecoder.decode(encodedText, StandardCharsets.UTF_8.toString());

URLDecoderTest.java


package net.viralpatel;

import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;

public class URLDecoderTest {

    private static final Logger LOGGER = LoggerFactory.getLogger(URLDecoderTest.class);

    public static void decode(String encoded) throws UnsupportedEncodingException {
        String plain = URLDecoder.decode(encoded, StandardCharsets.UTF_8.toString());
        LOGGER.info("Encoded text: {}, Plain text: {}", encoded, plain);
    }

    @Test
    public void decodeTests() throws UnsupportedEncodingException {
        decode("john%2Bdoe%40example.com");
        decode("sample+text");
        decode("%2B1653-124-23");
    }
}

Run the above test in your favourite IDE and check the output. The encoded text is decoded using Java URL Decoder.

Output:


Encoded text: john%2Bdoe%40example.com, Plain text: john+doe@example.com
Encoded text: sample+text, Plain text: sample text
Encoded text: %2B1653-124-23, Plain text: +1653-124-23

Conclusion

Java provides the utility classes URLEncoder and URLDecoder to encode/decode text using URL encoding when transferring data over HTTP. In this tutorial we saw how to use the Java URL encoder and decoder.

Download or browse the code on Github.

Github – source code

Reference

Further Read: Getting Started with Java 8 Lambda Expression




How to Show Multiple Examples in OpenAPI Spec

Show Multiple Examples in OpenAPI – The OpenAPI (aka Swagger) Specification has become a de facto standard for documenting and sharing REST APIs. When using OpenAPI it is always best practice to add as much detail as we can. The API specification should be built from the API consumer's perspective. The DX, or developer experience, is important when developing the API.

To improve the API experience we must define attributes with descriptions and examples. It is also possible to define multiple examples to show the different ways the API can be consumed/requested.

First, let us see how the Swagger Editor (editor.swagger.io) shows multiple examples. The examples are shown in a dropdown where the user can choose and see the appropriate request payload. sample1 and sample2 are two examples for the Pet Store API.

openapi multiple examples

Adding Multiple Examples in OpenAPI

To add multiple examples in OpenAPI, we can define the examples attribute as shown below. Notice how we defined sample1 and sample2. You can give them any meaningful name relevant to your API.

openapi.yaml


paths:
  /pets:
    post:
      description: Creates a new pet in the store. Duplicates are allowed
      operationId: addPet
      requestBody:
        description: Pet to add to the store
        required: true
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/NewPet'
            examples:
              sample1:
                value:
                  name: Cupcake
                  tag: Chihuahua
              sample2:
                value:
                  name: Prince
                  tag: Poodle

In OpenAPI, we can also provide an example at the attribute level. It is good to define an attribute-level example (e.g. for petType) so the consumer of the API knows what to pass or what to expect from the attribute. However, it is also a good idea to provide examples at the broader request/response level.

A request/response level example provides much broader context to the API consumer and also helps document the API better. Furthermore, many mock tools can generate mock responses from the examples provided in the Swagger file.

Multiple Examples in API Response

Multiple examples work with both the API request and the response. Similar to what we did above, the same can be specified for the API response. In the screenshot below we can see how the Swagger Editor shows multiple response examples.

multiple examples in api response

openapi.yaml with examples in response


paths:
  /pets:
    get:
      description: Returns all pets
      operationId: findPets
      parameters:
        - name: tags
          in: query
          description: tags to filter by
          required: false
          style: form
          schema:
            type: array
            items:
              type: string
        - name: limit
          in: query
          description: maximum number of results to return
          required: false
          schema:
            type: integer
            format: int32
      responses:
        '200':
          description: pet response
          content:
            application/json:
              schema:
                type: array
                items:
                  $ref: '#/components/schemas/Pet'
              examples:
                sample1:
                  value:
                    name: Cupcake
                    tag: Chihuahua
                sample2:
                  value:
                    name: Prince
                    tag: Poodle
        default:
          description: unexpected error
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Error'

Hope this little trick will make your API documentation awesome :-)

Reference

https://swagger.io/docs/specification/adding-examples/




How to Run Local WordPress using Docker

Local WordPress setup using Docker container

Local WordPress using Docker – Running a local WordPress development environment is crucial for testing themes and plugins before we push them into a staging or production environment. To run WordPress locally, we need to install and set up PHP, MySQL (or MariaDB) and WordPress, which is not straightforward.

Docker provides an ideal way of setting up a local WordPress development environment (on Windows you might prefer WAMP). Using simple Docker commands we can quickly spin up a new environment where we can test WordPress themes and plugins. Assuming you already have Docker set up on your machine, starting WordPress is quick.

Since we are running WordPress in Docker, the same setup will work on Windows, Mac and Linux.

Local WordPress Setup with Docker

Let us see how to run local WordPress setup for development using Docker.

Set up Docker Compose for WordPress. Start a command line terminal and create a wp-local folder.


$ mkdir wp-local && cd wp-local
$ touch docker-compose.yml
create docker-compose.yml for wordpress

To set up the WordPress + MySQL + phpMyAdmin images in docker-compose, copy the following content into it.

docker-compose.yml


version: "3" services: db: image: mysql:5.7 restart: always volumes: - db_data:/var/lib/mysql environment: MYSQL_ROOT_PASSWORD: password MYSQL_DATABASE: wordpress MYSQL_USER: wordpress MYSQL_PASSWORD: wordpress networks: - wp wordpress: depends_on: - db image: wordpress restart: always volumes: - ./:/var/www/html/wp-content environment: WORDPRESS_DB_HOST: db:3306 WORDPRESS_DB_USER: wordpress WORDPRESS_DB_PASSWORD: wordpress ports: - 80:80 - 443:443 networks: - wp phpmyadmin: depends_on: - db image: phpmyadmin/phpmyadmin ports: - 8080:80 environment: PMA_HOST: db MYSQL_ROOT_PASSWORD: password networks: - wp networks: wp: volumes: db_data:

In the above docker-compose.yml file we are creating 3 containers: mysql, wordpress and phpmyadmin. The wordpress container exposes WordPress at port 80. Similarly, phpmyadmin is exposed at port 8080. Both wordpress and phpmyadmin depend on the db container, which runs the MySQL image.

Docker Compose UP

Save the docker-compose.yml file and run the docker-compose up command to create and start the Docker containers with WordPress, MySQL and phpMyAdmin.


$ docker-compose up -d

When run for the first time, Docker will build up the stack and download all the images, so it might take a while. Subsequent invocations will be almost instant.

wordpress docker-compose up

Setup WordPress

Once docker-compose completes, open the browser and go to http://localhost

local wordpress docker installation

We can start the local WordPress setup. Enter the Site Title, Username and Password and press Install WordPress.

local wordpress docker installation steps

Once the WordPress setup is completed, log in using the username/password provided in the previous step and you will be greeted with the WordPress Dashboard.

wordpress docker dashboard

phpMyAdmin Setup

Since we also set up phpMyAdmin in our Docker Compose file, log in to phpMyAdmin to view/update the WordPress database.

The username/password for phpMyAdmin are the values of the WORDPRESS_DB_USER and WORDPRESS_DB_PASSWORD environment variables used in the Docker Compose file.

Username: wordpress
Password: wordpress

phpmyadmin docker login page

Bonus: Increase File Upload Size in Local WordPress Docker

If you try to import settings from an existing WordPress site once you start your local WordPress Docker container, you will realise the default max upload size is 2 MB.

To increase the upload file size, we can specify a custom php.ini file (in our case upload.ini) and set up the Docker Compose file to mount it inside the container.

Create file upload.ini in the same folder as docker-compose.yml


$ touch upload.ini

Add the following to upload.ini to change the upload_max_filesize.

upload.ini


file_uploads = On
memory_limit = 64M
upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 600

Update the docker-compose.yml file and mount the local upload.ini file.


  wordpress:
    depends_on:
      - db
    image: wordpress
    restart: always
    volumes:
      - ./:/var/www/html/wp-content
      - ./upload.ini:/usr/local/etc/php/conf.d/uploads.ini
    environment:
      WORDPRESS_DB_HOST: db:3306
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: wordpress
    ports:
      - 80:80
      - 443:443
    networks:
      - wp

Restart the docker container by running docker-compose up -d

Check the max image size under Media > Add New

wordpress max file upload size docker image

Source Code

The docker-compose.yml is available in Github for further update.

Github – source code

Happy WordPressing :-)




Create and Validate JWT Token in Java using JJWT

Create and Validate JWT Token using JJWT

1. JWT Token Overview

JSON Web Token (JWT) is an open standard that defines a compact and self-contained way of securely transmitting information between parties as a JSON object. This information can be verified and trusted because it is digitally signed. JWTs can be signed using a secret (with the HMAC algorithm) or a public/private key pair using RSA or ECDSA.

Although JWTs can be encrypted to also provide secrecy between parties, we will focus on signed tokens. Signed tokens can verify the integrity of the claims contained within them, while encrypted tokens hide those claims from other parties. When a token is signed using a public/private key pair, the signature also certifies that only the party holding the private key could have signed it.

1.1 What is JSON Web Token (JWT) Structure?

JWT tokens consist of 3 parts separated by a period ( . ).
These parts are:

  1. Header
  2. Payload
  3. Signature

The JWT typically looks like:

aaaa.bbbb.cccc
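
Each part is Base64URL-encoded; the header and payload are plain JSON once decoded. As a small illustration (not part of the original article), the sketch below splits a token string and prints its header and payload, assuming jwtToken holds a JWT such as the ones generated later in this tutorial:

import java.util.Base64;
//..
String[] parts = jwtToken.split("\\.");
String header  = new String(Base64.getUrlDecoder().decode(parts[0]));
String payload = new String(Base64.getUrlDecoder().decode(parts[1]));
// header  -> e.g. {"alg":"HS256"} (or {"alg":"none"} for an unsigned token)
// payload -> the claims as JSON, e.g. {"name":"Jane Doe", ...}
// whatever follows the second period is the signature; it is empty for an unsigned token
System.out.println(header + "\n" + payload);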

2. Setting up JJWT library

Add the JJWT dependencies to your build. For Gradle:

compile 'io.jsonwebtoken:jjwt-api:0.11.1'
runtime 'io.jsonwebtoken:jjwt-impl:0.11.1'
runtime 'io.jsonwebtoken:jjwt-jackson:0.11.1'

Or, if you are using Maven:

<dependency>
    <groupId>io.jsonwebtoken</groupId>
    <artifactId>jjwt-api</artifactId>
    <version>0.11.1</version>
</dependency>
<dependency>
    <groupId>io.jsonwebtoken</groupId>
    <artifactId>jjwt-impl</artifactId>
    <version>0.11.1</version>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>io.jsonwebtoken</groupId>
    <artifactId>jjwt-jackson</artifactId>
    <version>0.11.1</version>
    <scope>runtime</scope>
</dependency>

3. Create JWT Token

Jwts.builder() is used to create a JWT token. We can specify claims, the subject and other JWT attributes.

import io.jsonwebtoken.Jwts;

import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Date;
import java.util.UUID;

//...

Instant now = Instant.now();

String jwtToken = Jwts.builder()
        .claim("name", "Jane Doe")
        .claim("email", "jane@example.com")
        .setSubject("jane")
        .setId(UUID.randomUUID().toString())
        .setIssuedAt(Date.from(now))
        .setExpiration(Date.from(now.plus(5l, ChronoUnit.MINUTES)))
        .compact();

Generated JWT Token:

eyJhbGciOiJub25lIn0.eyJuYW1lIjoiSmFuZSBEb2UiLCJlbWFpbCI6ImphbmVAZXhhbXBsZS5jb20iLCJzdWIiOiJqYW5lIiwianRpIjoiNDVmNGRiNTEtYTRlYy00YjYzLWJhMDgtNTE3MDJjYTI1MmEzIiwiaWF0IjoxNTkwNTc3NTY0LCJleHAiOjE1OTA1Nzc4NjR9.

The above code to generate the JWT is pretty self-explanatory; however, let's check step by step how we are generating the JWT token:

  1. Add claims name and email with value Jane Doe and jane@example.com respectively
  2. Add subject in JWT token with value jane
  3. Set the Id for the JWT token using a randomly generated GUID
  4. Set issued at to current time
  5. Set expiration to current time plus 5 minutes. So the JWT is valid for only 5 minutes
JWT.io Parse JWT Header and Payload

The JWT generated above is not signed (check the algorithm alg attribute in the header). We have just encoded the claims in JSON format. If using JWT for authentication or authorization, it is advisable to sign the JWT so it can be verified.

4. Validate/Parse JWT Token

To validate or parse the JWT token, the Jwts.parserBuilder() method is used.

import io.jsonwebtoken.Jwts;
import io.jsonwebtoken.Jws;
import io.jsonwebtoken.Claims;
//...

Jws<Claims> jwt = Jwts.parserBuilder()
      .setSigningKey(...)
      .build()
      .parseClaimsJws(jwtString);

While parsing the JWT token we need to pass the signing key to verify the JWT signature. Let us see how to sign the JWT token using different algorithms.

5. Create and Validate JWT Token Signed using HMAC Secret

The simplest way of creating a signed JWT token is by using an HMAC secret. HMAC stands for hash-based message authentication code and is a message authentication code built from a cryptographic hash function. It is used to simultaneously verify both the data integrity and the authenticity of a token.

5.1 Create JWT Token signed with HMAC

To create a JWT token signed with an HMAC shared secret, we need to specify the signing key using the .signWith() method.

import io.jsonwebtoken.SignatureAlgorithm;
import javax.crypto.spec.SecretKeySpec;
import java.security.Key;
import java.util.Base64;
//...

// Key is hardcoded here for simplicity. 
// Ideally this will get loaded from env configuration/secret vault
String secret = "asdfSFS34wfsdfsdfSDSD32dfsddDDerQSNCK34SOWEK5354fdgdf4";

Key hmacKey = new SecretKeySpec(Base64.getDecoder().decode(secret), 
                            SignatureAlgorithm.HS256.getJcaName());

Instant now = Instant.now();
String jwtToken = Jwts.builder()
        .claim("name", "Jane Doe")
        .claim("email", "jane@example.com")
        .setSubject("jane")
        .setId(UUID.randomUUID().toString())
        .setIssuedAt(Date.from(now))
        .setExpiration(Date.from(now.plus(5l, ChronoUnit.MINUTES)))
        .signWith(hmacKey)
        .compact();

Generated JWT Token:

eyJhbGciOiJIUzI1NiJ9.eyJuYW1lIjoiSmFuZSBEb2UiLCJlbWFpbCI6ImphbmVAZXhhbXBsZS5jb20iLCJzdWIiOiJqYW5lIiwianRpIjoiYjMwMmU5NmUtODg2OS00NTJkLTg1ZjMtZGZjNDQzNTY3ZGUwIiwiaWF0IjoxNTkwNjE4NjI2LCJleHAiOjE1OTA2MTg5MjZ9.LJaar-3rnb3iuRXJBYK024l1PtEu3ay5ds24Q0bkf7w
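
As an optional alternative to building the SecretKeySpec by hand from a hardcoded string, the Keys helper in jjwt-api 0.10+ can create or wrap the HMAC key. A brief sketch follows; this is not how the example above does it, and the secret variable is the same Base64 string used there.

import io.jsonwebtoken.SignatureAlgorithm;
import io.jsonwebtoken.security.Keys;
import java.security.Key;
import java.util.Base64;
//..
// Generate a brand new random key suitable for HS256 (persist it securely if you need to reuse it)
Key generatedKey = Keys.secretKeyFor(SignatureAlgorithm.HS256);

// Or wrap existing secret bytes (e.g. the Base64-encoded secret loaded from configuration)
Key hmacKey = Keys.hmacShaKeyFor(Base64.getDecoder().decode(secret));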

5.2 Validate/Parse JWT Token signed with HMAC

To validate/parse the JWT token generated using the HMAC shared secret, the same steps apply. We need to use the setSigningKey() method to set the key before we parse the JWT token.

public static Jws<Claims> parseJwt(String jwtString) {
    String secret = "asdfSFS34wfsdfsdfSDSD32dfsddDDerQSNCK34SOWEK5354fdgdf4";
    Key hmacKey = new SecretKeySpec(Base64.getDecoder().decode(secret), 
                                    SignatureAlgorithm.HS256.getJcaName());

    Jws<Claims> jwt = Jwts.parserBuilder()
            .setSigningKey(hmacKey)
            .build()
            .parseClaimsJws(jwtString);

    return jwt;
}

Output:

{name=Jane Doe, email=jane@example.com, sub=jane, jti=21da0729-577a-4d76-9c09-0ee11dcc2fdb, iat=1590619729, exp=1590620029}

If the signature does not match (for example, the token was tampered with or a different key was used), the parseClaimsJws() method will throw a SignatureException like the one below; if the token has expired (the exp claim value is less than the current system time), it throws an ExpiredJwtException instead.

Exception in thread "main" io.jsonwebtoken.security.SignatureException: JWT signature does not match locally computed signature. JWT validity cannot be asserted and should not be trusted.
    at io.jsonwebtoken.impl.DefaultJwtParser.parse(DefaultJwtParser.java:411)
    at io.jsonwebtoken.impl.DefaultJwtParser.parse(DefaultJwtParser.java:541)
    at io.jsonwebtoken.impl.DefaultJwtParser.parseClaimsJws(DefaultJwtParser.java:601)
    at io.jsonwebtoken.impl.ImmutableJwtParser.parseClaimsJws(ImmutableJwtParser.java:173)
    at net.viralpatel.jwt.JWTGenerator.parseJwt(JWTGenerator.java:32)
    at net.viralpatel.jwt.JWTGenerator.main(JWTGenerator.java:20)
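
In application code you would typically wrap the parse call and handle these exceptions explicitly. Below is a minimal sketch reusing the parseJwt() helper from above; the isValid method name is ours, not from the article.

import io.jsonwebtoken.ExpiredJwtException;
import io.jsonwebtoken.JwtException;
//..
public static boolean isValid(String jwtString) {
    try {
        parseJwt(jwtString);
        return true;
    } catch (ExpiredJwtException e) {
        // well formed and correctly signed, but the exp claim is in the past
        return false;
    } catch (JwtException e) {
        // covers SignatureException, MalformedJwtException and other parse/verification failures
        return false;
    }
}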

6. Create and Validate JWT Token Signed using RSA Private Key

When using JWT tokens for microservice authentication/authorization, it is advisable to sign them with RSA private/public keys instead of using a shared HMAC secret. The token is generated and signed by a central authority (usually an Authorization Server) and each microservice can validate the JWT token using the public key exposed by the Authorization Server.

Before we see how to generate a JWT token with a private/public key, let us see how to generate a private and public RSA key pair.

6.1 Generate Private and Public RSA Key

Generate an RSA private key, of size 2048, and output it to a file named key.pem:

$ openssl genrsa -out key.pem 2048
Generating RSA private key, 2048 bit long modulus
..........+++
..........................................................................+++
e is 65537 (0x10001)

Extract the public key from the key pair, which can be used in a certificate:

$ openssl rsa -in key.pem -outform PEM -pubout -out public.pem
writing RSA key

The key.pem file contains the private key generated using RSA and the public.pem file contains the public key.

6.2 Create JWT Token signed with RSA

The following code snippet shows how to generate a JWT token signed using RSA.

import io.jsonwebtoken.Jwts;
import java.security.KeyFactory;
import java.security.NoSuchAlgorithmException;
import java.security.PrivateKey;
import java.security.spec.InvalidKeySpecException;
import java.security.spec.PKCS8EncodedKeySpec;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Base64;
import java.util.Date;
import java.util.UUID;
// ...

public static String createJwtSignedRSA() throws InvalidKeySpecException, NoSuchAlgorithmException {

    PrivateKey privateKey = getPrivateKey();

    Instant now = Instant.now();
    String jwtToken = Jwts.builder()
            .claim("name", "Jane Doe")
            .claim("email", "jane@example.com")
            .setSubject("jane")
            .setId(UUID.randomUUID().toString())
            .setIssuedAt(Date.from(now))
            .setExpiration(Date.from(now.plus(5l, ChronoUnit.MINUTES)))
            .signWith(privateKey)
            .compact();

    return jwtToken;
}

private static PrivateKey getPrivateKey() throws NoSuchAlgorithmException, InvalidKeySpecException {
    String rsaPrivateKey = "-----BEGIN PRIVATE KEY-----" +
            "MIIEvwIBADANBgkqhkiG9w0BAQEFAASCBKkwggSlAgEAAoIBAQDK7c0HtOvefMRM" +
            "s1tkdiJm+A16Df85lQlmXjQvMHNgY4P/znvl4kRON9DdBdo3K81OG7pR/0H9XvdB" +
            "TEojj/6vCVuMDeeIiBrgx0OJjhv0r8oUD4d52+1kXXITaniyZcbJ08s4sF7fUSCu" +
            "IZOhvvwQTd/tIwXGU1qqfg+bsomQ6h2czPSKXAux54vUiRO2IWf/Y6twyk8cy1PH" +
            "IOfelCVUJ4kzmP+CsOH7Rh3JMwZ0Mc4GAzndWpKwNXKjVM20/bKE9FgIiIjzmEQd" +
            "VpSdUz2MbAKM1kskdaHXQyuaHoHfPwESYuEwBld4vh9AGMF3jYMu8ggnAzVRIoWG" +
            "Mr5eCE2tAgMBAAECggEBAKBPXiKRdahMzlJ9elyRyrmnihX7Cr41k7hwAS+qSetC" +
            "kpu6RjykFCvqgjCpF+tvyf/DfdybF0mPBStrlkIj1iH29YBd16QPSZR7NkprnoAd" +
            "gzl3zyGgcRhRjfXyrajZKEJ281s0Ua5/i56kXdlwY/aJXrYabcxwOvbnIXNxhqWY" +
            "NSejZn75fcacSyvaueRO6NqxmCTBG2IO4FDc/xGzsyFKIOVYS+B4o/ktUOlU3Kbf" +
            "vwtz7U5GAh9mpFF+Dkr77Kv3i2aQUonja6is7X3JlA93dPu4JDWK8jrhgdZqY9p9" +
            "Q8odbKYUaBV8Z8CnNgz2zaNQinshzwOeGfFlsd6H7SECgYEA7ScsDCL7omoXj4lV" +
            "Mt9RkWp6wQ8WDu5M+OCDrcM1/lfyta2wf7+9hv7iDb+FwQnWO3W7eFngYUTwSw5x" +
            "YP2uvOL5qbe7YntKI4Q9gHgUd4XdRJJSIdcoY9/d1pavkYwOGk7KsUrmSeoJJ2Jg" +
            "54ypVzZlVRkcHjuwiiXKvHwj2+UCgYEA2w5YvWSujExREmue0BOXtypOPgxuolZY" +
            "pS5LnuAr4rvrZakE8I4sdYjh0yLZ6qXJHzVlxW3DhTqhcrhTLhd54YDogy2IT2ff" +
            "0GzAV0kX+nz+mRhw0/u+Yw6h0QuzH9Q04Wg3T/u/K9+rG335j/RU1Tnh7nxetfGb" +
            "EwJ1oOqcXikCgYEAqBAWmxM/mL3urH36ru6r842uKJr0WuhuDAGvz7iDzxesnSvV" +
            "5PKQ8dY3hN6xfzflZoXssUGgTc55K/e0SbP93UZNAAWA+i29QKY6n4x5lKp9QFch" +
            "dXHw4baIk8Z97Xt/kw07f6FAyijdC9ggLHf2miOmdEQzNQm/9mcJ4cFn+DECgYEA" +
            "gvOepQntNr3gsUxY0jcEOWE3COzRroZD0+tLFZ0ZXx/L5ygVZeD4PwMnTNrGvvmA" +
            "tAFt54pomdqk7Tm3sBQkrmQrm0+67w0/xQ9eJE/z37CdWtQ7jt4twHXc0mVWHa70" +
            "NdPhTRVIAWhil7rFWANOO3Gw2KrMy6O1erW7sAjQlZECgYBmjXWzgasT7JcHrP72" +
            "fqrEx4cg/jQFNlqODNb515tfXSBBoAFiaxWJK3Uh/60/I6cFL/Qoner4trNDWSNo" +
            "YENBqXLZnWGfIo0vAIgniJ6OD67+1hEQtbenhSfeE8Hou2BnFOTajUxmYgGm3+hx" +
            "h8TPOvfHATdiwIm7Qu76gHhpzQ==" +
            "-----END PRIVATE KEY-----";

    rsaPrivateKey = rsaPrivateKey.replace("-----BEGIN PRIVATE KEY-----", "");
    rsaPrivateKey = rsaPrivateKey.replace("-----END PRIVATE KEY-----", "");

    PKCS8EncodedKeySpec keySpec = new PKCS8EncodedKeySpec(Base64.getDecoder().decode(rsaPrivateKey));
    KeyFactory kf = KeyFactory.getInstance("RSA");
    PrivateKey privKey = kf.generatePrivate(keySpec);
    return privKey;
}

Generated JWT Token:

eyJhbGciOiJSUzI1NiJ9.eyJuYW1lIjoiSmFuZSBEb2UiLCJlbWFpbCI6ImphbmVAZXhhbXBsZS5jb20iLCJzdWIiOiJqYW5lIiwianRpIjoiNGJiMDU4YzUtZDZhNi00NGNiLWEwOWYtZmM5MjdiNjcwYTJiIiwiaWF0IjoxNTkwODA1NDM3LCJleHAiOjE1OTA4MDU3Mzd9.HZa6ckOvT73KqmYluW5vww10ESNGKRrcYumDjE6ESctrIkLyXrc_fasjDfMAit5SPgIWv8u2GO2XsZzFhyL1Ar_VfCMGY1m6-2Aq-clci0jfzk-q1_SBB57fBQE6io9lPuZcuGtRHjsSmwUUFwHA_bcvVU1Ka85UnQnYilgPoE0BQG9v1ZJy_hjPSisfQh56flHMqs-W4P3X6zY4Ak_92A8FHaaxWABJhJCY1Hl7OR2kSnbqcJaPRYI064x-p1gIePScPTjuDRjNOoyaA-5QEgvWV2BTW6lWIS9yE6NdzvmOt2Nv_5YN5ESuQcVT6XgxNwscoQu3iG3UiOthU6kz-w

For simplicity the private key is hardcoded in the above example. However, in a real production system it will be loaded from an environment variable or a secret vault (HashiCorp Vault or AWS Parameter Store).

In the above example the method getPrivateKey() builds the java.security.PrivateKey, which is then passed to Jwts.builder to sign the JWT token with the private key.
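
For example, one possible way to avoid the hardcoded key is to read the PEM text from an environment variable and feed it into the same KeyFactory logic. The variable name JWT_PRIVATE_KEY_PEM below is made up for illustration, and the snippet is assumed to live inside a method declaring the same checked exceptions as getPrivateKey().

// Hypothetical: load the PEM from an environment variable instead of hardcoding it
String pem = System.getenv("JWT_PRIVATE_KEY_PEM");
String base64Key = pem
        .replace("-----BEGIN PRIVATE KEY-----", "")
        .replace("-----END PRIVATE KEY-----", "")
        .replaceAll("\\s", "");   // strip line breaks and spaces from the PEM body

PKCS8EncodedKeySpec keySpec = new PKCS8EncodedKeySpec(Base64.getDecoder().decode(base64Key));
PrivateKey privateKey = KeyFactory.getInstance("RSA").generatePrivate(keySpec);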

6.3 Validate/Parse JWT Token signed with RSA Private/Public Keys

Next, let us validate and parse the JWT signed using RSA. For that we will need the public key as a java.security.PublicKey instance, which Jwts.parserBuilder will use to validate the JWT.

import io.jsonwebtoken.Claims;
import io.jsonwebtoken.Jws;
import io.jsonwebtoken.Jwts;

import java.security.KeyFactory;
import java.security.NoSuchAlgorithmException;
import java.security.PublicKey;
import java.security.spec.InvalidKeySpecException;
import java.security.spec.X509EncodedKeySpec;
import java.util.Base64;
//..

public static Jws<Claims> parseJwt(String jwtString) throws InvalidKeySpecException, NoSuchAlgorithmException {

    PublicKey publicKey = getPublicKey();

    Jws<Claims> jwt = Jwts.parserBuilder()
            .setSigningKey(publicKey)
            .build()
            .parseClaimsJws(jwtString);

    return jwt;
}

private static PublicKey getPublicKey() throws NoSuchAlgorithmException, InvalidKeySpecException {
    String rsaPublicKey = "-----BEGIN PUBLIC KEY-----" +
            "MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAyu3NB7Tr3nzETLNbZHYi" +
            "ZvgNeg3/OZUJZl40LzBzYGOD/8575eJETjfQ3QXaNyvNThu6Uf9B/V73QUxKI4/+" +
            "rwlbjA3niIga4MdDiY4b9K/KFA+HedvtZF1yE2p4smXGydPLOLBe31EgriGTob78" +
            "EE3f7SMFxlNaqn4Pm7KJkOodnMz0ilwLseeL1IkTtiFn/2OrcMpPHMtTxyDn3pQl" +
            "VCeJM5j/grDh+0YdyTMGdDHOBgM53VqSsDVyo1TNtP2yhPRYCIiI85hEHVaUnVM9" +
            "jGwCjNZLJHWh10Mrmh6B3z8BEmLhMAZXeL4fQBjBd42DLvIIJwM1USKFhjK+XghN" +
            "rQIDAQAB" +
            "-----END PUBLIC KEY-----";
    rsaPublicKey = rsaPublicKey.replace("-----BEGIN PUBLIC KEY-----", "");
    rsaPublicKey = rsaPublicKey.replace("-----END PUBLIC KEY-----", "");
    X509EncodedKeySpec keySpec = new X509EncodedKeySpec(Base64.getDecoder().decode(rsaPublicKey));
    KeyFactory kf = KeyFactory.getInstance("RSA");
    PublicKey publicKey = kf.generatePublic(keySpec);
    return publicKey;
}

Output:

{name=Jane Doe, email=jane@example.com, sub=jane, jti=4bb058c5-d6a6-44cb-a09f-fc927b670a2b, iat=1590805437, exp=1590805737}

In the above example, we have hardcoded the public key. In a production system, it is usually configured through an environment variable or service configuration.

7. Source Code – Generate and Validate JWT Tokens using Java & JJWT

Source code for the Creating and Validating JWT token in Java.

Github – source code




Spring Boot GraphQL Subscription Realtime API


GraphQL Subscription provides a great way of building real-time APIs. In this tutorial we will build a realtime GraphQL subscription API using Spring Boot.

1. Overview

Spring Boot GraphQL Subscription – GraphQL provides an alternate way of building APIs. Similar to REST, with GraphQL we can query data from the server/backend and also update data. These are called Query and Mutation in GraphQL. Check out the tutorial on Getting Started with GraphQL with Spring Boot, in which both Query and Mutation are discussed in detail.

Oftentimes clients want to get pushed updates from the server when data they care about changes. GraphQL Subscription provides this capability as a long-lived stream of source events that clients can subscribe to.

2. GraphQL Subscription

GraphQL subscriptions are a way to push data from the server to the clients that choose to listen to real time messages from the server. Subscriptions are similar to queries in that they specify a set of fields to be delivered to the client, but instead of immediately returning a single answer, a result is sent every time a particular event happens on the server.

When using Subscription, the client connects to the GraphQL server using a WebSocket. A persistent WebSocket connection is established at the beginning and used for as long as the client wants to consume messages and the server is available. Each time a new message is available, the server pushes it to the client over the WebSocket connection.

Designing a realtime API using GraphQL subscription requires careful consideration. Unlike Query and Mutation, which are stateless operation types, Subscription is a stateful operation. A stateful connection using WebSocket is established with the server. The server should be designed to handle multiple connections and scale as new connections arrive. Usually, to achieve this, the server can use a messaging service such as RabbitMQ, Redis Pub/Sub or Kafka.

3. Spring Boot GraphQL Subscription API

For this tutorial, we are going to mock the messaging server and send mock data to the client through Subscription. Let us start by initializing the Spring Boot app.

The next step is to create a Spring Boot app with GraphQL support. Let us scaffold the app with start.spring.io using the following options:

  1. Gradle project
  2. Java 11
  3. Spring Boot 2.3.0
  4. Group: net.viralpatel Artifact: spring-boot-graphql-subscription

3.1 Dependencies

Let us add GraphQL Java and other dependencies in Gradle. Open the build.gradle file and add the following code:

build.gradle

plugins {
    id 'org.springframework.boot' version '2.3.0.RELEASE'
    id 'io.spring.dependency-management' version '1.0.9.RELEASE'
    id 'java'
}
group = 'net.viralpatel'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = '11'

repositories {
    mavenCentral()
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter'
    implementation 'com.graphql-java-kickstart:graphql-spring-boot-starter:7.0.1'
    implementation 'io.projectreactor:reactor-core'
    runtimeOnly 'com.graphql-java-kickstart:graphiql-spring-boot-starter:7.0.2-SNAPSHOT'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'
}

test {
    useJUnitPlatform()
}

Or, if you are using Maven, add the following dependencies.

pom.xml

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>com.graphql-java-kickstart</groupId>
    <artifactId>graphql-spring-boot-starter</artifactId>
    <version>7.0.1</version>
</dependency>
<dependency>
    <groupId>io.projectreactor</groupId>
    <artifactId>reactor-core</artifactId>
</dependency>
<dependency>
    <groupId>com.graphql-java-kickstart</groupId>
    <artifactId>graphiql-spring-boot-starter</artifactId>
    <version>7.0.2</version>
    <scope>runtime</scope>
</dependency>

3.2 GraphQL Schema SDL

The GraphQL schema can be written using GraphQL SDL (Schema Definition Language). Following is our app’s GraphQL schema.

The schema definition file is under the src/main/resources directory. The GraphQL Spring Boot starter will read this and configure it using graphql-java-tools.

src/main/resources/schema.graphqls

type Query {
    stockDetail(symbol: String): StockDetail
}

type Subscription {
    stockPrice(symbol: String): StockPrice
}

type StockDetail {
    symbol: String,
    name: String,
    marketCap: String
}

type StockPrice {
    symbol: String
    price: String
    timestamp: String
}

The schema consists of the GraphQL Query and Subscription default operation types. It also contains the user-defined types StockDetail and StockPrice.

The Query stockDetail returns the type StockDetail and is used like a GET API to fetch the stock detail once when called. The Subscription method stockPrice returns the type StockPrice. Since it is a Subscription method, instead of returning the result once, the client subscribes to it and listens for any changes from the server. The server will push changes in realtime whenever new data is available.

3.3 Query Resolver

The Spring Boot GraphQL starter expects an instance of GraphQLQueryResolver on the classpath when the app boots.

QueryResolver.java

package net.viralpatel.springbootgraphqlsubscription;

import graphql.kickstart.tools.GraphQLQueryResolver;
import net.viralpatel.springbootgraphqlsubscription.model.StockDetail;
import org.springframework.stereotype.Component;

@Component
public class QueryResolver implements GraphQLQueryResolver {

    public StockDetail stockDetail(String symbol) {

       return new StockDetail(symbol, "name", 2000l);
    }
}

The custom QueryResolver class defines the method stockDetail, which matches the GraphQL SDL we defined above. This method will be invoked whenever the client calls the Query method stockDetail.

The response is just a hardcoded value of the stock symbol, name and market cap. Instead of hardcoding, these values can come from a database or any backend API.

3.4 Subscription Resolver

Similar to the QueryResolver, a custom SubscriptionResolver class is required on the classpath when using the subscription type in the GraphQL SDL.

SubscriptionResolver.java

package net.viralpatel.springbootgraphqlsubscription;

import graphql.kickstart.tools.GraphQLSubscriptionResolver;
import net.viralpatel.springbootgraphqlsubscription.model.StockPrice;
import org.reactivestreams.Publisher;
import org.springframework.stereotype.Component;
import reactor.core.publisher.Flux;

import java.time.Duration;
import java.time.LocalDateTime;
import java.util.Random;

@Component
public class SubscriptionResolver implements GraphQLSubscriptionResolver {

    public Publisher<StockPrice> stockPrice(String symbol) {

        Random random = new Random();
        return Flux.interval(Duration.ofSeconds(1))
                .map(num -> new StockPrice(symbol, random.nextDouble(), LocalDateTime.now()));
    }
}

The custom SubscriptionResolver class contains the stockPrice method, which matches the GraphQL SDL we defined earlier. This method will be invoked whenever a client subscribes to it. The client establishes a WebSocket connection with the server and the stockPrice() method returns a Project Reactor Publisher instance. The Publisher instance does not return the result immediately; rather it sends a result whenever the server pushes one. This is the same way Spring Boot Webflux pushes results in a non-blocking way.

For simplicity, we have hardcoded the result. The stockPrice() method returns a random value every second using the Flux.interval method. Instead of being hardcoded, these results could also come from a Kafka topic or a RabbitMQ queue.
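
If the prices came from a real broker or message queue, one possible (purely illustrative) approach is to bridge the listener callback into a Flux with Flux.create. The priceFeed collaborator and its onPrice method below are hypothetical and not part of this project:

import reactor.core.publisher.FluxSink;
//..
// Hypothetical bridge from a message listener to the GraphQL Publisher
public Publisher<StockPrice> stockPrice(String symbol) {
    return Flux.create(sink ->
            priceFeed.onPrice(symbol, price -> sink.next(price)),   // push each incoming price to the subscriber
            FluxSink.OverflowStrategy.LATEST);                       // keep only the latest tick under backpressure
}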

StockDetail.java

package net.viralpatel.springbootgraphqlsubscription.model;

public class StockDetail {
    private String symbol;
    private String name;
    private long marketCap;

    public StockDetail(String symbol, String name, long marketCap) {
        this.name = name;
        this.symbol = symbol;
        this.marketCap = marketCap;
    }

    // getter and setters
}

StockPrice.java

package net.viralpatel.springbootgraphqlsubscription.model;

import java.time.LocalDateTime;

public class StockPrice {
    private String symbol;
    private double price;
    private LocalDateTime timestamp;

    public StockPrice(String symbol, double price, LocalDateTime timestamp) {
        this.price = price;
        this.symbol = symbol;
        this.timestamp = timestamp;
    }

    // getter and setters
}

The above are just some plain old Java objects to carry the data from server to client.

4. Build and Execute

Start the Spring Boot application by running SpringBootGraphqlSubscriptionApplication class or by running gradle:

./gradlew bootRun

Once the Spring Boot app is started on default port 8080, open http://localhost:8080/graphiql

Try running following GraphQL subscription method and see the output.

subscription {
  stockPrice(symbol: "GOOG") {
    symbol
    price
    timestamp
  }
}

At first we will see the following output in the GraphiQL editor.

Your subscription data will appear here after server publication!

And after a few seconds we should start getting results back from the server.

{
  "stockPrice": {
    "symbol": "GOOG",
    "price": "0.7199789993583869",
    "timestamp": "2020-05-24T19:41:51.780299"
  }
}
Spring Boot GraphQL Subscription GraphiQL Editor

5. Source Code – Spring Boot GraphQL Subscription Example

Source code for the Spring Boot GraphQL Subscription API.

Github – spring-boot-graphql-subscription




Spring Boot DynamoDB Integration Test using Testcontainers

Spring Boot DynamoDB Integration Tests using TestContainers

1. Overview

Spring Boot Webflux DynamoDB Integration Tests – In this tutorial we will see how to set up integration tests for a Spring Boot Webflux project with DynamoDB using Testcontainers. This post uses the example from the previous Spring Boot Webflux DynamoDB tutorial. Let us add integration tests using Testcontainers and integrate DynamoDB tests into our Spring Boot project.

2. What is Testcontainers?

Testcontainers is a Java library that supports JUnit tests, providing lightweight, throwaway instances of common databases, Selenium web browsers, or anything else that can run in a Docker container.

The key here is Docker containers. Testcontainers allows us to use any Docker image and run it as a container within our integration tests. There are a number of standard Testcontainers modules available to test integration with technologies such as Elasticsearch, Kafka, RabbitMQ etc. We can also use any Docker image through the Testcontainers GenericContainer API.

One key thing to remember while using Testcontainers is that a local Docker installation is required to run these test cases. If you are running your integration tests in continuous integration (CI) pipelines, remember to use a machine instance instead of a Docker executor. As Testcontainers needs Docker to run, we cannot easily run it inside a Docker container in CI. Usually, standard CI platforms such as CircleCI do provide machine instances where you can run your integration tests.

3. Spring Boot DynamoDB Integration Tests using Testcontainers

3.1 Gradle test dependencies

Add the following dependencies for Testcontainers and its JUnit 5 support.

build.gradle

testCompile "org.testcontainers:testcontainers:1.14.1"
testCompile "org.testcontainers:junit-jupiter:1.14.1"

Or if you are using Maven, add following test dependencies.

pom.xml

<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>testcontainers</artifactId>
    <version>1.14.1</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>1.14.1</version>
    <scope>test</scope>
</dependency>

3.2 Define DynamoDB Docker @Container

Once the dependencies are defined, we can start using Testcontainers in JUnit. The first thing is to define a GenericContainer object with the official DynamoDB Docker image amazon/dynamodb-local. The withExposedPorts() method defines the port that DynamoDB listens on inside the Docker container. The GenericContainer object is annotated with the @Container annotation. This annotation works in conjunction with the @Testcontainers annotation to mark containers that should be managed by the Testcontainers extension.

private static final int DYNAMODB_PORT = 8000;

@Container
public static GenericContainer dynamodb =
        new GenericContainer<>("amazon/dynamodb-local")
            .withExposedPorts(DYNAMODB_PORT);

Testcontainers will start the Docker container with DynamoDB on the given DYNAMODB_PORT 8000; however, that is the internal port, which is mapped to a random port on the host that the AWS DynamoDB client from the Spring Boot app can connect to.

Since our Spring Boot app connects to DynamoDB on the host and port defined under the application.dynamodb.endpoint config in the application.yaml file, for the test case we need to override this config at runtime to point to the actual host and port where the container is running.

To achieve this we are using an ApplicationContextInitializer to override the application.dynamodb.endpoint property. We are using GenericContainer's getContainerIpAddress and getMappedPort methods to get the IP address and actual port number of DynamoDB running inside Docker.

public static class DynamoDBInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
    @Override
    public void initialize(ConfigurableApplicationContext ctx) {

        TestPropertyValues.of(
                String.format("application.dynamodb.endpoint: http://%s:%s",
                    dynamodb.getContainerIpAddress(), dynamodb.getMappedPort(DYNAMODB_PORT)))
                .applyTo(ctx);
    }
}

3.3 End to end integration test

In the following RoutesTests, we have one test case that creates a blank customers table and then uses Spring's WebTestClient to call the POST /customers API. The test case checks the response from the API. Since this is an integration test, the whole service is booted and the customer record is created in the DynamoDB database running in Testcontainers.

RoutesTests.java

package net.viralpatel.springbootwebfluxdynamodb;

import net.viralpatel.springbootwebfluxdynamodb.customer.Customer;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.util.TestPropertyValues;
import org.springframework.context.ApplicationContextInitializer;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.web.reactive.server.WebTestClient;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import reactor.core.publisher.Mono;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.model.*;

import java.util.concurrent.CompletableFuture;

import static org.hamcrest.Matchers.*;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@Testcontainers
@ContextConfiguration(initializers = RoutesTests.DynamoDBInitializer.class)
public class RoutesTests {

    private static final int DYNAMODB_PORT = 8000;

    @Autowired
    DynamoDbAsyncClient dynamoDbAsyncClient;

    @Container
    public static GenericContainer dynamodb =
            new GenericContainer<>("amazon/dynamodb-local:latest")
                .withExposedPorts(DYNAMODB_PORT);

    public static class DynamoDBInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        @Override
        public void initialize(ConfigurableApplicationContext ctx) {

            TestPropertyValues.of(
                    String.format("application.dynamodb.endpoint: http://%s:%s",
                        dynamodb.getContainerIpAddress(), dynamodb.getMappedPort(DYNAMODB_PORT)))
                    .applyTo(ctx);
        }
    }

    @Autowired
    public WebTestClient webTestClient;

    @Test
    public void shouldCreateCustomerWhenCustomerAPIInvoked() {

        // Create customers table in DynamoDB
        CompletableFuture<CreateTableResponse> createTable = dynamoDbAsyncClient.createTable(CreateTableRequest.builder()
                .tableName("customers")
                .attributeDefinitions(AttributeDefinition.builder().attributeName("customerId").attributeType("S").build())
                .keySchema(KeySchemaElement.builder().attributeName("customerId").keyType(KeyType.HASH).build())
                .provisionedThroughput(ProvisionedThroughput.builder().readCapacityUnits(5l).writeCapacityUnits(5l).build())
                .build());

        Mono.fromFuture(createTable).block();

        Customer customer = new Customer();
        customer.setName("John");
        customer.setCity("Sydney");
        customer.setEmail("john@example.com");

        webTestClient
                .post()
                .uri("/customers")
                .bodyValue(customer)
                .exchange()
                .expectStatus().is2xxSuccessful()
                .expectHeader().value("Location", is(not(blankOrNullString())));
    }
}

Let us go through the above code step by step and see what is going on.

@SpringBootTest – Specify that the test case is an integration test. Spring should load the full application context and make all beans available to the test case.

@Testcontainers – It is a JUnit Jupiter extension that activates automatic startup and stopping of containers used in a test case. The Testcontainers extension finds all fields that are annotated with @Container and calls their container lifecycle methods. Containers declared as static fields will be shared between test methods. They will be started only once before any test method is executed and stopped after the last test method has executed. Containers declared as instance fields will be started and stopped for every test method.

@ContextConfiguration – It overrides the Spring properties. We use it to override the host and port of DynamoDB that our DynamoDB client connects to. The DynamoDBInitializer static class defined within the test case overrides this property. We can move this logic into an abstract class and reuse it across multiple tests, as sketched below.
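
A hedged sketch of such a reusable base class, using the singleton-container style, could look like the following; the class name is ours, not from the article's repository. A test such as RoutesTests would then drop its own container and initializer, keep @SpringBootTest and the test methods, and simply extend this class (Testcontainers' reaper removes the container when the test JVM exits).

package net.viralpatel.springbootwebfluxdynamodb;

import org.springframework.boot.test.util.TestPropertyValues;
import org.springframework.context.ApplicationContextInitializer;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.test.context.ContextConfiguration;
import org.testcontainers.containers.GenericContainer;

// Reusable base class: the DynamoDB container is started once per JVM and every test
// extending this class gets application.dynamodb.endpoint pointing at that container.
@ContextConfiguration(initializers = AbstractDynamoDBTest.DynamoDBInitializer.class)
public abstract class AbstractDynamoDBTest {

    protected static final int DYNAMODB_PORT = 8000;

    protected static final GenericContainer dynamodb =
            new GenericContainer<>("amazon/dynamodb-local:latest")
                    .withExposedPorts(DYNAMODB_PORT);

    static {
        dynamodb.start();   // started eagerly, before the Spring context is initialized
    }

    public static class DynamoDBInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        @Override
        public void initialize(ConfigurableApplicationContext ctx) {
            TestPropertyValues.of(
                    String.format("application.dynamodb.endpoint: http://%s:%s",
                            dynamodb.getContainerIpAddress(), dynamodb.getMappedPort(DYNAMODB_PORT)))
                    .applyTo(ctx);
        }
    }
}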

4. Gotchas / Errors in Integration Test

If you haven't set up your local AWS CLI, you might get the following error when running the test case. This is because when the DynamoDB client initializes, it tries to find credentials to connect to DynamoDB. In our DynamoDBConfig we are using DefaultCredentialsProvider, which tries to find the credentials in a number of places.

Unable to load credentials from service endpoint

software.amazon.awssdk.core.exception.SdkClientException: Unable to load credentials from any of the providers in the chain AwsCredentialsProviderChain(credentialsProviders=[SystemPropertyCredentialsProvider(), EnvironmentVariableCredentialsProvider(), ProfileCredentialsProvider(), WebIdentityTokenCredentialsProvider(), ContainerCredentialsProvider(), InstanceProfileCredentialsProvider()]) : [SystemPropertyCredentialsProvider(): Unable to load credentials from system settings. Access key must be specified either via environment variable (AWS_ACCESS_KEY_ID) or system property (aws.accessKeyId)., EnvironmentVariableCredentialsProvider(): Unable to load credentials from system settings. Access key must be specified either via environment variable (AWS_ACCESS_KEY_ID) or system property (aws.accessKeyId)., ProfileCredentialsProvider(): Profile file contained no credentials for profile 'default': ProfileFile(profiles=[]), WebIdentityTokenCredentialsProvider(): Either the environment variable AWS_WEB_IDENTITY_TOKEN_FILE or the javaproperty aws.webIdentityTokenFile must be set., ContainerCredentialsProvider(): Cannot fetch credentials from container - neither AWS_CONTAINER_CREDENTIALS_FULL_URI or AWS_CONTAINER_CREDENTIALS_RELATIVE_URI environment variables are set., InstanceProfileCredentialsProvider(): Unable to load credentials from service endpoint.]
    at software.amazon.awssdk.core.exception.SdkClientException$BuilderImpl.build(SdkClientException.java:97) ~[sdk-core-2.10.40.jar:na]
    Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException: 
Error has been observed at the following site(s):
    |_ checkpoint ⇢ HTTP POST "/customers" [ExceptionHandlingWebHandler]
Stack trace:
        at software.amazon.awssdk.core.exception.SdkClientException$BuilderImpl.build(SdkClientException.java:97) ~[sdk-core-2.10.40.jar:na]

To resolve this error, either set up your AWS CLI so that the client can find the credentials, or define the following environment variables:

export AWS_SECRET_ACCESS_KEY=test
export AWS_ACCESS_KEY_ID=test

The environment variable values can be any non-empty string.

Read more on working with AWS Credentials

5. Source Code – Spring Boot DynamoDB Integration Tests

Source code for the Testcontainers integration test for DynamoDB and Spring Boot is in Github.

Github – spring-boot-webflux-dynamodb

References

  1. Testcontainers
  2. Spring Boot Webflux
  3. AWS DynamoDB Async Client



Spring Boot Webflux DynamoDB Tutorial


1. Overview

Spring Boot Webflux DynamoDB Tutorial – Let us integrate AWS DynamoDB with Spring Boot Webflux. In this tutorial we will integrate DynamoDB with Webflux in Spring Boot. Instead of using the default AWS sync client, which blocks the thread, we will use the async client with Webflux.

We are going to create a REST API with basic CRUD operations on a Customer entity. The customer record will be stored in AWS DynamoDB. Also we will use Webflux to connect with DynamoDB.

2. Project Structure

Let’s start by bootstrapping a new Spring Boot project using start.spring.io. Following project settings were selected in start.spring.io.

  • Project: Gradle Project
  • Language: Java
  • Spring Boot: 2.2.2
  • Dependencies: Spring Reactive Web

Once generated, import the project in your favorite IDE.

Update the build.gradle file as follows. We have added the AWS SDK DynamoDB 2.10.40 through dependencyManagement. Also note that spring-boot-starter-webflux is on the classpath.

build.gradle

plugins {
    id 'org.springframework.boot' version '2.2.2.RELEASE'
    id 'io.spring.dependency-management' version '1.0.8.RELEASE'
    id 'java'
}

group = 'net.viralpatel'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = '11'

repositories {
    mavenCentral()
    maven {
        url 'https://s3-us-west-2.amazonaws.com/dynamodb-local/release'
    }
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-webflux'
    implementation 'software.amazon.awssdk:dynamodb'
    testImplementation('org.springframework.boot:spring-boot-starter-test') {
        exclude group: 'org.junit.vintage', module: 'junit-vintage-engine'
    }
    testImplementation 'io.projectreactor:reactor-test'
}

dependencyManagement {
    imports {
        mavenBom 'software.amazon.awssdk:bom:2.10.40'
    }
}

test {
    useJUnitPlatform()
}

Add the following configuration in the application.yaml file. We have defined a couple of properties for the DynamoDB endpoint and table name. Note that the endpoint is pointing to our local DynamoDB. In production you might want to change this to point to your AWS region. We can also use Spring profiles to switch the values for these properties in different environments.

application.yaml

application:
    dynamodb:
        endpoint: http://localhost:8000
        customer_table: customers

Following is the directory structure for our REST API project. Note that the customer related classes are segregated in a customer package.

Spring Boot Webflux DynamoDB Project Structure

3. Setting up DynamoDB Locally without Docker

Before we proceed with the rest of the tutorial, we will set up a local DynamoDB instance where we can test our changes. Not relying on an AWS environment for DynamoDB speeds up the development process.

We will create customers table in local DynamoDB. Follow the steps at https://ift.tt/375nb2N

  1. Download the AWS DynamoDB Local JAR from above link and unzip it
  2. Run the local dynamodb jar
    java -Djava.library.path=./DynamoDBLocal_lib/ \
             -jar DynamoDBLocal.jar
  3. Create customer table in dynamodb.
    aws dynamodb create-table \
        --table-name customers \
        --attribute-definitions AttributeName=customerId,AttributeType=S \
        --key-schema AttributeName=customerId,KeyType=HASH \
        --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5 \
        --endpoint-url http://localhost:8000
  4. Verify the table is created.
    aws dynamodb list-tables \
        --endpoint-url http://localhost:8000
    Output:
    {
          "TableNames": [
              "customers"
          ]
    }

4. REST API in Spring Boot Webflux DynamoDB

Let us start by defining the Repository and DynamoDB configuration to access the data from DynamoDB. As noted earlier we will use DynamoDbAsyncClient to access DynamoDB.

4.1 Repository with AWS Async Client

The repository class contains basic CRUD methods to maintain the Customer entity. Note how we use dynamoDbAsyncClient to access DynamoDB through the GetItemRequest, DeleteItemRequest and ScanRequest APIs. Also note how we map the return type from DynamoDbAsyncClient, which is a CompletableFuture, to Reactor's Mono class using Mono.fromCompletionStage.

CustomerRepository.java

package net.viralpatel.springbootwebfluxdynamodb.customer;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Repository;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.model.*;

import java.util.Map;
import java.util.UUID;

@Repository
public class CustomerRepository {

    private DynamoDbAsyncClient dynamoDbAsyncClient;
    private String customerTable;

    public CustomerRepository(DynamoDbAsyncClient dynamoDbAsyncClient,
                              @Value("${application.dynamodb.customer_table}") String customerTable) {
        this.dynamoDbAsyncClient = dynamoDbAsyncClient;
        this.customerTable = customerTable;
    }

    public Flux<Customer> listCustomers() {

        ScanRequest scanRequest = ScanRequest.builder()
                .tableName(customerTable)
                .build();

        return Mono.fromCompletionStage(dynamoDbAsyncClient.scan(scanRequest))
                .map(scanResponse -> scanResponse.items())
                .map(CustomerMapper::fromList)
                .flatMapMany(Flux::fromIterable);
    }

    public Mono<Customer> createCustomer(Customer customer) {

        customer.setId(UUID.randomUUID().toString());

        PutItemRequest putItemRequest = PutItemRequest.builder()
                .tableName(customerTable)
                .item(CustomerMapper.toMap(customer))
                .build();

        return Mono.fromCompletionStage(dynamoDbAsyncClient.putItem(putItemRequest))
                .map(putItemResponse -> putItemResponse.attributes())
                .map(attributeValueMap -> customer);
    }

    public Mono<String> deleteCustomer(String customerId) {
        DeleteItemRequest deleteItemRequest = DeleteItemRequest.builder()
                .tableName(customerTable)
                .key(Map.of("customerId", AttributeValue.builder().s(customerId).build()))
                .build();

        return Mono.fromCompletionStage(dynamoDbAsyncClient.deleteItem(deleteItemRequest))
                .map(deleteItemResponse -> deleteItemResponse.attributes())
                .map(attributeValueMap -> customerId);
    }

    public Mono<Customer> getCustomer(String customerId) {
        GetItemRequest getItemRequest = GetItemRequest.builder()
                .tableName(customerTable)
                .key(Map.of("customerId", AttributeValue.builder().s(customerId).build()))
                .build();

        return Mono.fromCompletionStage(dynamoDbAsyncClient.getItem(getItemRequest))
                .map(getItemResponse -> getItemResponse.item())
                .map(CustomerMapper::fromMap);
    }

    public Mono<String> updateCustomer(String customerId, Customer customer) {

        customer.setId(customerId);
        PutItemRequest putItemRequest = PutItemRequest.builder()
                .tableName(customerTable)
                .item(CustomerMapper.toMap(customer))
                .build();

        return Mono.fromCompletionStage(dynamoDbAsyncClient.putItem(putItemRequest))
                .map(updateItemResponse -> customerId);
    }
}

In the following configuration class we create an instance of DynamoDbAsyncClient. Note how we mapped the endpoint from application.yaml using the Spring @Value annotation.

DynamoDBConfig.java

package net.viralpatel.springbootwebfluxdynamodb.config;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;

import java.net.URI;

@Configuration
public class DynamoDBConfig {

    @Bean
    public DynamoDbAsyncClient dynamoDbAsyncClient(
            @Value("${application.dynamodb.endpoint}") String dynamoDBEndpoint) {
        return DynamoDbAsyncClient.builder()
                .endpointOverride(URI.create(dynamoDBEndpoint))
                .credentialsProvider(DefaultCredentialsProvider.builder().build())
                .build();
    }
}

Below is a utility class to map the response from DynamoDB into our Customer entity class and vice versa. It's just boilerplate code.

CustomerMapper.java

package net.viralpatel.springbootwebfluxdynamodb.customer;

import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class CustomerMapper {

    public static List<Customer> fromList(List<Map<String, AttributeValue>> items) {
        return items.stream()
                .map(CustomerMapper::fromMap)
                .collect(Collectors.toList());
    }

    public static Customer fromMap(Map<String, AttributeValue> attributeValueMap) {
        Customer customer = new Customer();
        customer.setId(attributeValueMap.get("customerId").s());
        customer.setName(attributeValueMap.get("name").s());
        customer.setEmail(attributeValueMap.get("email").s());
        customer.setCity(attributeValueMap.get("city").s());
        return customer;
    }

    public static Map<String, AttributeValue> toMap(Customer customer) {
        return Map.of(
                "customerId", AttributeValue.builder().s(customer.getId()).build(),
                "name", AttributeValue.builder().s(customer.getName()).build(),
                "email", AttributeValue.builder().s(customer.getEmail()).build(),
                "city", AttributeValue.builder().s(customer.getCity()).build()
        );
    }
}

4.2 Service

Next we define a Spring Service class to abstract the repository from our routes (controller in the old world) layer. Note that we are calling CustomerRepository from the service class and mapping the response into a ServerResponse with an appropriate HTTP status.

CustomerService.java

package net.viralpatel.springbootwebfluxdynamodb.customer;

import org.springframework.stereotype.Service;
import org.springframework.web.reactive.function.BodyInserters;
import org.springframework.web.reactive.function.server.ServerRequest;
import org.springframework.web.reactive.function.server.ServerResponse;
import reactor.core.publisher.Mono;

import java.net.URI;
import java.util.stream.Collectors;

@Service
public class CustomerService {

    private CustomerRepository customerRepository;

    public CustomerService(CustomerRepository customerRepository) {
        this.customerRepository = customerRepository;
    }

    public Mono<ServerResponse> listCustomers(ServerRequest serverRequest) {
        return customerRepository.listCustomers()
                .collect(Collectors.toList())
                .flatMap(customers -> ServerResponse.ok().body(BodyInserters.fromValue(customers)));
    }

    public Mono<ServerResponse> createCustomer(ServerRequest serverRequest) {

        return serverRequest.bodyToMono(Customer.class)
                .flatMap(customer -> customerRepository.createCustomer(customer))
                .flatMap(customer -> ServerResponse.created(URI.create("/customers/" + customer.getId())).build());
    }

    public Mono<ServerResponse> deleteCustomer(ServerRequest serverRequest) {
        String customerId = serverRequest.pathVariable("customerId");

        return customerRepository.deleteCustomer(customerId)
                .flatMap(customer -> ServerResponse.ok().build());
    }

    public Mono<ServerResponse> getCustomer(ServerRequest serverRequest) {
        String customerId = serverRequest.pathVariable("customerId");

        return customerRepository.getCustomer(customerId)
                .flatMap(customer -> ServerResponse.ok().body(BodyInserters.fromValue(customer)));
    }

    public Mono<ServerResponse> updateCustomer(ServerRequest serverRequest) {
        String customerId = serverRequest.pathVariable("customerId");

        return serverRequest.bodyToMono(Customer.class)
                .flatMap(customer -> customerRepository.updateCustomer(customerId, customer))
                .flatMap(customer -> ServerResponse.ok().build());
    }
}

4.3 REST API Webflux Routes

Finally we glue everything up using Routes.java. In this class we utilize Spring Webflux RouterFunctions to define the route endpoints for Customer REST API.

We defined a set of routes using the GET, PUT, POST and DELETE request predicates in Spring Webflux RouterFunctions and mapped them to the appropriate CustomerService methods.

Routes.java

package net.viralpatel.springbootwebfluxdynamodb;

import net.viralpatel.springbootwebfluxdynamodb.customer.CustomerService;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.reactive.function.server.RouterFunction;
import org.springframework.web.reactive.function.server.ServerResponse;

import static org.springframework.web.reactive.function.server.RequestPredicates.*;
import static org.springframework.web.reactive.function.server.RouterFunctions.route;

@Configuration
public class Routes {

    private CustomerService customerService;

    public Routes(CustomerService customerService) {
        this.customerService = customerService;
    }

    @Bean
    RouterFunction<ServerResponse> customers() {
        return route(GET("/customers"), customerService::listCustomers)
                .andRoute(POST("/customers"), customerService::createCustomer)
                .andRoute(GET("/customers/{customerId}"), customerService::getCustomer)
                .andRoute(PUT("/customers/{customerId}"), customerService::updateCustomer)
                .andRoute(DELETE("/customers/{customerId}"), customerService::deleteCustomer);
    }
}

That’s All Folks

Build and execute the project using Gradle.

./gradlew bootRun

Make sure your local instance of DynamoDB is up and running and the customers table is created before starting the project. If everything is set up correctly, the customer API should be able to communicate with DynamoDB.

Open Postman and fire up the APIs.

Create new customer

Create Customer REST API Webflux DynamoDB

List all customers

List Customers REST API Webflux DynamoDB

Download – Spring Boot Webflux DynamoDB example

Source code of this project is available on Github.

Github – spring-boot-webflux-dynamodb





Chrome 123 to Replace GoogleUpdate.exe with New Updater.exe Tool

SUMMARY: Chrome 123.0 and later versions will use a new version of Google Update tool. The previous GoogleUpdate.exe will be replaced with n...