Integrating Amazon DynamoDB into your development process

by Aliaksandr Kavalevich - 30 Mar 2016

In this article I would like to talk about the integration of Amazon DynamoDB into your development process. It’s not about convincing you to use Amazon DynamoDB, as I assume that you have already made the decision to use it and have several questions about how to start development.

Development is not only about production code: it should also include integration tests and support different environments for more complex tests. How do you achieve that with a SaaS database? Especially for integration tests and local development, Amazon provides a local installation of DynamoDB. Using it for your tests and local development will save you a lot of money and also speed up the execution of your integration tests. In this post, I'll show you how to write your production code and integration tests, and how to separate different environments with Java, Spring Boot, and Gradle.

Let's start with a simple example. First we will need to create a Gradle build file with all the needed dependencies included:

apply plugin: 'java'
apply plugin: 'spring-boot'

buildscript {
    repositories {
        mavenCentral()
        maven {
            url "https://plugins.gradle.org/m2/"
        }
    }
    dependencies {
        classpath "org.springframework.boot:spring-boot-gradle-plugin:1.3.2.RELEASE"
    }
}

repositories {
    mavenCentral()
}

jar {
    baseName = 'application-gradle'
    version = '0.1.0'
}

dependencies {
    compile 'org.springframework.boot:spring-boot-starter-web:1.3.2.RELEASE'
    compile 'com.amazonaws:aws-java-sdk-dynamodb:1.10.52'
    compile 'com.github.derjust:spring-data-dynamodb:4.2.0'
    testCompile 'junit:junit:4.12'
    testCompile 'org.springframework.boot:spring-boot-starter-test'
}

bootRun {
    addResources = false
    main = 'org.article.Application'
}

test {
    testLogging {
        events "passed", "skipped", "failed"
    }
}

The two main dependencies for using DynamoDB are:

compile 'com.amazonaws:aws-java-sdk-dynamodb:1.10.52'
compile 'com.github.derjust:spring-data-dynamodb:4.2.0'

These dependencies add Amazon DynamoDB support: the first one brings the standard AWS client for DynamoDB, and the second adds Spring Data support on top of it.

The next step is creating a Spring Boot configuration class to configure the connection to DynamoDB. It should look like this:

package org.article.config;

import org.apache.commons.lang3.StringUtils;
import org.socialsignin.spring.data.dynamodb.core.DynamoDBOperations;
import org.socialsignin.spring.data.dynamodb.core.DynamoDBTemplate;
import org.socialsignin.spring.data.dynamodb.repository.config.EnableDynamoDBRepositories;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig;

@EnableDynamoDBRepositories(basePackages = "org.article.repo", dynamoDBOperationsRef = "dynamoDBOperations")
@Configuration
public class DynamoDBConfig {

    @Value("${amazonDynamodbEndpoint}")
    private String amazonDynamoDBEndpoint;

    @Value("${environment}")
    private String environment;

    @Value("${region}")
    private String region;

    @Bean
    public AmazonDynamoDB amazonDynamoDB() {
        final AmazonDynamoDBClient client = new AmazonDynamoDBClient();
        client.setSignerRegionOverride(Regions.fromName(region).getName());
        if (StringUtils.isNotEmpty(amazonDynamoDBEndpoint)) {
            client.setEndpoint(amazonDynamoDBEndpoint);
        }
        return client;
    }

    @Bean
    public DynamoDBOperations dynamoDBOperations() {
        final DynamoDBTemplate dynamoDBTemplate = new DynamoDBTemplate(amazonDynamoDB());
        final DynamoDBMapperConfig.TableNameOverride tableNameOverride =
                DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(environment);
        dynamoDBTemplate.setDynamoDBMapperConfig(new DynamoDBMapperConfig(tableNameOverride));
        return dynamoDBTemplate;
    }
}

Here we've created an Amazon DynamoDB client for a specified region. It's important to notice that Amazon provides DynamoDB in different regions, and those DBs are completely separate instances, so specifying the region matters. By default, the client uses region "us-east-1". We've also added the possibility to override the DynamoDB endpoint. For production code you don't need to specify this endpoint, since the client provided by Amazon constructs the appropriate URL itself; for tests you only need to specify the URL of your local DynamoDB installation.

Another decision to be made concerns environment separation. Each AWS account gives you exactly one DynamoDB instance per region, so there are two ways to run several environments (e.g. production and stage) in Amazon DynamoDB.

The first approach is to have two separate accounts, one per environment. The main benefit of this approach is that you get two completely separate environments. The main disadvantage is that you have to maintain two accounts and switch between them during development, which can be quite an overhead.

The second approach is to separate environments using table name prefixes. For example, for the table "User" there will be no real table named "User" in DynamoDB; instead, there will be tables like "prodUser" and "stageUser". The main benefit is precisely what the previous approach lacks: you don't have to switch between accounts.
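
The prefix mapping itself is trivial. Here is a minimal sketch; the class and method names are hypothetical, but the concatenation mirrors what DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix does:

```java
// Hypothetical helper illustrating the table-name-prefix scheme:
// the physical table name is simply the environment prefix
// concatenated with the logical table name.
public class TableNamePrefix {

    public static String physicalTableName(String environment, String logicalName) {
        return environment + logicalName;
    }

    public static void main(String[] args) {
        System.out.println(physicalTableName("prod", "User"));  // prodUser
        System.out.println(physicalTableName("stage", "User")); // stageUser
    }
}
```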

Now it's time to create a Java entity. It should look like this:


package org.article.domain;

import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;

@DynamoDBTable(tableName = "User")
public class User {

    @DynamoDBHashKey
    private String userName;

    @DynamoDBAttribute
    private String firstName;

    @DynamoDBAttribute
    private String lastName;

    public String getUserName() {
        return userName;
    }

    public void setUserName(final String userName) {
        this.userName = userName;
    }

    public String getFirstName() {
        return firstName;
    }

    public void setFirstName(final String firstName) {
        this.firstName = firstName;
    }

    public String getLastName() {
        return lastName;
    }

    public void setLastName(final String lastName) {
        this.lastName = lastName;
    }

    @Override
    public boolean equals(final Object o) {
        if (this == o) {
            return true;
        }
        if (!(o instanceof User)) {
            return false;
        }

        final User other = (User) o;
        if (userName != null ? !userName.equals(other.userName) : other.userName != null) {
            return false;
        }
        if (firstName != null ? !firstName.equals(other.firstName) : other.firstName != null) {
            return false;
        }
        return lastName != null ? lastName.equals(other.lastName) : other.lastName == null;
    }

    @Override
    public int hashCode() {
        int result = userName != null ? userName.hashCode() : 0;
        result = 31 * result + (firstName != null ? firstName.hashCode() : 0);
        result = 31 * result + (lastName != null ? lastName.hashCode() : 0);
        return result;
    }
}

The User entity looks like a usual POJO, with a couple of annotations added. The DynamoDBTable annotation indicates that this class corresponds to the table named "User". The table has exactly one hash key, so we marked the userName field with the DynamoDBHashKey annotation. We also have two attributes, firstName and lastName, annotated with DynamoDBAttribute. Please note that this entity has a partition key without a sort key.
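
If a table does need a sort key (a "range key" in the SDK's terminology), the entity simply gets one more annotated field. A hypothetical Order entity, for illustration only (it is not part of this example project):

```java
package org.article.domain;

import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBRangeKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;

// Hypothetical entity with a composite key: userName is the
// partition (hash) key, orderId is the sort (range) key.
@DynamoDBTable(tableName = "Order")
public class Order {

    @DynamoDBHashKey
    private String userName;

    @DynamoDBRangeKey
    private String orderId;

    @DynamoDBAttribute
    private String product;

    // getters and setters as in User, omitted for brevity
}
```

The corresponding table description would then list orderId with "KeyType": "RANGE" in its KeySchema.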

After the entity, we should create the UserRepository, which is just an interface extending CrudRepository. We specify the entity and the type of its id, and then it's done! Now we have basic CRUD operations implemented for us:

package org.article.repo;

import org.article.domain.User;
import org.socialsignin.spring.data.dynamodb.repository.EnableScan;
import org.springframework.data.repository.CrudRepository;

@EnableScan
public interface UserRepository extends CrudRepository<User, String> { }

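Beyond the inherited CRUD methods, spring-data-dynamodb also supports Spring Data's derived query methods. As a sketch (findByLastName is my own illustrative addition, not part of the original example), the repository could be extended like this:

```java
package org.article.repo;

import java.util.List;

import org.article.domain.User;
import org.socialsignin.spring.data.dynamodb.repository.EnableScan;
import org.springframework.data.repository.CrudRepository;

@EnableScan
public interface UserRepository extends CrudRepository<User, String> {

    // Derived query: spring-data-dynamodb generates the implementation
    // from the method name. Since lastName is not a key attribute,
    // this finder is backed by a table scan (hence @EnableScan).
    List<User> findByLastName(String lastName);
}
```
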
At the moment, we have both User entity and UserRepository with basic CRUD operations implemented, so it’s time to check them out with an integration test. First, we need to change our Gradle build to run an integration test. Local DynamoDB should start before tests and stop right after. We also need to create tables in DynamoDB. Although it doesn’t have a schema in the usual way, you still need to create tables and specify the partition key and sort key, if needed. To start local DynamoDB, create tables, and stop the local DynamoDB instance, there’s a nice Maven plugin here. The main disadvantage of this plugin is that it can create tables only for local DynamoDB instances, but not for the real Amazon environment. As you'll need to create tables for the production environment anyway, I believe this should be done exactly the same way as you’d do it for your local instance (that's why I don't use this plugin). What I like to do is start a local DynamoDB instance from a Docker container. If you don't have Docker yet, you can find instructions on how to set it up here.

The first Gradle task that we need is to start the local DynamoDB instance:

task startDB(type: Exec) {
    commandLine "bash", "-c", "docker run -p ${dbPort}:${dbPort} -d tray/dynamodb-local -inMemory -sharedDb -port ${dbPort}"
}

This will start DynamoDB on the port specified by the dbPort property. We use two parameters to start the DB: "inMemory" tells DynamoDB to run completely in memory, and "sharedDb" makes it use a single shared database instead of a separate one per region and credentials.

The next step is to create the tables. We will keep each table description in JSON format in the database directory, so for now there is just one file, User.json:

{
    "AttributeDefinitions": [
        {
            "AttributeName": "userName",
            "AttributeType": "S"
        }
    ],
    "TableName": "User",
    "KeySchema": [
        {
            "AttributeName": "userName",
            "KeyType": "HASH"
        }
    ],
    "ProvisionedThroughput": {
        "ReadCapacityUnits": 10,
        "WriteCapacityUnits": 10
    }
}

We also need to add a Gradle task to create the User table in DynamoDB:

task deployDB(type: Exec) {
    mustRunAfter startDB
    def dynamoDBEndpoint
    if (amazonDynamodbEndpoint != "") {
        dynamoDBEndpoint = "--endpoint-url=${amazonDynamodbEndpoint}"
    } else {
        dynamoDBEndpoint = ""
    }
    commandLine "bash", "-c", "for f in \$(find database -name \"*.json\"); do aws --region ${region} dynamodb create-table ${dynamoDBEndpoint} --cli-input-json \"\$(cat \$f | sed -e 's/TableName\": \"/TableName\": \"${environment}/g')\"; done"
}

You’ll notice that when using this task we can create tables both for local and real AWS environments. We can also create tables for different environments in the cloud. To do this, all we need is to pass the right parameters. To deploy to the real AWS, you need to execute the following command:

gradle deployDB -Penv=prod

Here the env parameter is the name of the property file. Then, we need the task to stop DynamoDB.

task stopDB(type: Exec) {
    commandLine "bash", "-c", "id=\$(docker ps | grep \"tray/dynamodb-local\" | awk '{print \$1}'); if [[ \${id} ]]; then docker stop \$id; fi"
}

Let's configure those tasks to start DynamoDB before an integration test and to stop it right after:

test.dependsOn startDB
test.dependsOn deployDB
test.finalizedBy stopDB

Before we can execute our first integration test, we need to create two property files – the first one for production usage and the second one for the integration tests:

src/main/resources/prod.properties:

amazonDynamodbEndpoint=
environment=prod
region=eu-west-1
dbPort=
AWS_ACCESS_KEY=realValue
AWS_SECRET_ACCESS_KEY=realValue

src/test/resources/application.properties:

amazonDynamodbEndpoint=http://localhost:7777
environment=local
region=eu-west-1
dbPort=7777
AWS_ACCESS_KEY=nonEmpty
AWS_SECRET_ACCESS_KEY=nonEmpty

Now everything is ready to write our first integration test. It looks very simple: we try to save two entities and then get one of them by ID:

package org.article.repo;

import static org.junit.Assert.assertEquals;
import org.article.Application;
import org.article.domain.User;
import org.junit.After;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.SpringApplicationConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@SpringApplicationConfiguration(classes = Application.class)
@RunWith(SpringJUnit4ClassRunner.class)
public class UserRepositoryIT {

    @Autowired
    private UserRepository userRepository;

    @After
    public void tearDown() {
        userRepository.deleteAll();
    }

    @Test
    public void findByUserName() {
        final User user = new User();
        user.setUserName("userName");
        user.setFirstName("firstName");
        user.setLastName("lastName");
        userRepository.save(user);

        final User actualUser = userRepository.findOne(user.getUserName());
        assertEquals(user, actualUser);
    }
}

We can now execute this test with Gradle from the command line:

    gradle clean build

And there you go! In this post, I've shown you how to manage a local instance of Amazon DynamoDB during the development process using Spring Data and the Gradle build tool. I've also shown you how to create tables for a real AWS environment and how to separate environments using the standard Java client for AWS. The source code for this article can be found on our GitHub page.
