To add data to the database, you essentially assert a fact to be true, and Datomic records both the fact and when it was asserted. When the fact is no longer true, you can retract it.
However, the database doesn’t forget the fact. It knows the fact was true from the time it was added until it was retracted. The power of Datomic is that you can then query the database from any point in time, and see what facts were true then.
For local development, we long used PostgreSQL as the backend store for Datomic, because it is free to use and easy to run locally. In production, we use Amazon's DynamoDB as the backend store, as it provides us with high scalability. This has been, and continues to be, a great solution.
However, over time we started to experience CPU spikes on our Datomic peers.
We knew that we had a bottleneck where we were storing large JSON documents in Datomic, so we decided to store them directly in DynamoDB instead. This allowed us to scale DynamoDB to fit the specific outlier use case, while leaving our Datomic instance to worry about everything else.
DynamoDB pricing was a barrier to providing access to every developer on the team. Fortunately, Amazon offers DynamoDB Local, which is a self-hosted server that implements the DynamoDB API. DynamoDB Local is by no means production quality software — it isn’t intended to be — but it solved our issue.
With that handled, we decided to simplify our development setup by removing PostgreSQL and using DynamoDB Local as our local Datomic backend.
Running DynamoDB Local is relatively easy. All you need to do is download it from Amazon, and execute the jar as shown below. There are a handful of options and flags that you can specify, so check out the documentation for more information about them.
java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb
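Once it's running, a quick sanity check is to list tables against the local endpoint with the AWS CLI (assuming you have it installed; the credential values here are arbitrary, since DynamoDB Local doesn't validate them):

```shell
# DynamoDB Local accepts any credentials, but the CLI insists they exist.
export AWS_ACCESS_KEY_ID=local
export AWS_SECRET_ACCESS_KEY=local

# A fresh instance should report an empty list of tables.
aws dynamodb list-tables --endpoint-url http://localhost:8000 --region us-east-1 \
  || echo "DynamoDB Local doesn't appear to be running on port 8000"
```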
In order to prepare DynamoDB Local to be used by Datomic, you’ll need to initialize the Datomic transactor. Assuming $DATOMIC_HOME points to your Datomic installation, you can use a modified version of $DATOMIC_HOME/config/samples/ddb-transactor.properties.
Copy that file somewhere on your file system. For my example, I'll assume it's in the current directory and called ddb-local.properties. Open the file, and you'll need to change a few of the properties. The protocol property needs to be ddb-local. Set aws-dynamodb-table to whatever you want your DynamoDB table name to be. Comment out the aws-dynamodb-region property, as that is not needed. Lastly, uncomment the aws-dynamodb-override-endpoint property and set it to localhost:8000, or whatever host and port your DynamoDB Local instance is running on. (You may also need to set the license-key property to contain your license key.)
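After those edits, the relevant lines of ddb-local.properties look something like this (the table name is a placeholder — use whatever name you chose):

```properties
protocol=ddb-local
aws-dynamodb-table=my-datomic-table
# aws-dynamodb-region=us-east-1
aws-dynamodb-override-endpoint=localhost:8000
# license-key=<your license key>
```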
With that all set, run the following command to set up your Datomic transactor with DynamoDB Local. Even though you're running DynamoDB Local, it still requires a pair of AWS keys and a region, because it uses them internally.
$DATOMIC_HOME/bin/datomic ensure-transactor ddb-local.properties ddb-local.properties
(Yes, you do need to provide the properties file argument twice.)
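Since DynamoDB Local never validates the keys, any non-empty values will do. One way to supply them — assuming your Datomic version picks up credentials from the standard AWS environment variables (older versions take the keys in the properties file instead) — is:

```shell
# DynamoDB Local ignores the actual values, so throwaway credentials work.
# AWS_DEFAULT_REGION is an arbitrary choice; any valid region name is fine.
export AWS_ACCESS_KEY_ID=local
export AWS_SECRET_ACCESS_KEY=local
export AWS_DEFAULT_REGION=us-east-1
```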
Once that is set, spin up Datomic using the same properties file and you can begin using your DynamoDB Local-backed instance of Datomic!
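Concretely — assuming ddb-local.properties is still in the current directory and DynamoDB Local is up — that looks like:

```shell
# Start the Datomic transactor against the DynamoDB Local backend.
# This process stays in the foreground, so run it in a separate terminal.
$DATOMIC_HOME/bin/transactor ddb-local.properties
```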
If you don’t use CircleCI, you can stop here. We use CircleCI to run our test suite on every pull request, and we quickly realized that while most of our tests run against a memory-backed instance of Datomic, a few of them interact directly with DynamoDB. So it was essential to get DynamoDB Local working with CircleCI.
Luckily, a bit of hunting around revealed that configuration is as easy as downloading and running the jar file. You just need to specify the commands in your circle.yml file (our solution was modified from this answer on the CircleCI support forums).
- curl -k -L -o dynamodb-local.tar.gz https://s3-us-west-2.amazonaws.com/dynamodb-local/dynamodb_local_latest.tar.gz
- tar -xzf dynamodb-local.tar.gz
- java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb
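Putting those commands in context, here is a sketch of how they might sit in a circle.yml (1.0-style) file — the `background: true` modifier keeps DynamoDB Local running while the tests execute; the exact section may differ in your configuration:

```yaml
dependencies:
  post:
    - curl -k -L -o dynamodb-local.tar.gz https://s3-us-west-2.amazonaws.com/dynamodb-local/dynamodb_local_latest.tar.gz
    - tar -xzf dynamodb-local.tar.gz
    - java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb:
        background: true
```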
While we don’t have any hard data just yet, the CPU spikes we noticed seem to have been greatly reduced. All in all, we are happy to have made this long-overdue switch, and look forward to DynamoDB becoming even more usable for local development in the future.