[dynamodb] Dynamodb refactor (#9937)

* [dynamodb] Update to SDKv2 Enhanced Client

In addition, introduce a new, simpler table layout that uses a single
table for all items and a more efficient data encoding (saves some read capacity).

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Time To Live (TTL) support with new table schema

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Support QuantityType

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] suppress null warnings in tests

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Optimized query performance

Similar to https://github.com/openhab/openhab-addons/pull/8938,
avoid calling Item.getUnit() repeatedly when querying data.

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Support for Group items

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Update copyright to 2021

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Removing TODO comments and add javadoc

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] javadoc

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Readability improved in TableCreatingPutItem

Also documenting the full retry logic.

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] verify fixes

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Remove slf4j from explicit dependencies

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Remove jackson from pom.xml, add as feature dep

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] bnd.importpackage tuned

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] abort query() immediately if not configured to avoid NPE

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] less chatty diagnostics

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] xml formatting

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] corrected logger class

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] null checks

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] netty client configured

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] bnd not to filter out importpackage org.slf4j.impl

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] cfg bundle group id

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Remove usage of org.apache.commons

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Remove extra prints from test

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Reducing @SuppressWarnings with generics

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] README extra space removed

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] spotless

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Removed unnecessary logging

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] encapsulation

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] removed unnecessary NonNullByDefault({}) ctr-injected field

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] null annotations

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] less verbose logging in tests

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Prefer Collections.emptyList over List.of()

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] less verbose call

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Visitor to return values (simplifies the code)

Fewer warnings suppressed

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] comments for remaining warning suppressions

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] README tuning, typo fixing

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Using less verbose syntax

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] simplified logging on errors

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Code review comments

Avoiding the null checker while keeping the code more compact

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Null safety

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] configuration label and description formatting

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] xml indentation with tabs

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] @Nullable 1-line annotation with class fields

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] No need to override credentials per request

The client has the credentials set at build time

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] set API timeouts no matter what

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] adding exception message

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] static logger

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] dependency

- comments clarifying the logic of properties
- adding netty to dep.noembedding to ensure it is not compiled in

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] ensure correct jackson and netty versions using dependencyMgt

Specifically for development and testing

See 051c764789
for further discussion why this is needed.

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] avoid google collections

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] jackson-dataformat-cbor not jackson-cbor

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] also restrict netty-transport-native-epoll linux-x86_64 version

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] referring to dynamodb.cfg similar to other bundles

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] bnd.importpackage to excl. reactivestreams and typesafe.netty

These are compiled-in dependencies, and thus we do not want to have them in
OSGi Import-Package.

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* Update bundles/org.openhab.persistence.dynamodb/src/main/resources/OH-INF/config/config.xml

Co-authored-by: Fabian Wolter <github@fabian-wolter.de>
Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* Update bundles/org.openhab.persistence.dynamodb/src/main/resources/OH-INF/config/config.xml

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

Co-authored-by: Fabian Wolter <github@fabian-wolter.de>

* [dynamodb] remove netty-codec-http2 as it is included in tp-netty

See https://github.com/openhab/openhab-core/pull/2257/

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] removed duplicate in bnd.importpackage

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] slf4j-api marked as provided to remove dep errors in runtime

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

Co-authored-by: Fabian Wolter <github@fabian-wolter.de>
Sami Salonen 2021-04-10 23:13:38 +03:00, committed by GitHub
parent 08602c04b4
commit b675160486
58 changed files with 4407 additions and 1592 deletions


@@ -0,0 +1 @@
/src/test/resources/native-libs/


@@ -15,22 +15,10 @@ This service is provided "AS IS", and the user takes full responsibility of any
 ## Table of Contents

-<!-- Using MarkdownTOC plugin for Sublime Text to update the table of contents (TOC) -->
-<!-- MarkdownTOC depth=3 autolink=true bracket=round -->
-- [Prerequisites](#prerequisites)
-- [Setting Up an Amazon Account](#setting-up-an-amazon-account)
-- [Configuration](#configuration)
-- [Basic configuration](#basic-configuration)
-- [Configuration Using Credentials File](#configuration-using-credentials-file)
-- [Advanced Configuration](#advanced-configuration)
-- [Details](#details)
-- [Tables Creation](#tables-creation)
-- [Caveats](#caveats)
-- [Developer Notes](#developer-notes)
-- [Updating Amazon SDK](#updating-amazon-sdk)
-<!-- /MarkdownTOC -->
+{::options toc_levels="2..4"/}
+- TOC
+{:toc}

 ## Prerequisites
@@ -55,9 +43,53 @@ Please also note possible [Free Tier](https://aws.amazon.com/free/) benefits.
 ## Configuration

-This service can be configured in the file `services/dynamodb.cfg`.
+This service can be configured using the MainUI or using the persistence configuration file `services/dynamodb.cfg`.

-### Basic configuration
+In order to configure the persistence service, you need to configure two things:
+
+1. The table schema revision to use
+2. AWS credentials to access DynamoDB
+
+### Table schema
+
+The DynamoDB persistence addon provides two different table schemas: "new" and "legacy".
+As the name implies, "legacy" is offered for backwards-compatibility purposes for old users who want to access data that is already stored in DynamoDB.
+All users are advised to transition to the "new" table schema, which is more optimized.
+
+At the moment there is no supported way to migrate data from the old format to the new one.
#### New table schema

Configure the addon to use the new schema by setting the `table` parameter (the name of the table).
Only one table is created for all data. The table has the following attributes:

| Attribute | Type   | Required | Description                                     |
| --------- | ------ | :------: | ----------------------------------------------- |
| `i`       | String | Yes      | Item name                                       |
| `t`       | Number | Yes      | Timestamp in milliseconds since the epoch       |
| `s`       | String | No       | State of the item, stored as a DynamoDB string  |
| `n`       | Number | No       | State of the item, stored as a DynamoDB number  |
| `exp`     | Number | No       | Expiry time for the item, in epoch seconds      |

Other notes:

- `i` and `t` form the composite primary key (partition key, sort key) of the table.
- Only one of the `s` or `n` attributes is specified, never both. Most item states are converted to the number type for the most compact representation.
- Compared to the legacy format, data overhead is minimized by using short attribute names, numeric timestamps and a single table.
- The `exp` attribute is used with the DynamoDB Time To Live (TTL) feature to automatically delete old data.
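As an illustration only (the addon itself is implemented in Java; the helper below is hypothetical), the mapping of a persisted state onto the compact new-schema attributes described above can be sketched as:

```python
import time

def encode_item(item_name, state, timestamp_ms=None, expire_days=None):
    """Sketch: map one persisted state to the new-schema attribute set."""
    attributes = {
        "i": item_name,                                # partition key: item name
        "t": timestamp_ms or int(time.time() * 1000),  # sort key: ms since epoch
    }
    # Numeric states go to "n", everything else to "s" -- never both.
    if isinstance(state, (int, float)):
        attributes["n"] = state
    else:
        attributes["s"] = str(state)
    if expire_days is not None:
        # "exp" is in epoch *seconds*; DynamoDB TTL deletes rows past it
        attributes["exp"] = attributes["t"] // 1000 + expire_days * 24 * 3600
    return attributes
```

Note the unit difference: the sort key `t` is in milliseconds while the TTL attribute `exp` is in seconds, as DynamoDB TTL requires.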
#### Legacy schema
Configure the addon to use the legacy schema by setting the `tablePrefix` parameter.
- When an item is persisted via this service, a table is created (if necessary).
- The service will create at most two tables for different item types.
- The tables will be named `<tablePrefix><item-type>`, where the `<item-type>` is either `bigdecimal` (numeric items) or `string` (string and complex items).
- Each table will have three columns: `itemname` (item name), `timeutc` (in ISO 8601 format with millisecond accuracy), and `itemstate` (either a number or string representing item state).
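The legacy naming convention above can be sketched as follows (a hypothetical helper for illustration, not the addon's actual code):

```python
def legacy_table_name(table_prefix, item_is_numeric):
    """Sketch: legacy schema uses <tablePrefix><item-type> with two item types."""
    item_type = "bigdecimal" if item_is_numeric else "string"
    return table_prefix + item_type
```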
+### Credentials Configuration Using Access Key and Secret Key

 | Property  | Default | Required | Description |
 | --------- | ------- | :------: | ----------- |
@@ -65,7 +97,7 @@ This service can be configured in the file `services/dynamodb.cfg`.
 | secretKey | | Yes | secret key as shown in [Setting up Amazon account](#setting-up-an-amazon-account). |
 | region | | Yes | AWS region ID as described in [Setting up Amazon account](#setting-up-an-amazon-account). The region needs to match the region that was used to create the user. |

-### Configuration Using Credentials File
+### Credentials Configuration Using Credentials File

 Alternatively, instead of specifying `accessKey` and `secretKey`, one can configure a configuration profile file.
@@ -95,47 +127,27 @@ aws_secret_access_key=testSecretKey

 In addition to the configuration properties above, the following are also available:

-| Property                   | Default    | Required | Description                                                |
-| -------------------------- | ---------- | :------: | ---------------------------------------------------------- |
-| readCapacityUnits          | 1          | No       | read capacity for the created tables                       |
-| writeCapacityUnits         | 1          | No       | write capacity for the created tables                      |
-| tablePrefix                | `openhab-` | No       | table prefix used in the name of created tables            |
-| bufferCommitIntervalMillis | 1000       | No       | Interval to commit (write) buffered data. In milliseconds. |
-| bufferSize                 | 1000       | No       | Internal buffer size in datapoints which is used to batch writes to DynamoDB every `bufferCommitIntervalMillis`. |
-
-Typically you should not need to modify parameters related to buffering.
+| Property           | Default | Required | Description                                                 |
+| ------------------ | ------- | :------: | ----------------------------------------------------------- |
+| expireDays         | (null)  | No       | Expire time for data in days (relative to stored timestamp) |
+| readCapacityUnits  | 1       | No       | read capacity for the created tables                        |
+| writeCapacityUnits | 1       | No       | write capacity for the created tables                       |

 Refer to Amazon documentation on [provisioned throughput](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.ProvisionedThroughput.html) for details on read/write capacity.

+The DynamoDB Time to Live (TTL) setting is configured using `expireDays`.
+
 All item- and event-related configuration is done in the file `persistence/dynamodb.persist`.

 ## Details
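Putting the pieces together, a hypothetical `services/dynamodb.cfg` using the new table schema might look like this (all values are placeholders):

```ini
# services/dynamodb.cfg -- illustrative example, values are placeholders
region=eu-west-1
accessKey=ACCESS_KEY
secretKey=SECRET_KEY
# new table schema: single table for all items
table=openhab
# optional: expire data after roughly one year using DynamoDB TTL
expireDays=365
```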
-### Tables Creation
-
-When an item is persisted via this service, a table is created (if necessary).
-Currently, the service will create at most two tables for different item types.
-The tables will be named `<tablePrefix><item-type>`, where the `<item-type>` is either `bigdecimal` (numeric items) or `string` (string and complex items).
-Each table will have three columns: `itemname` (item name), `timeutc` (in ISO 8601 format with millisecond accuracy), and `itemstate` (either a number or string representing item state).
-
-## Buffering
-
-By default, the service is asynchronous, which means that data is not written immediately to DynamoDB but instead buffered in-memory.
-The size of the buffer, in terms of datapoints, can be configured with `bufferSize`.
-Every `bufferCommitIntervalMillis` the whole buffer of data is flushed to DynamoDB.
-
-It is recommended to have the buffering enabled since the synchronous behaviour (writing data immediately) might have an adverse impact on the whole system when many items are persisted at the same time.
-The buffering can be disabled by setting `bufferSize` to zero.
-
-The defaults should be suitable in many use cases.
-
 ### Caveats

 When the tables are created, the read/write capacity is configured according to configuration.
 However, the service does not modify the capacity of existing tables.
 As a workaround, you can modify the read/write capacity of existing tables using the [Amazon console](https://aws.amazon.com/console/).

+A similar caveat applies to the DynamoDB Time to Live (TTL) setting `expireDays`.
+
 ## Developer Notes

 ### Updating Amazon SDK
@@ -143,20 +155,15 @@ As a workaround, you can modify the read/write capacity of existing tables using

 1. Clean `lib/*`
 2. Update SDK version in `scripts/fetch_sdk_pom.xml`. You can use the [maven online repository browser](https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-dynamodb) to find the latest version available online.
 3. `scripts/fetch_sdk.sh`
-4. Copy `scripts/target/site/dependencies.html` and `scripts/target/dependency/*.jar` to `lib/`
-5. Generate `build.properties` entries
-   `ls lib/*.jar | python -c "import sys; print(' ' + ',\\\\\\n '.join(map(str.strip, sys.stdin.readlines())))"`
-6. Generate `META-INF/MANIFEST.MF` `Bundle-ClassPath` entries
-   `ls lib/*.jar | python -c "import sys; print(' ' + ',\\n '.join(map(str.strip, sys.stdin.readlines())))"`
-7. Generate `.classpath` entries
-   `ls lib/*.jar | python -c "import sys;pre='<classpathentry exported=\"true\" kind=\"lib\" path=\"';post='\"/>'; print('\\t' + pre + (post + '\\n\\t' + pre).join(map(str.strip, sys.stdin.readlines())) + post)"`
+4. Copy the printed dependencies to `pom.xml`

 After these changes, it's good practice to run integration tests (against live AWS DynamoDB) in the `org.openhab.persistence.dynamodb.test` bundle.
 See README.md in the test bundle for more information on how to execute the tests.

 ### Running integration tests

-To run integration tests, one needs to provide AWS credentials.
+When running integration tests, a local temporary DynamoDB server is used, emulating the real AWS DynamoDB API.
+One can configure AWS credentials to run the tests against the real AWS DynamoDB for the most realistic tests.

 Eclipse instructions

@@ -168,7 +175,4 @@ Eclipse instructions

 -DDYNAMODBTEST_REGION=REGION-ID
 -DDYNAMODBTEST_ACCESS=ACCESS-KEY
 -DDYNAMODBTEST_SECRET=SECRET
 ````

-The tests will create tables with prefix `dynamodb-integration-tests-`.
-Note that when tests are begun, all data is removed from that table!


@@ -15,118 +15,303 @@
 	<name>openHAB Add-ons :: Bundles :: Persistence Service :: DynamoDB</name>

 	<properties>
-		<dep.noembedding>jackson-core,jackson-annotations,jackson-databind,jackson-dataformat-cbor</dep.noembedding>
-		<bnd.importpackage>!com.amazonaws.*,!org.joda.convert.*,!com.sun.org.apache.xpath.*,!kotlin,!org.apache.log.*,!org.bouncycastle.*,!org.apache.avalon.*</bnd.importpackage>
+		<!-- Avoid declaring OSGI-imports for packages that are part of embedded/compiled dependencies, declared below under
+			<dependencies> -->
+		<bnd.importpackage>!com.amazonaws.*,!com.sun.org.apache.xpath.*,!kotlin,!org.apache.log.*,!org.bouncycastle.*,!org.joda.convert.*,!scala.util.*,!software.amazon.*,!org.reactivestreams,!com.typesafe.netty</bnd.importpackage>
+		<!-- We do not want to embed/compile in dependencies that are declared as OSGi imports (feature.xml). This includes e.g.
+			netty & jackson. Let's ensure this by listing the relevant packages with dep.noembedding -->
+		<dep.noembedding>netty-common,netty-transport,netty-transport-native-epoll,netty-transport-native-unix-common,netty-buffer,netty-resolver,netty-codec,netty-codec-http,netty-codec-http2,netty-handler,jackson-core,jackson-annotations,jackson-dataformat-cbor,jackson-databind</dep.noembedding>
+		<!-- netty version matching the openhab.tp-netty feature version -->
+		<netty.version>4.1.42.Final</netty.version>
+		<slf4j.version>1.7.21</slf4j.version>
 	</properties>
<!--Custom repository for DynamoDBLocal -->
<repositories>
<repository>
<id>dynamodb-local-repo</id>
<name>DynamoDB Local Release Repository</name>
<url>https://s3-us-west-2.amazonaws.com/dynamodb-local/release</url>
</repository>
</repositories>
<build>
<plugins>
<!-- Copy sqlite native libraries for tests -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>3.1.2</version>
<executions>
<execution>
<id>copy</id>
<phase>test-compile</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<includeScope>test</includeScope>
<includeTypes>so,dll,dylib</includeTypes>
<outputDirectory>${project.basedir}/src/test/resources/native-libs</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<!-- Let's ensure the correct versions with dependencyManagement.
We want to run our tests and compilations using netty and jackson version used in the runtime (provided as OSGi features).
slf4j-api version is locked to core-version. Also: slf4j comes via openHAB logging, so setting it here as provided to
have the right OSGi imports.
-->
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-buffer</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-codec-http2</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-codec-http</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-codec</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-common</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-handler</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-resolver</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-transport</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-transport-native-epoll</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-transport-native-epoll</artifactId>
<classifier>linux-x86_64</classifier>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-transport-native-unix-common</artifactId>
<version>${netty.version}</version>
</dependency>
</dependencies>
</dependencyManagement>
 	<dependencies>
-		<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-core -->
-		<dependency>
-			<groupId>com.amazonaws</groupId>
-			<artifactId>aws-java-sdk-core</artifactId>
-			<version>1.11.213</version>
-			<exclusions>
-				<exclusion>
-					<groupId>com.fasterxml.jackson.core</groupId>
-					<artifactId>*</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>com.fasterxml.jackson.dataformat</groupId>
-					<artifactId>*</artifactId>
-				</exclusion>
-			</exclusions>
-		</dependency>
-		<dependency>
-			<groupId>com.amazonaws</groupId>
-			<artifactId>aws-java-sdk-dynamodb</artifactId>
-			<version>1.11.213</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-kms -->
-		<dependency>
-			<groupId>com.amazonaws</groupId>
-			<artifactId>aws-java-sdk-kms</artifactId>
-			<version>1.11.213</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-s3 -->
-		<dependency>
-			<groupId>com.amazonaws</groupId>
-			<artifactId>aws-java-sdk-s3</artifactId>
-			<version>1.11.213</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/com.amazonaws/jmespath-java -->
-		<dependency>
-			<groupId>com.amazonaws</groupId>
-			<artifactId>jmespath-java</artifactId>
-			<version>1.11.213</version>
-			<exclusions>
-				<exclusion>
-					<groupId>com.fasterxml.jackson.core</groupId>
-					<artifactId>*</artifactId>
-				</exclusion>
-			</exclusions>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/org.apache.httpcomponents/httpclient -->
-		<dependency>
-			<groupId>org.apache.httpcomponents</groupId>
-			<artifactId>httpclient</artifactId>
-			<version>4.5.2</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/software.amazon.ion/ion-java -->
-		<dependency>
-			<groupId>software.amazon.ion</groupId>
-			<artifactId>ion-java</artifactId>
-			<version>1.0.2</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/org.apache.httpcomponents/httpcore -->
-		<dependency>
-			<groupId>org.apache.httpcomponents</groupId>
-			<artifactId>httpcore</artifactId>
-			<version>4.4.4</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/commons-logging/commons-logging -->
-		<dependency>
-			<groupId>commons-logging</groupId>
-			<artifactId>commons-logging</artifactId>
-			<version>1.1.3</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/commons-codec/commons-codec -->
-		<dependency>
-			<groupId>commons-codec</groupId>
-			<artifactId>commons-codec</artifactId>
-			<version>1.9</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/joda-time/joda-time -->
-		<dependency>
-			<groupId>joda-time</groupId>
-			<artifactId>joda-time</artifactId>
-			<version>2.8.1</version>
-		</dependency>
-		<!-- The following dependencies are required for test resolution -->
-		<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-annotations -->
-		<dependency>
-			<groupId>com.fasterxml.jackson.core</groupId>
-			<artifactId>jackson-annotations</artifactId>
-			<version>${jackson.version}</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-core -->
-		<dependency>
-			<groupId>com.fasterxml.jackson.core</groupId>
-			<artifactId>jackson-core</artifactId>
-			<version>${jackson.version}</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
-		<dependency>
-			<groupId>com.fasterxml.jackson.core</groupId>
-			<artifactId>jackson-databind</artifactId>
-			<version>${jackson.version}</version>
-		</dependency>
-		<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.dataformat/jackson-dataformat-cbor -->
-		<dependency>
-			<groupId>com.fasterxml.jackson.dataformat</groupId>
-			<artifactId>jackson-dataformat-cbor</artifactId>
-			<version>${jackson.version}</version>
-		</dependency>
+		<!-- Test dependencies -->
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>DynamoDBLocal</artifactId>
+			<version>1.13.5</version>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>org.eclipse.jetty</groupId>
+			<artifactId>jetty-util</artifactId>
+			<version>8.1.12.v20130726</version>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>org.eclipse.jetty</groupId>
+			<artifactId>jetty-server</artifactId>
+			<version>8.1.12.v20130726</version>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>com.almworks.sqlite4java</groupId>
+			<artifactId>sqlite4java</artifactId>
+			<version>[1.0, 2.0)</version>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>com.almworks.sqlite4java</groupId>
+			<artifactId>sqlite4java-win32-x86</artifactId>
+			<type>dll</type>
+			<version>[1.0, 2.0)</version>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>com.almworks.sqlite4java</groupId>
+			<artifactId>sqlite4java-win32-x64</artifactId>
+			<type>dll</type>
+			<version>[1.0, 2.0)</version>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>com.almworks.sqlite4java</groupId>
+			<artifactId>libsqlite4java-osx</artifactId>
+			<type>dylib</type>
+			<version>[1.0, 2.0)</version>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>com.almworks.sqlite4java</groupId>
+			<artifactId>libsqlite4java-linux-i386</artifactId>
+			<type>so</type>
+			<version>[1.0, 2.0)</version>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>com.almworks.sqlite4java</groupId>
+			<artifactId>libsqlite4java-linux-amd64</artifactId>
+			<type>so</type>
+			<version>[1.0, 2.0)</version>
+			<scope>test</scope>
+		</dependency>
+		<!-- SDK (runtime) dependencies -->
+		<!-- NOTE: this list is generated automatically using scripts/fetch_sdk.sh to
+			facilitate easier SDK updates. Do not edit the below manually -->
+		<!-- NOTE 2: all transitive dependencies of the AWS SDK are included as direct dependencies of this bundle,
+			since we want to embed them in the bundle. The ones specified in dep.noembedding are not embedded though, and not even
+			listed here.
+		-->
+		<dependency>
+			<groupId>com.typesafe.netty</groupId>
+			<artifactId>netty-reactive-streams-http</artifactId>
+			<version>2.0.4</version>
+		</dependency>
+		<dependency>
+			<groupId>com.typesafe.netty</groupId>
+			<artifactId>netty-reactive-streams</artifactId>
+			<version>2.0.4</version>
+		</dependency>
+		<dependency>
+			<groupId>org.reactivestreams</groupId>
+			<artifactId>reactive-streams</artifactId>
+			<version>1.0.2</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>annotations</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>auth</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>aws-core</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>aws-json-protocol</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>dynamodb-enhanced</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>dynamodb</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>http-client-spi</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>metrics-spi</artifactId>
+			<version>2.15.77</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>netty-nio-client</artifactId>
+			<version>2.15.77</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>profiles</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>protocol-core</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>regions</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>sdk-core</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>utils</artifactId>
+			<version>2.15.56</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.eventstream</groupId>
+			<artifactId>eventstream</artifactId>
+			<version>1.0.1</version>
+		</dependency>
 	</dependencies>
</project>


@@ -1,5 +1,25 @@
 #!/usr/bin/env bash

 DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"

-mvn -f $DIR/fetch_sdk_pom.xml clean process-sources project-info-reports:dependencies
+echo "Copy these to pom.xml"
echo ""
for dependency in $(mvn -f "$DIR/fetch_sdk_pom.xml" clean process-sources dependency:tree|grep -E ':(test|compile)$' | grep -o '[[:lower:]].*'|sort); do
readarray -d : -t components <<< "$dependency"
scope_without_newline="$(echo "${components[4]}"|tr -d '\n')"
cat << EOF
<dependency>
<groupId>${components[0]}</groupId>
<artifactId>${components[1]}</artifactId>
<version>${components[3]}</version>
EOF
if [[ "${components[2]}" != "jar" ]]; then
echo " <type>${components[2]}</type>"
fi
if [[ "${scope_without_newline}" != "compile" ]]; then
echo " <scope>${scope_without_newline}</scope>"
fi
cat << EOF
</dependency>
EOF
done
echo "Check $DIR/target/site/dependencies.html and $DIR/target/dependency"
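The loop above emits one `<dependency>` block per `groupId:artifactId:type:version:scope` coordinate printed by `mvn dependency:tree`, skipping the defaults (`jar` type, `compile` scope). The same split can be sketched in stdlib-only Java (class and method names here are illustrative, not part of the add-on):

```java
public class CoordinateDemo {
    // Split a Maven coordinate as printed by dependency:tree, e.g.
    // "software.amazon.awssdk:sdk-core:jar:2.15.56:compile"
    static String toDependencyXml(String coordinate) {
        String[] c = coordinate.split(":");
        StringBuilder xml = new StringBuilder();
        xml.append("<dependency>")
                .append("<groupId>").append(c[0]).append("</groupId>")
                .append("<artifactId>").append(c[1]).append("</artifactId>")
                .append("<version>").append(c[3]).append("</version>");
        if (!"jar".equals(c[2])) { // only emit non-default packaging
            xml.append("<type>").append(c[2]).append("</type>");
        }
        if (!"compile".equals(c[4])) { // only emit non-default scope
            xml.append("<scope>").append(c[4]).append("</scope>");
        }
        return xml.append("</dependency>").toString();
    }

    public static void main(String[] args) {
        System.out.println(toDependencyXml("software.amazon.awssdk:sdk-core:jar:2.15.56:compile"));
    }
}
```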


@@ -8,9 +8,80 @@
     <dependencies>
         <dependency>
-            <groupId>com.amazonaws</groupId>
-            <artifactId>aws-java-sdk-dynamodb</artifactId>
-            <version>1.11.213</version>
+            <groupId>software.amazon.awssdk</groupId>
+            <artifactId>dynamodb-enhanced</artifactId>
+            <version>2.15.56</version>
+            <exclusions>
+                <!-- exclude artifacts available via openhab jackson feature -->
+                <exclusion>
+                    <groupId>org.slf4j</groupId>
+                    <artifactId>slf4j-api</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>com.fasterxml.jackson.core</groupId>
+                    <artifactId>jackson-annotations</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>com.fasterxml.jackson.core</groupId>
+                    <artifactId>jackson-core</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>com.fasterxml.jackson.core</groupId>
+                    <artifactId>jackson-databind</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>software.amazon.awssdk</groupId>
+            <artifactId>netty-nio-client</artifactId>
+            <version>2.15.77</version>
+            <exclusions>
+                <!-- exclude artifacts available via openhab netty feature, or otherwise added in feature.xml -->
+                <exclusion>
+                    <groupId>org.slf4j</groupId>
+                    <artifactId>slf4j-api</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-buffer</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-codec</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-codec-http</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-codec-http2</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-common</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-handler</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-resolver</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-transport</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-transport-native-epoll</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>io.netty</groupId>
+                    <artifactId>netty-transport-native-unix-common</artifactId>
+                </exclusion>
+            </exclusions>
         </dependency>
     </dependencies>
@@ -34,4 +105,4 @@
         </plugin>
     </plugins>
 </build>
 </project>


@@ -5,6 +5,7 @@
     <feature name="openhab-persistence-dynamodb" description="DynamoDB Persistence" version="${project.version}">
         <feature>openhab-runtime-base</feature>
         <feature dependency="true">openhab.tp-jackson</feature>
+        <feature dependency="true">openhab.tp-netty</feature>
         <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.dynamodb/${project.version}</bundle>
         <configfile finalname="${openhab.conf}/services/dynamodb.cfg" override="false">mvn:org.openhab.addons.features.karaf/org.openhab.addons.features.karaf.openhab-addons-external/${project.version}/cfg/dynamodb</configfile>
     </feature>


@@ -1,129 +0,0 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.util.UUID;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.openhab.core.items.Item;
import org.openhab.core.persistence.PersistenceService;
import org.openhab.core.types.State;
import org.openhab.core.types.UnDefType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* Abstract class for buffered persistence services
*
* @param <T> Type of the state as accepted by the AWS SDK.
*
* @author Sami Salonen - Initial contribution
* @author Kai Kreuzer - Migration to 3.x
*
*/
@NonNullByDefault
public abstract class AbstractBufferedPersistenceService<T> implements PersistenceService {
private static final long BUFFER_OFFER_TIMEOUT_MILLIS = 500;
private final Logger logger = LoggerFactory.getLogger(AbstractBufferedPersistenceService.class);
protected @Nullable BlockingQueue<T> buffer;
private boolean writeImmediately;
protected void resetWithBufferSize(int bufferSize) {
int capacity = Math.max(1, bufferSize);
buffer = new ArrayBlockingQueue<>(capacity, true);
writeImmediately = bufferSize == 0;
}
protected abstract T persistenceItemFromState(String name, State state, ZonedDateTime time);
protected abstract boolean isReadyToStore();
protected abstract void flushBufferedData();
@Override
public void store(Item item) {
store(item, null);
}
@Override
public void store(Item item, @Nullable String alias) {
long storeStart = System.currentTimeMillis();
String uuid = UUID.randomUUID().toString();
if (item.getState() instanceof UnDefType) {
logger.debug("Undefined item state received. Not storing item {}.", item.getName());
return;
}
if (!isReadyToStore()) {
return;
}
if (buffer == null) {
throw new IllegalStateException("Buffer not initialized with resetWithBufferSize. Bug?");
}
ZonedDateTime time = ZonedDateTime.ofInstant(Instant.ofEpochMilli(storeStart), ZoneId.systemDefault());
String realName = item.getName();
String name = (alias != null) ? alias : realName;
State state = item.getState();
T persistenceItem = persistenceItemFromState(name, state, time);
logger.trace("store() called with item {}, which was converted to {} [{}]", item, persistenceItem, uuid);
if (writeImmediately) {
logger.debug("Writing immediately item {} [{}]", realName, uuid);
// We want to write everything immediately
// Synchronous behavior to ensure buffer does not get full.
synchronized (this) {
boolean buffered = addToBuffer(persistenceItem);
assert buffered;
flushBufferedData();
}
} else {
long bufferStart = System.currentTimeMillis();
boolean buffered = addToBuffer(persistenceItem);
if (buffered) {
logger.debug("Buffered item {} in {} ms. Total time for store(): {} [{}]", realName,
System.currentTimeMillis() - bufferStart, System.currentTimeMillis() - storeStart, uuid);
} else {
logger.debug(
"Buffer is full. Writing buffered data immediately and trying again. Consider increasing bufferSize");
// Buffer is full, commit it immediately
flushBufferedData();
boolean buffered2 = addToBuffer(persistenceItem);
if (buffered2) {
logger.debug("Buffered item in {} ms (2nd try, flushed buffer in-between) [{}]",
System.currentTimeMillis() - bufferStart, uuid);
} else {
// The unlikely case happened -- buffer got full again immediately
logger.warn("Buffering failed for the second time -- Too small bufferSize? Discarding data [{}]",
uuid);
}
}
}
}
protected boolean addToBuffer(T persistenceItem) {
try {
return buffer != null && buffer.offer(persistenceItem, BUFFER_OFFER_TIMEOUT_MILLIS, TimeUnit.MILLISECONDS);
} catch (InterruptedException e) {
logger.warn("Interrupted when trying to buffer data! Dropping data");
return false;
}
}
}
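The deleted `AbstractBufferedPersistenceService` above implemented the write path: states were buffered in a fair, bounded `ArrayBlockingQueue`, and on a full buffer the data was flushed once and the offer retried. A stdlib-only sketch of that offer-flush-retry strategy (class and member names are illustrative; the flushed list stands in for the real DynamoDB write):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class BufferedStore<T> {
    private final BlockingQueue<T> buffer;
    final List<T> flushed = new ArrayList<>(); // stands in for the real DynamoDB batch write

    BufferedStore(int capacity) {
        // fair queue with capacity of at least 1, as in resetWithBufferSize()
        buffer = new ArrayBlockingQueue<>(Math.max(1, capacity), true);
    }

    // Mirrors addToBuffer(): offer with a timeout, dropping data when interrupted
    boolean offer(T item) {
        try {
            return buffer.offer(item, 500, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    // Mirrors the store() fallback: on a full buffer, flush once and retry
    boolean store(T item) {
        if (offer(item)) {
            return true;
        }
        flush();
        return offer(item);
    }

    void flush() {
        buffer.drainTo(flushed);
    }

    public static void main(String[] args) {
        BufferedStore<String> store = new BufferedStore<>(1);
        store.store("a");
        store.store("b"); // buffer full: "a" is flushed, then "b" is buffered
        System.out.println(store.flushed);
    }
}
```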


@ -13,6 +13,8 @@
@@ -13,6 +13,8 @@
 package org.openhab.persistence.dynamodb.internal;

 import java.math.BigDecimal;
+import java.time.Duration;
+import java.time.Instant;
 import java.time.ZoneId;
 import java.time.ZonedDateTime;
 import java.time.format.DateTimeFormatter;
@@ -20,12 +22,18 @@ import java.time.format.DateTimeParseException;
 import java.util.HashMap;
 import java.util.Map;

+import javax.measure.Quantity;
+import javax.measure.Unit;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
 import org.openhab.core.items.Item;
 import org.openhab.core.library.items.CallItem;
 import org.openhab.core.library.items.ColorItem;
 import org.openhab.core.library.items.ContactItem;
 import org.openhab.core.library.items.DateTimeItem;
 import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.ImageItem;
 import org.openhab.core.library.items.LocationItem;
 import org.openhab.core.library.items.NumberItem;
 import org.openhab.core.library.items.PlayerItem;
@@ -40,17 +48,23 @@ import org.openhab.core.library.types.OpenClosedType;
 import org.openhab.core.library.types.PercentType;
 import org.openhab.core.library.types.PlayPauseType;
 import org.openhab.core.library.types.PointType;
+import org.openhab.core.library.types.QuantityType;
 import org.openhab.core.library.types.RewindFastforwardType;
 import org.openhab.core.library.types.StringListType;
 import org.openhab.core.library.types.StringType;
-import org.openhab.core.library.types.UpDownType;
 import org.openhab.core.persistence.HistoricItem;
 import org.openhab.core.types.State;
 import org.openhab.core.types.UnDefType;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

-import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter;
+import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
+import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
+import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
+import software.amazon.awssdk.enhanced.dynamodb.TableSchema;
+import software.amazon.awssdk.enhanced.dynamodb.mapper.StaticAttributeTags;
+import software.amazon.awssdk.enhanced.dynamodb.mapper.StaticTableSchema.Builder;
+import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

 /**
  * Base class for all DynamoDBItem. Represents openHAB Item serialized in a suitable format for the database
@@ -59,33 +73,71 @@ import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter;
  *
  * @author Sami Salonen - Initial contribution
  */
+@NonNullByDefault
 public abstract class AbstractDynamoDBItem<T> implements DynamoDBItem<T> {

+    private static final BigDecimal REWIND_BIGDECIMAL = new BigDecimal("-1");
+    private static final BigDecimal PAUSE_BIGDECIMAL = new BigDecimal("0");
+    private static final BigDecimal PLAY_BIGDECIMAL = new BigDecimal("1");
+    private static final BigDecimal FAST_FORWARD_BIGDECIMAL = new BigDecimal("2");
     private static final ZoneId UTC = ZoneId.of("UTC");
+    public static final ZonedDateTimeStringConverter ZONED_DATE_TIME_CONVERTER_STRING = new ZonedDateTimeStringConverter();
+    public static final ZonedDateTimeMilliEpochConverter ZONED_DATE_TIME_CONVERTER_MILLIEPOCH = new ZonedDateTimeMilliEpochConverter();
     public static final DateTimeFormatter DATEFORMATTER = DateTimeFormatter.ofPattern(DATE_FORMAT).withZone(UTC);
+    protected static final Class<@Nullable Long> NULLABLE_LONG = (Class<@Nullable Long>) Long.class;

-    private static final String UNDEFINED_PLACEHOLDER = "<org.openhab.core.types.UnDefType.UNDEF>";
+    public static AttributeConverter<ZonedDateTime> getTimestampConverter(boolean legacy) {
+        return legacy ? ZONED_DATE_TIME_CONVERTER_STRING : ZONED_DATE_TIME_CONVERTER_MILLIEPOCH;
+    }

-    private static final Map<Class<? extends Item>, Class<? extends DynamoDBItem<?>>> ITEM_CLASS_MAP = new HashMap<>();
-    static {
-        ITEM_CLASS_MAP.put(CallItem.class, DynamoDBStringItem.class);
-        ITEM_CLASS_MAP.put(ContactItem.class, DynamoDBBigDecimalItem.class);
-        ITEM_CLASS_MAP.put(DateTimeItem.class, DynamoDBStringItem.class);
-        ITEM_CLASS_MAP.put(LocationItem.class, DynamoDBStringItem.class);
-        ITEM_CLASS_MAP.put(NumberItem.class, DynamoDBBigDecimalItem.class);
-        ITEM_CLASS_MAP.put(RollershutterItem.class, DynamoDBBigDecimalItem.class);
-        ITEM_CLASS_MAP.put(StringItem.class, DynamoDBStringItem.class);
-        ITEM_CLASS_MAP.put(SwitchItem.class, DynamoDBBigDecimalItem.class);
-        ITEM_CLASS_MAP.put(DimmerItem.class, DynamoDBBigDecimalItem.class); // inherited from SwitchItem (!)
-        ITEM_CLASS_MAP.put(ColorItem.class, DynamoDBStringItem.class); // inherited from DimmerItem
-        ITEM_CLASS_MAP.put(PlayerItem.class, DynamoDBStringItem.class);
+    protected static <C extends AbstractDynamoDBItem<?>> Builder<C> getBaseSchemaBuilder(Class<C> clz, boolean legacy) {
+        return TableSchema.builder(clz).addAttribute(String.class,
+                a -> a.name(legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME_LEGACY : DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME)
+                        .getter(AbstractDynamoDBItem::getName).setter(AbstractDynamoDBItem::setName)
+                        .tags(StaticAttributeTags.primaryPartitionKey()))
+                .addAttribute(ZonedDateTime.class, a -> a
+                        .name(legacy ? DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC_LEGACY : DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC)
+                        .getter(AbstractDynamoDBItem::getTime).setter(AbstractDynamoDBItem::setTime)
+                        .tags(StaticAttributeTags.primarySortKey()).attributeConverter(getTimestampConverter(legacy)));
     }

-    public static final Class<DynamoDBItem<?>> getDynamoItemClass(Class<? extends Item> itemClass)
-            throws NullPointerException {
-        @SuppressWarnings("unchecked")
-        Class<DynamoDBItem<?>> dtoclass = (Class<DynamoDBItem<?>>) ITEM_CLASS_MAP.get(itemClass);
+    private static final Map<Class<? extends Item>, Class<? extends DynamoDBItem<?>>> ITEM_CLASS_MAP_LEGACY = new HashMap<>();
+    static {
+        ITEM_CLASS_MAP_LEGACY.put(CallItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(ContactItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(DateTimeItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(LocationItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(NumberItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(RollershutterItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(StringItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(SwitchItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(DimmerItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(ColorItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_LEGACY.put(PlayerItem.class, DynamoDBStringItem.class);
+    }
+
+    private static final Map<Class<? extends Item>, Class<? extends DynamoDBItem<?>>> ITEM_CLASS_MAP_NEW = new HashMap<>();
+    static {
+        ITEM_CLASS_MAP_NEW.put(CallItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_NEW.put(ContactItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_NEW.put(DateTimeItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_NEW.put(LocationItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_NEW.put(NumberItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_NEW.put(RollershutterItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_NEW.put(StringItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_NEW.put(SwitchItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_NEW.put(DimmerItem.class, DynamoDBBigDecimalItem.class);
+        ITEM_CLASS_MAP_NEW.put(ColorItem.class, DynamoDBStringItem.class);
+        ITEM_CLASS_MAP_NEW.put(PlayerItem.class, DynamoDBBigDecimalItem.class); // Different from LEGACY
+    }
+
+    public static final Class<? extends DynamoDBItem<?>> getDynamoItemClass(Class<? extends Item> itemClass,
+            boolean legacy) throws NullPointerException {
+        Class<? extends DynamoDBItem<?>> dtoclass = (legacy ? ITEM_CLASS_MAP_LEGACY : ITEM_CLASS_MAP_NEW)
+                .get(itemClass);
         if (dtoclass == null) {
             throw new IllegalArgumentException(String.format("Unknown item class %s", itemClass));
         }
@@ -101,123 +153,313 @@ public abstract class AbstractDynamoDBItem<T> implements DynamoDBItem<T> {
      * @author Sami Salonen - Initial contribution
      *
      */
-    public static final class ZonedDateTimeConverter implements DynamoDBTypeConverter<String, ZonedDateTime> {
+    public static final class ZonedDateTimeStringConverter implements AttributeConverter<ZonedDateTime> {

         @Override
-        public String convert(ZonedDateTime time) {
-            return DATEFORMATTER.format(time.withZoneSameInstant(UTC));
+        public AttributeValue transformFrom(ZonedDateTime time) {
+            return AttributeValue.builder().s(toString(time)).build();
         }

         @Override
-        public ZonedDateTime unconvert(String serialized) {
+        public ZonedDateTime transformTo(@NonNullByDefault({}) AttributeValue serialized) {
+            return transformTo(serialized.s());
+        }
+
+        @Override
+        public EnhancedType<ZonedDateTime> type() {
+            return EnhancedType.<ZonedDateTime> of(ZonedDateTime.class);
+        }
+
+        @Override
+        public AttributeValueType attributeValueType() {
+            return AttributeValueType.S;
+        }
+
+        public String toString(ZonedDateTime time) {
+            return DATEFORMATTER.format(time.withZoneSameInstant(UTC));
+        }
+
+        public ZonedDateTime transformTo(String serialized) {
             return ZonedDateTime.parse(serialized, DATEFORMATTER);
         }
     }

-    private static final ZonedDateTimeConverter zonedDateTimeConverter = new ZonedDateTimeConverter();
+    /**
+     * Custom converter for serialization/deserialization of ZonedDateTime.
+     *
+     * Serialization: ZonedDateTime is first converted to UTC and then stored as milliepochs
+     *
+     * @author Sami Salonen - Initial contribution
+     *
+     */
+    public static final class ZonedDateTimeMilliEpochConverter implements AttributeConverter<ZonedDateTime> {
+
+        @Override
+        public AttributeValue transformFrom(ZonedDateTime time) {
+            return AttributeValue.builder().n(toEpochMilliString(time)).build();
+        }
+
+        @Override
+        public ZonedDateTime transformTo(@NonNullByDefault({}) AttributeValue serialized) {
+            return transformTo(serialized.n());
+        }
+
+        @Override
+        public EnhancedType<ZonedDateTime> type() {
+            return EnhancedType.<ZonedDateTime> of(ZonedDateTime.class);
+        }
+
+        @Override
+        public AttributeValueType attributeValueType() {
+            return AttributeValueType.N;
+        }
+
+        public static String toEpochMilliString(ZonedDateTime time) {
+            return String.valueOf(time.toInstant().toEpochMilli());
+        }
+
+        public static BigDecimal toBigDecimal(ZonedDateTime time) {
+            return new BigDecimal(toEpochMilliString(time));
+        }
+
+        public ZonedDateTime transformTo(String serialized) {
+            return transformTo(Long.valueOf(serialized));
+        }
+
+        public ZonedDateTime transformTo(Long epochMillis) {
+            return Instant.ofEpochMilli(epochMillis).atZone(UTC);
+        }
+    }
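The diff above leaves the add-on with two timestamp encodings: the legacy schema stores a formatted UTC string, the new schema stores epoch milliseconds as a DynamoDB number (which is what saves read capacity). A stdlib-only sketch of the two encodings follows; the exact legacy pattern is defined by `DATE_FORMAT` in `DynamoDBItem`, so the pattern used here is an assumption for illustration:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class TimestampCodecDemo {
    static final ZoneId UTC = ZoneId.of("UTC");
    // Assumed stand-in for the legacy DATE_FORMAT string encoding
    static final DateTimeFormatter FMT = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").withZone(UTC);

    // New schema: timestamp stored as a number attribute (epoch milliseconds)
    static String toEpochMilliString(ZonedDateTime t) {
        return String.valueOf(t.toInstant().toEpochMilli());
    }

    static ZonedDateTime fromEpochMilliString(String s) {
        return Instant.ofEpochMilli(Long.parseLong(s)).atZone(UTC);
    }

    public static void main(String[] args) {
        ZonedDateTime t = ZonedDateTime.parse("2021-01-02T03:04:05.678Z");
        String n = toEpochMilliString(t);
        // round trip through the numeric encoding, rendered with the string-style formatter
        System.out.println(n + " -> " + FMT.format(fromEpochMilliString(n)));
    }
}
```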
     private final Logger logger = LoggerFactory.getLogger(AbstractDynamoDBItem.class);

     protected String name;
-    protected T state;
+    protected @Nullable T state;
     protected ZonedDateTime time;
+    private @Nullable Integer expireDays;
+    private @Nullable Long expiry;

-    public AbstractDynamoDBItem(String name, T state, ZonedDateTime time) {
+    public AbstractDynamoDBItem(String name, @Nullable T state, ZonedDateTime time, @Nullable Integer expireDays) {
         this.name = name;
         this.state = state;
         this.time = time;
+        if (expireDays != null && expireDays <= 0) {
+            throw new IllegalArgumentException();
+        }
+        this.expireDays = expireDays;
+        this.expiry = expireDays == null ? null : time.toInstant().plus(Duration.ofDays(expireDays)).getEpochSecond();
     }
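The constructor change above carries the Time To Live support: the `expiry` attribute is the item timestamp plus `expireDays`, expressed as epoch seconds, which is the format DynamoDB TTL expects. The arithmetic in isolation (method name is illustrative):

```java
import java.time.Duration;
import java.time.ZonedDateTime;

public class ExpiryDemo {
    // Mirrors the constructor: expiry (epoch seconds) = item time + expireDays days,
    // or null when no TTL is configured
    static Long expiryEpochSecond(ZonedDateTime time, Integer expireDays) {
        if (expireDays != null && expireDays <= 0) {
            throw new IllegalArgumentException("expireDays must be positive");
        }
        return expireDays == null ? null
                : time.toInstant().plus(Duration.ofDays(expireDays)).getEpochSecond();
    }

    public static void main(String[] args) {
        ZonedDateTime t = ZonedDateTime.parse("2021-01-01T00:00:00Z");
        System.out.println(expiryEpochSecond(t, 1)); // one day after t, as epoch seconds
    }
}
```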
-    public static DynamoDBItem<?> fromState(String name, State state, ZonedDateTime time) {
-        if (state instanceof DecimalType && !(state instanceof HSBType)) {
-            // also covers PercentType which is inherited from DecimalType
-            return new DynamoDBBigDecimalItem(name, ((DecimalType) state).toBigDecimal(), time);
-        } else if (state instanceof OnOffType) {
-            return new DynamoDBBigDecimalItem(name,
-                    ((OnOffType) state) == OnOffType.ON ? BigDecimal.ONE : BigDecimal.ZERO, time);
-        } else if (state instanceof OpenClosedType) {
-            return new DynamoDBBigDecimalItem(name,
-                    ((OpenClosedType) state) == OpenClosedType.OPEN ? BigDecimal.ONE : BigDecimal.ZERO, time);
-        } else if (state instanceof UpDownType) {
-            return new DynamoDBBigDecimalItem(name,
-                    ((UpDownType) state) == UpDownType.UP ? BigDecimal.ONE : BigDecimal.ZERO, time);
-        } else if (state instanceof DateTimeType) {
-            return new DynamoDBStringItem(name,
-                    zonedDateTimeConverter.convert(((DateTimeType) state).getZonedDateTime()), time);
-        } else if (state instanceof UnDefType) {
-            return new DynamoDBStringItem(name, UNDEFINED_PLACEHOLDER, time);
-        } else if (state instanceof StringListType) {
-            return new DynamoDBStringItem(name, state.toFullString(), time);
-        } else {
-            // HSBType, PointType, PlayPauseType and StringType
-            return new DynamoDBStringItem(name, state.toFullString(), time);
-        }
-    }
+    /**
+     * Convert given state to target state.
+     *
+     * If conversion fails, IllegalStateException is raised.
+     * Use this method when you do not expect conversion to fail.
+     *
+     * @param <T> state type to convert to
+     * @param state state to convert
+     * @param clz class of the resulting state
+     * @return state as type T
+     * @throws IllegalStateException on failing conversion
+     */
+    private static <T extends State> T convert(State state, Class<T> clz) {
+        @Nullable
+        T converted = state.as(clz);
+        if (converted == null) {
+            throw new IllegalStateException(String.format("Could not convert %s '%s' into %s",
+                    state.getClass().getSimpleName(), state, clz.getClass().getSimpleName()));
+        }
+        return converted;
+    }
+
+    public static DynamoDBItem<?> fromStateLegacy(Item item, ZonedDateTime time) {
+        String name = item.getName();
+        State state = item.getState();
+        if (item instanceof PlayerItem) {
+            return new DynamoDBStringItem(name, state.toFullString(), time, null);
+        } else {
+            // Apart from PlayerItem, the values are serialized to dynamodb number/strings in the same way in legacy
+            // delegate to fromStateNew
+            return fromStateNew(item, time, null);
+        }
+    }
+
+    public static DynamoDBItem<?> fromStateNew(Item item, ZonedDateTime time, @Nullable Integer expireDays) {
+        String name = item.getName();
+        State state = item.getState();
+        if (item instanceof CallItem) {
+            return new DynamoDBStringItem(name, convert(state, StringListType.class).toFullString(), time, expireDays);
+        } else if (item instanceof ContactItem) {
+            return new DynamoDBBigDecimalItem(name, convert(state, DecimalType.class).toBigDecimal(), time, expireDays);
+        } else if (item instanceof DateTimeItem) {
+            return new DynamoDBStringItem(name,
+                    ZONED_DATE_TIME_CONVERTER_STRING.toString(((DateTimeType) state).getZonedDateTime()), time,
+                    expireDays);
+        } else if (item instanceof ImageItem) {
+            throw new IllegalArgumentException("Unsupported item " + item.getClass().getSimpleName());
+        } else if (item instanceof LocationItem) {
+            return new DynamoDBStringItem(name, state.toFullString(), time, expireDays);
+        } else if (item instanceof NumberItem) {
+            return new DynamoDBBigDecimalItem(name, convert(state, DecimalType.class).toBigDecimal(), time, expireDays);
+        } else if (item instanceof PlayerItem) {
+            if (state instanceof PlayPauseType) {
+                switch ((PlayPauseType) state) {
+                    case PLAY:
+                        return new DynamoDBBigDecimalItem(name, PLAY_BIGDECIMAL, time, expireDays);
+                    case PAUSE:
+                        return new DynamoDBBigDecimalItem(name, PAUSE_BIGDECIMAL, time, expireDays);
+                    default:
+                        throw new IllegalArgumentException("Unexpected enum with PlayPauseType: " + state.toString());
+                }
+            } else if (state instanceof RewindFastforwardType) {
+                switch ((RewindFastforwardType) state) {
+                    case FASTFORWARD:
+                        return new DynamoDBBigDecimalItem(name, FAST_FORWARD_BIGDECIMAL, time, expireDays);
+                    case REWIND:
+                        return new DynamoDBBigDecimalItem(name, REWIND_BIGDECIMAL, time, expireDays);
+                    default:
+                        throw new IllegalArgumentException(
+                                "Unexpected enum with RewindFastforwardType: " + state.toString());
+                }
+            } else {
+                throw new IllegalStateException(
+                        String.format("Unexpected state type %s with PlayerItem", state.getClass().getSimpleName()));
+            }
+        } else if (item instanceof RollershutterItem) {
+            // Normalize UP/DOWN to %
+            return new DynamoDBBigDecimalItem(name, convert(state, PercentType.class).toBigDecimal(), time, expireDays);
+        } else if (item instanceof StringItem) {
+            if (state instanceof StringType) {
+                return new DynamoDBStringItem(name, ((StringType) state).toString(), time, expireDays);
+            } else if (state instanceof DateTimeType) {
+                return new DynamoDBStringItem(name,
+                        ZONED_DATE_TIME_CONVERTER_STRING.toString(((DateTimeType) state).getZonedDateTime()), time,
+                        expireDays);
+            } else {
+                throw new IllegalStateException(
+                        String.format("Unexpected state type %s with StringItem", state.getClass().getSimpleName()));
+            }
+        } else if (item instanceof ColorItem) { // Note: needs to be before parent class DimmerItem
+            return new DynamoDBStringItem(name, convert(state, HSBType.class).toFullString(), time, expireDays);
+        } else if (item instanceof DimmerItem) { // Note: needs to be before parent class SwitchItem
+            // Normalize ON/OFF to %
+            return new DynamoDBBigDecimalItem(name, convert(state, PercentType.class).toBigDecimal(), time, expireDays);
+        } else if (item instanceof SwitchItem) {
+            // Normalize ON/OFF to 1/0
+            return new DynamoDBBigDecimalItem(name, convert(state, DecimalType.class).toBigDecimal(), time, expireDays);
+        } else {
+            throw new IllegalArgumentException("Unsupported item " + item.getClass().getSimpleName());
+        }
+    }
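The serialization above normalizes enum-like player states to small numbers in the new schema (REWIND=-1, PAUSE=0, PLAY=1, FASTFORWARD=2), where the legacy schema stored the enum name as a string. A stdlib-only sketch of that number mapping and its inverse (class name is illustrative; the constants mirror the diff):

```java
import java.math.BigDecimal;
import java.util.Map;

public class PlayerCodecDemo {
    // Same constants as the new schema: REWIND=-1, PAUSE=0, PLAY=1, FASTFORWARD=2
    static final Map<String, BigDecimal> SERIALIZE = Map.of(
            "REWIND", new BigDecimal("-1"),
            "PAUSE", new BigDecimal("0"),
            "PLAY", new BigDecimal("1"),
            "FASTFORWARD", new BigDecimal("2"));

    // Inverse lookup, failing loudly on unknown values like the visitor does
    static String deserialize(BigDecimal n) {
        return SERIALIZE.entrySet().stream()
                .filter(e -> e.getValue().compareTo(n) == 0)
                .map(Map.Entry::getKey)
                .findFirst()
                .orElseThrow(() -> new IllegalArgumentException("Unknown serialized value"));
    }

    public static void main(String[] args) {
        System.out.println(deserialize(BigDecimal.ONE));
    }
}
```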
@Override @Override
public HistoricItem asHistoricItem(final Item item) { public @Nullable HistoricItem asHistoricItem(final Item item) {
final State[] state = new State[1]; return asHistoricItem(item, null);
accept(new DynamoDBItemVisitor() { }
@Override @Override
public void visit(DynamoDBStringItem dynamoStringItem) { public @Nullable HistoricItem asHistoricItem(final Item item, @Nullable Unit<?> targetUnit) {
if (item instanceof ColorItem) { final State deserializedState;
state[0] = new HSBType(dynamoStringItem.getState()); if (this.getState() == null) {
} else if (item instanceof LocationItem) { return null;
state[0] = new PointType(dynamoStringItem.getState()); }
} else if (item instanceof PlayerItem) { try {
String value = dynamoStringItem.getState(); deserializedState = accept(new DynamoDBItemVisitor<@Nullable State>() {
try {
state[0] = PlayPauseType.valueOf(value);
} catch (IllegalArgumentException e) {
state[0] = RewindFastforwardType.valueOf(value);
}
} else if (item instanceof DateTimeItem) {
try {
// Parse ZoneDateTime from string. DATEFORMATTER assumes UTC in case it is not clear
// from the string (should be).
// We convert to default/local timezone for user convenience (e.g. display)
state[0] = new DateTimeType(zonedDateTimeConverter.unconvert(dynamoStringItem.getState())
.withZoneSameInstant(ZoneId.systemDefault()));
} catch (DateTimeParseException e) {
logger.warn("Failed to parse {} as date. Outputting UNDEF instead",
dynamoStringItem.getState());
state[0] = UnDefType.UNDEF;
}
} else if (dynamoStringItem.getState().equals(UNDEFINED_PLACEHOLDER)) {
state[0] = UnDefType.UNDEF;
} else if (item instanceof CallItem) {
String parts = dynamoStringItem.getState();
String[] strings = parts.split(",");
String orig = strings[0];
String dest = strings[1];
state[0] = new StringListType(orig, dest);
} else {
state[0] = new StringType(dynamoStringItem.getState());
}
}
@Override @Override
public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) { public @Nullable State visit(DynamoDBStringItem dynamoStringItem) {
if (item instanceof NumberItem) { String stringState = dynamoStringItem.getState();
state[0] = new DecimalType(dynamoBigDecimalItem.getState()); if (stringState == null) {
} else if (item instanceof DimmerItem) { return null;
state[0] = new PercentType(dynamoBigDecimalItem.getState()); }
} else if (item instanceof SwitchItem) { if (item instanceof ColorItem) {
state[0] = dynamoBigDecimalItem.getState().compareTo(BigDecimal.ONE) == 0 ? OnOffType.ON return new HSBType(stringState);
: OnOffType.OFF; } else if (item instanceof LocationItem) {
} else if (item instanceof ContactItem) { return new PointType(stringState);
state[0] = dynamoBigDecimalItem.getState().compareTo(BigDecimal.ONE) == 0 ? OpenClosedType.OPEN } else if (item instanceof PlayerItem) {
: OpenClosedType.CLOSED; // Backwards-compatibility with legacy schema. New schema uses DynamoDBBigDecimalItem
} else if (item instanceof RollershutterItem) { try {
state[0] = new PercentType(dynamoBigDecimalItem.getState()); return PlayPauseType.valueOf(stringState);
} else { } catch (IllegalArgumentException e) {
logger.warn("Not sure how to convert big decimal item {} to type {}. Using StringType as fallback", return RewindFastforwardType.valueOf(stringState);
dynamoBigDecimalItem.getName(), item.getClass()); }
state[0] = new StringType(dynamoBigDecimalItem.getState().toString()); } else if (item instanceof DateTimeItem) {
try {
// Parse ZoneDateTime from string. DATEFORMATTER assumes UTC in case it is not clear
// from the string (should be).
// We convert to default/local timezone for user convenience (e.g. display)
return new DateTimeType(ZONED_DATE_TIME_CONVERTER_STRING.transformTo(stringState)
.withZoneSameInstant(ZoneId.systemDefault()));
} catch (DateTimeParseException e) {
logger.warn("Failed to parse {} as date. Outputting UNDEF instead", stringState);
return UnDefType.UNDEF;
}
} else if (item instanceof CallItem) {
String parts = stringState;
String[] strings = parts.split(",");
String orig = strings[0];
String dest = strings[1];
return new StringListType(orig, dest);
} else {
return new StringType(dynamoStringItem.getState());
}
} }
@Override
public @Nullable State visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
BigDecimal numberState = dynamoBigDecimalItem.getState();
if (numberState == null) {
return null;
}
if (item instanceof NumberItem) {
NumberItem numberItem = ((NumberItem) item);
Unit<? extends Quantity<?>> unit = targetUnit == null ? numberItem.getUnit() : targetUnit;
if (unit != null) {
return new QuantityType<>(numberState, unit);
} else {
return new DecimalType(numberState);
}
} else if (item instanceof DimmerItem) {
// % values have been stored as-is
return new PercentType(numberState);
} else if (item instanceof SwitchItem) {
return numberState.compareTo(BigDecimal.ZERO) != 0 ? OnOffType.ON : OnOffType.OFF;
} else if (item instanceof ContactItem) {
return numberState.compareTo(BigDecimal.ZERO) != 0 ? OpenClosedType.OPEN
: OpenClosedType.CLOSED;
} else if (item instanceof RollershutterItem) {
// Percents and UP/DOWN have been stored % values (not fractional)
return new PercentType(numberState);
} else if (item instanceof PlayerItem) {
if (numberState.equals(PLAY_BIGDECIMAL)) {
return PlayPauseType.PLAY;
} else if (numberState.equals(PAUSE_BIGDECIMAL)) {
return PlayPauseType.PAUSE;
} else if (numberState.equals(FAST_FORWARD_BIGDECIMAL)) {
return RewindFastforwardType.FASTFORWARD;
} else if (numberState.equals(REWIND_BIGDECIMAL)) {
return RewindFastforwardType.REWIND;
} else {
throw new IllegalArgumentException("Unknown serialized value");
}
} else {
logger.warn(
"Not sure how to convert big decimal item {} to type {}. Using StringType as fallback",
dynamoBigDecimalItem.getName(), item.getClass());
return new StringType(numberState.toString());
}
}
});
        if (deserializedState == null) {
            return null;
        }
        return new DynamoDBHistoricItem(getName(), deserializedState, getTime());
    } catch (Exception e) {
        logger.trace("Failed to convert state '{}' to item {} {}: {} {}. Data persisted with incompatible item.",
                this.state, item.getClass().getSimpleName(), item.getName(), e.getClass().getSimpleName(),
                e.getMessage());
        return null;
    }
}
    /**
@@ -232,10 +474,53 @@ public abstract class AbstractDynamoDBItem<T> implements DynamoDBItem<T> {
     * DynamoItemVisitor)
     */
    @Override
    public abstract <R> R accept(DynamoDBItemVisitor<R> visitor);

    @Override
    public String toString() {
        @Nullable
        T localState = state;
        return DATEFORMATTER.format(time) + ": " + name + " -> "
                + (localState == null ? "<null>" : localState.toString());
    }
@Override
public String getName() {
return name;
}
@Override
public void setName(String name) {
this.name = name;
}
@Override
public ZonedDateTime getTime() {
return time;
}
@Override
@Nullable
public Long getExpiryDate() {
return expiry;
}
@Override
public void setTime(ZonedDateTime time) {
this.time = time;
}
@Override
public @Nullable Integer getExpireDays() {
return expireDays;
}
@Override
public void setExpireDays(@Nullable Integer expireDays) {
this.expireDays = expireDays;
}
public void setExpiry(@Nullable Long expiry) {
this.expiry = expiry;
    }
}
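The `toString()` change above copies the `@Nullable` field into a local variable before the null check. A self-contained sketch of that pattern (class and field names here are illustrative, not from the add-on):

```java
// Sketch of the local-copy pattern: reading a nullable field once into a
// local makes the null check race-free under concurrent mutation and keeps
// static null analysis happy.
public class LocalCopyDemo {
    private volatile String state; // may be null

    LocalCopyDemo(String state) {
        this.state = state;
    }

    @Override
    public String toString() {
        String localState = state; // single read; cannot change under us
        return "state -> " + (localState == null ? "<null>" : localState);
    }

    public static void main(String[] args) {
        System.out.println(new LocalCopyDemo(null)); // state -> <null>
        System.out.println(new LocalCopyDemo("ON")); // state -> ON
    }
}
```

Without the local copy, a second read of the field between the null check and `toString()` could observe a different value.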


@@ -16,20 +16,37 @@
import java.math.BigDecimal;
import java.math.MathContext;
import java.time.ZonedDateTime;

import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;

import software.amazon.awssdk.enhanced.dynamodb.mapper.StaticTableSchema;
/**
 * DynamoDBItem for items that can be serialized as DynamoDB number
 *
 * @author Sami Salonen - Initial contribution
 */
@NonNullByDefault
public class DynamoDBBigDecimalItem extends AbstractDynamoDBItem<BigDecimal> {
private static Class<@Nullable BigDecimal> NULLABLE_BIGDECIMAL = (Class<@Nullable BigDecimal>) BigDecimal.class;
public static StaticTableSchema<DynamoDBBigDecimalItem> TABLE_SCHEMA_LEGACY = getBaseSchemaBuilder(
DynamoDBBigDecimalItem.class, true).newItemSupplier(DynamoDBBigDecimalItem::new)
.addAttribute(NULLABLE_BIGDECIMAL, a -> a.name(ATTRIBUTE_NAME_ITEMSTATE_LEGACY)
.getter(DynamoDBBigDecimalItem::getState).setter(DynamoDBBigDecimalItem::setState))
.build();
public static StaticTableSchema<DynamoDBBigDecimalItem> TABLE_SCHEMA_NEW = getBaseSchemaBuilder(
DynamoDBBigDecimalItem.class, false)
.newItemSupplier(DynamoDBBigDecimalItem::new)
.addAttribute(NULLABLE_BIGDECIMAL,
a -> a.name(ATTRIBUTE_NAME_ITEMSTATE_NUMBER).getter(DynamoDBBigDecimalItem::getState)
.setter(DynamoDBBigDecimalItem::setState))
.addAttribute(NULLABLE_LONG, a -> a.name(ATTRIBUTE_NAME_EXPIRY)
.getter(AbstractDynamoDBItem::getExpiryDate).setter(AbstractDynamoDBItem::setExpiry))
.build();
    /**
     * We get the following error if the BigDecimal has too many digits
     * "Attempting to store more than 38 significant digits in a Number"
@@ -40,58 +57,36 @@ public class DynamoDBBigDecimalItem extends AbstractDynamoDBItem<BigDecimal> {
     */
    private static final int MAX_DIGITS_SUPPORTED_BY_AMAZON = 38;
    public DynamoDBBigDecimalItem() {
        this("", null, ZonedDateTime.now(), null);
    }

    public DynamoDBBigDecimalItem(String name, @Nullable BigDecimal state, ZonedDateTime time,
            @Nullable Integer expireDays) {
        super(name, state, time, expireDays);
    }

    @Override
    public @Nullable BigDecimal getState() {
        // When serializing this to the wire, we round the number in order to ensure
        // that it is within the dynamodb limits
        BigDecimal localState = state;
        if (localState == null) {
            return null;
        }
        return loseDigits(localState);
    }

    @Override
    public void setState(@Nullable BigDecimal state) {
        this.state = state;
    }

    @Override
    public <T> T accept(DynamoDBItemVisitor<T> visitor) {
        return visitor.visit(this);
    }

    static BigDecimal loseDigits(BigDecimal number) {
        return number.round(new MathContext(MAX_DIGITS_SUPPORTED_BY_AMAZON));
    }
}
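The `loseDigits()` rounding can be exercised standalone. This sketch reuses the same `MathContext` logic as above; the demo class name is illustrative:

```java
import java.math.BigDecimal;
import java.math.MathContext;

// Standalone sketch of the rounding performed before serialization:
// DynamoDB numbers support at most 38 significant digits, so values with
// higher precision are rounded to fit.
public class LoseDigitsDemo {
    static final int MAX_DIGITS_SUPPORTED_BY_AMAZON = 38;

    static BigDecimal loseDigits(BigDecimal number) {
        return number.round(new MathContext(MAX_DIGITS_SUPPORTED_BY_AMAZON));
    }

    public static void main(String[] args) {
        // 45 significant digits -> rounded down to 38
        BigDecimal tooPrecise = new BigDecimal("1.23456789012345678901234567890123456789012345");
        BigDecimal rounded = loseDigits(tooPrecise);
        System.out.println(rounded.precision()); // 38
    }
}
```

Note that `round` with a `MathContext` limits significant digits, not decimal places, which matches DynamoDB's number constraint.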


@@ -1,66 +0,0 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.document.DynamoDB;
/**
* Shallow wrapper for Dynamo DB wrappers
*
* @author Sami Salonen - Initial contribution
*/
public class DynamoDBClient {
private final Logger logger = LoggerFactory.getLogger(DynamoDBClient.class);
private DynamoDB dynamo;
private AmazonDynamoDB client;
public DynamoDBClient(AWSCredentials credentials, Regions region) {
client = AmazonDynamoDBClientBuilder.standard().withRegion(region)
.withCredentials(new AWSStaticCredentialsProvider(credentials)).build();
dynamo = new DynamoDB(client);
}
public DynamoDBClient(DynamoDBConfig clientConfig) {
this(clientConfig.getCredentials(), clientConfig.getRegion());
}
public AmazonDynamoDB getDynamoClient() {
return client;
}
public DynamoDB getDynamoDB() {
return dynamo;
}
public void shutdown() {
dynamo.shutdown();
}
public boolean checkConnection() {
try {
dynamo.listTables(1).firstPage();
} catch (Exception e) {
logger.warn("Got internal server error when trying to list tables: {}", e.getMessage());
return false;
}
return true;
}
}


@@ -12,8 +12,9 @@
 */
package org.openhab.persistence.dynamodb.internal;

import java.nio.file.Path;
import java.util.Map;
import java.util.Optional;
import java.util.stream.Collectors;

import org.eclipse.jdt.annotation.NonNullByDefault;
@@ -21,35 +22,45 @@ import org.eclipse.jdt.annotation.Nullable;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.AwsCredentials;
import software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider;
import software.amazon.awssdk.awscore.retry.AwsRetryPolicy;
import software.amazon.awssdk.core.retry.RetryMode;
import software.amazon.awssdk.core.retry.RetryPolicy;
import software.amazon.awssdk.profiles.ProfileFile;
import software.amazon.awssdk.profiles.ProfileFile.Type;
import software.amazon.awssdk.profiles.ProfileProperty;
import software.amazon.awssdk.regions.Region;
/**
 * Configuration for DynamoDB connections
 *
 * If the 'table' parameter is specified and is not blank, we use the new table schema (ExpectedTableSchema.NEW).
 * If the 'tablePrefix' parameter is specified and is not blank, we use the legacy table schema (ExpectedTableSchema.LEGACY).
 * Other cases conservatively set ExpectedTableSchema.MAYBE_LEGACY, detecting the right schema at runtime.
 *
 * @author Sami Salonen - Initial contribution
 */
@NonNullByDefault
public class DynamoDBConfig {

    public static final String DEFAULT_TABLE_PREFIX = "openhab-";
    public static final String DEFAULT_TABLE_NAME = "openhab";
    public static final long DEFAULT_READ_CAPACITY_UNITS = 1;
    public static final long DEFAULT_WRITE_CAPACITY_UNITS = 1;
    public static final RetryMode DEFAULT_RETRY_MODE = RetryMode.STANDARD;

    private static final Logger LOGGER = LoggerFactory.getLogger(DynamoDBConfig.class);

    private long readCapacityUnits;
    private long writeCapacityUnits;
    private Region region;
    private AwsCredentials credentials;
    private RetryPolicy retryPolicy;
    private ExpectedTableSchema tableRevision;
    private String table;
    private String tablePrefixLegacy;
    private @Nullable Integer expireDays;
    /**
     *
@@ -57,26 +68,26 @@ public class DynamoDBConfig {
     * @return DynamoDB configuration. Returns null in case of configuration errors
     */
    public static @Nullable DynamoDBConfig fromConfig(Map<String, Object> config) {
        ExpectedTableSchema tableRevision;
        try {
            String regionName = (String) config.get("region");
            if (regionName == null) {
                return null;
            }
            final Region region;
            if (Region.regions().stream().noneMatch(r -> r.toString().equals(regionName))) {
                LOGGER.warn("Region {} is not matching known regions: {}. The region might not be supported.",
                        regionName, Region.regions().stream().map(r -> r.toString()).collect(Collectors.joining(", ")));
            }
            region = Region.of(regionName);

            RetryMode retryMode = RetryMode.STANDARD;
            AwsCredentials credentials;
            String accessKey = (String) config.get("accessKey");
            String secretKey = (String) config.get("secretKey");
            if (accessKey != null && !accessKey.isBlank() && secretKey != null && !secretKey.isBlank()) {
                LOGGER.debug("accessKey and secretKey specified. Using those.");
                credentials = AwsBasicCredentials.create(accessKey, secretKey);
            } else {
                LOGGER.debug("accessKey and/or secretKey blank. Checking profilesConfigFile and profile.");
                String profilesConfigFile = (String) config.get("profilesConfigFile");
@@ -87,28 +98,49 @@ public class DynamoDBConfig {
                            + "profile for providing AWS credentials");
                    return null;
                }
                ProfileFile profileFile = ProfileFile.builder().content(Path.of(profilesConfigFile))
                        .type(Type.CREDENTIALS).build();
                credentials = ProfileCredentialsProvider.builder().profileFile(profileFile).profileName(profile)
                        .build().resolveCredentials();
                retryMode = profileFile.profile(profile).flatMap(p -> p.property(ProfileProperty.RETRY_MODE))
                        .flatMap(retry_mode -> {
                            for (RetryMode value : RetryMode.values()) {
                                if (retry_mode.equalsIgnoreCase(value.name())) {
                                    return Optional.of(value);
                                }
                            }
                            LOGGER.warn("Unknown retry_mode '{}' in profile. Ignoring and using default {} retry mode.",
                                    retry_mode, DEFAULT_RETRY_MODE);
                            return Optional.empty();
                        }).orElse(DEFAULT_RETRY_MODE);
                LOGGER.debug("Retry mode {}", retryMode);
            }
            String table = (String) config.get("table");
            String tablePrefixLegacy;
            if (table == null || table.isBlank()) {
                // the new parameter 'table' has not been set. Check whether the legacy parameter 'tablePrefix' is set
                table = DEFAULT_TABLE_NAME;
                tablePrefixLegacy = (String) config.get("tablePrefix");
                if (tablePrefixLegacy == null || tablePrefixLegacy.isBlank()) {
                    LOGGER.debug("Using default table prefix {}", DEFAULT_TABLE_PREFIX);
                    // No explicit value has been specified for tablePrefix, user could be still using the legacy setup
                    tableRevision = ExpectedTableSchema.MAYBE_LEGACY;
                    tablePrefixLegacy = DEFAULT_TABLE_PREFIX;
                } else {
                    // Explicit value for tablePrefix, user certainly prefers LEGACY
                    tableRevision = ExpectedTableSchema.LEGACY;
                }
            } else {
                tableRevision = ExpectedTableSchema.NEW;
                tablePrefixLegacy = DEFAULT_TABLE_PREFIX;
            }

            final long readCapacityUnits;
            String readCapacityUnitsParam = (String) config.get("readCapacityUnits");
            if (readCapacityUnitsParam == null || readCapacityUnitsParam.isBlank()) {
                readCapacityUnits = DEFAULT_READ_CAPACITY_UNITS;
            } else {
                readCapacityUnits = Long.parseLong(readCapacityUnitsParam);
@@ -117,66 +149,100 @@ public class DynamoDBConfig {
            }
            final long writeCapacityUnits;
            String writeCapacityUnitsParam = (String) config.get("writeCapacityUnits");
            if (writeCapacityUnitsParam == null || writeCapacityUnitsParam.isBlank()) {
                writeCapacityUnits = DEFAULT_WRITE_CAPACITY_UNITS;
            } else {
                writeCapacityUnits = Long.parseLong(writeCapacityUnitsParam);
            }

            final @Nullable Integer expireDays;
            String expireDaysString = (String) config.get("expireDays");
            if (expireDaysString == null || expireDaysString.isBlank()) {
                expireDays = null;
            } else {
                expireDays = Integer.parseInt(expireDaysString);
                if (expireDays <= 0) {
                    LOGGER.error("expireDays should be a positive integer or null");
                    return null;
                }
            }

            switch (tableRevision) {
                case NEW:
                    LOGGER.debug("Using new DynamoDB table schema");
                    return DynamoDBConfig.newSchema(region, credentials, AwsRetryPolicy.forRetryMode(retryMode), table,
                            readCapacityUnits, writeCapacityUnits, expireDays);
                case LEGACY:
                    LOGGER.warn(
                            "Using legacy DynamoDB table schema. It is recommended to transition to the new schema by defining the 'table' parameter and not configuring 'tablePrefix'");
                    return DynamoDBConfig.legacySchema(region, credentials, AwsRetryPolicy.forRetryMode(retryMode),
                            tablePrefixLegacy, readCapacityUnits, writeCapacityUnits);
                case MAYBE_LEGACY:
                    LOGGER.debug(
                            "Unclear whether we should use the new or the legacy DynamoDB table schema. It is recommended to explicitly define the new 'table' parameter. The correct table schema will be detected at runtime.");
                    return DynamoDBConfig.maybeLegacySchema(region, credentials, AwsRetryPolicy.forRetryMode(retryMode),
                            table, tablePrefixLegacy, readCapacityUnits, writeCapacityUnits, expireDays);
                default:
                    throw new IllegalStateException("Unhandled enum. Bug");
            }
        } catch (Exception e) {
            LOGGER.error("Error with configuration: {} {}", e.getClass().getSimpleName(), e.getMessage());
            return null;
        }
    }
    private static DynamoDBConfig newSchema(Region region, AwsCredentials credentials, RetryPolicy retryPolicy,
            String table, long readCapacityUnits, long writeCapacityUnits, @Nullable Integer expireDays) {
        return new DynamoDBConfig(region, credentials, retryPolicy, table, "", ExpectedTableSchema.NEW,
                readCapacityUnits, writeCapacityUnits, expireDays);
    }

    private static DynamoDBConfig legacySchema(Region region, AwsCredentials credentials, RetryPolicy retryPolicy,
            String tablePrefixLegacy, long readCapacityUnits, long writeCapacityUnits) {
        return new DynamoDBConfig(region, credentials, retryPolicy, "", tablePrefixLegacy, ExpectedTableSchema.LEGACY,
                readCapacityUnits, writeCapacityUnits, null);
    }

    private static DynamoDBConfig maybeLegacySchema(Region region, AwsCredentials credentials, RetryPolicy retryPolicy,
            String table, String tablePrefixLegacy, long readCapacityUnits, long writeCapacityUnits,
            @Nullable Integer expireDays) {
        return new DynamoDBConfig(region, credentials, retryPolicy, table, tablePrefixLegacy,
                ExpectedTableSchema.MAYBE_LEGACY, readCapacityUnits, writeCapacityUnits, expireDays);
    }

    private DynamoDBConfig(Region region, AwsCredentials credentials, RetryPolicy retryPolicy, String table,
            String tablePrefixLegacy, ExpectedTableSchema tableRevision, long readCapacityUnits,
            long writeCapacityUnits, @Nullable Integer expireDays) {
        this.region = region;
        this.credentials = credentials;
        this.retryPolicy = retryPolicy;
        this.table = table;
        this.tablePrefixLegacy = tablePrefixLegacy;
        this.tableRevision = tableRevision;
        this.readCapacityUnits = readCapacityUnits;
        this.writeCapacityUnits = writeCapacityUnits;
        this.expireDays = expireDays;
    }
public AwsCredentials getCredentials() {
        return credentials;
    }

    public String getTablePrefixLegacy() {
        return tablePrefixLegacy;
    }

    public String getTable() {
        return table;
    }

    public ExpectedTableSchema getTableRevision() {
        return tableRevision;
    }

    public Region getRegion() {
        return region;
    }

    public long getReadCapacityUnits() {
        return readCapacityUnits;
    }

@@ -185,11 +251,11 @@ public class DynamoDBConfig {
        return writeCapacityUnits;
    }

    public RetryPolicy getRetryPolicy() {
        return retryPolicy;
    }

    public @Nullable Integer getExpireDays() {
        return expireDays;
    }
}
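The schema-resolution rule described in the class javadoc can be sketched in isolation. This mirrors the decision tree in `fromConfig`; the enum and method names below are illustrative stand-ins for the add-on's own `ExpectedTableSchema` handling:

```java
import java.util.Map;

// Minimal sketch of the table-schema resolution rule:
// 'table' set -> NEW, only 'tablePrefix' set -> LEGACY,
// neither -> MAYBE_LEGACY (the real schema is detected at runtime).
public class SchemaResolutionDemo {
    enum ExpectedTableSchema {
        NEW,
        LEGACY,
        MAYBE_LEGACY
    }

    static ExpectedTableSchema resolve(Map<String, Object> config) {
        String table = (String) config.get("table");
        if (table != null && !table.isBlank()) {
            return ExpectedTableSchema.NEW;
        }
        String tablePrefix = (String) config.get("tablePrefix");
        if (tablePrefix != null && !tablePrefix.isBlank()) {
            return ExpectedTableSchema.LEGACY;
        }
        return ExpectedTableSchema.MAYBE_LEGACY;
    }

    public static void main(String[] args) {
        System.out.println(resolve(Map.of("table", "openhab"))); // NEW
        System.out.println(resolve(Map.of("tablePrefix", "openhab-"))); // LEGACY
        System.out.println(resolve(Map.of())); // MAYBE_LEGACY
    }
}
```

Note that an explicit `table` wins even if `tablePrefix` is also present, which is why the PR recommends setting `table` to opt into the new schema.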


@@ -14,6 +14,10 @@ package org.openhab.persistence.dynamodb.internal;

import java.time.ZonedDateTime;

import javax.measure.Unit;

import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;

import org.openhab.core.items.Item;
import org.openhab.core.persistence.HistoricItem;
@@ -24,35 +28,127 @@ import org.openhab.core.persistence.HistoricItem;
 *
 * @author Sami Salonen - Initial contribution
 */
@NonNullByDefault
public interface DynamoDBItem<T> {

    static final String DATE_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";

    static final String ATTRIBUTE_NAME_TIMEUTC_LEGACY = "timeutc";
    static final String ATTRIBUTE_NAME_ITEMNAME_LEGACY = "itemname";
    static final String ATTRIBUTE_NAME_ITEMSTATE_LEGACY = "itemstate";

    static final String ATTRIBUTE_NAME_TIMEUTC = "t";
    static final String ATTRIBUTE_NAME_ITEMNAME = "i";
    static final String ATTRIBUTE_NAME_ITEMSTATE_STRING = "s";
    static final String ATTRIBUTE_NAME_ITEMSTATE_NUMBER = "n";
    static final String ATTRIBUTE_NAME_EXPIRY = "exp";
    /**
     * Convert this AbstractDynamoItem as HistoricItem, i.e. converting serialized state back to openHAB state.
     *
     * Returns null when this instance has null state.
     *
     * If item is NumberItem and has a unit, the data is converted to QuantityType with item.getUnit().
     *
     * @param item Item representing this item. Used to determine item type.
     * @return HistoricItem representing this DynamoDBItem.
     */
    @Nullable
    HistoricItem asHistoricItem(Item item);
/**
* Convert this AbstractDynamoItem as HistoricItem.
*
* Returns null when this instance has null state.
* The implementation can deal with legacy schema as well.
*
     * Use this method when repeated calls are expected for the same item (avoids the expensive call to item.getUnit())
*
* @param item Item representing this item. Used to determine item type.
* @param targetUnit unit to convert the data if item is with Dimension. Has only effect with NumberItems and with
* numeric DynamoDBItems.
* @return HistoricItem representing this DynamoDBItem.
*/
@Nullable
HistoricItem asHistoricItem(Item item, @Nullable Unit<?> targetUnit);
/**
* Get item name
*
* @return item name
*/
    String getName();
/**
* Get item state, in the serialized format
*
* @return item state as serialized format
*/
@Nullable
    T getState();
/**
* Get timestamp of this value
*
* @return timestamp
*/
    ZonedDateTime getTime();
/**
* Get expire time for the DynamoDB item in days.
*
* Does not have any effect with legacy schema.
*
* Also known as time-to-live or TTL.
     * Null means that expiration is disabled
*
* @return expire time in days
*/
@Nullable
Integer getExpireDays();
/**
* Get expiry date for the DynamoDB item in epoch seconds
*
* This is used with DynamoDB Time to Live TTL feature.
*
* @return expiry date of the data. Equivalent to getTime() + getExpireDays() or null when expireDays is null.
*/
@Nullable
Long getExpiryDate();
/**
* Setter for item name
*
* @param name item name
*/
    void setName(String name);

    /**
* Setter for serialized state
*
* @param state serialized state
*/
void setState(@Nullable T state);
/**
* Set timestamp of the data
*
* @param time timestamp
*/
    void setTime(ZonedDateTime time);

    /**
* Set expire time for the DynamoDB item in days.
*
* Does not have any effect with legacy schema.
*
* Also known as time-to-live or TTL.
* Use null to disable expiration
*
* @param expireDays expire time in days. Should be positive or null.
*
*/
void setExpireDays(@Nullable Integer expireDays);
<R> R accept(DynamoDBItemVisitor<R> visitor);
}
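The `getExpiryDate()` contract above (timestamp plus `expireDays`, expressed as epoch seconds for DynamoDB's TTL attribute) can be sketched as follows. The method name here is illustrative, not the add-on's exact implementation:

```java
import java.time.ZonedDateTime;

// Sketch of how the TTL attribute ("exp") can be derived per the
// getExpiryDate() contract: getTime() plus expireDays, as epoch seconds.
// DynamoDB's Time to Live feature expects the expiry as a Number holding
// seconds since the Unix epoch.
public class ExpiryDemo {
    static Long expiryEpochSeconds(ZonedDateTime time, Integer expireDays) {
        if (expireDays == null) {
            return null; // TTL disabled
        }
        return time.plusDays(expireDays).toEpochSecond();
    }

    public static void main(String[] args) {
        ZonedDateTime t = ZonedDateTime.parse("2021-01-01T00:00:00Z");
        System.out.println(expiryEpochSeconds(t, 1)); // one day later, in epoch seconds
        System.out.println(expiryEpochSeconds(t, null)); // null: no expiry
    }
}
```

Returning null when `expireDays` is null matches the interface's "null disables expiration" semantics, and DynamoDB simply never deletes items whose TTL attribute is absent.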


@@ -21,9 +21,9 @@ import org.eclipse.jdt.annotation.NonNullByDefault;
 *
 */
@NonNullByDefault
public interface DynamoDBItemVisitor<T> {

    public T visit(DynamoDBBigDecimalItem dynamoBigDecimalItem);

    public T visit(DynamoDBStringItem dynamoStringItem);
}
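The visitor change above turns `accept()` from a void callback into a generic method that returns the visitor's result, so deserialization no longer needs a mutable side-channel (the old `state[0]` array). A self-contained sketch of that refactor, with illustrative names rather than the add-on's classes:

```java
// Sketch of the void-to-generic visitor refactor: accept() returns the
// visitor's result directly, so callers can use it as an expression.
public class VisitorDemo {
    interface Visitor<T> {
        T visitNumber(double value);

        T visitString(String value);
    }

    interface Node {
        <T> T accept(Visitor<T> visitor);
    }

    // A visitor that renders any node as a display string
    static final Visitor<String> RENDERER = new Visitor<String>() {
        @Override
        public String visitNumber(double value) {
            return "number:" + value;
        }

        @Override
        public String visitString(String value) {
            return "string:" + value;
        }
    };

    static Node number(double value) {
        return new Node() {
            @Override
            public <T> T accept(Visitor<T> visitor) {
                return visitor.visitNumber(value);
            }
        };
    }

    public static void main(String[] args) {
        System.out.println(number(3.5).accept(RENDERER)); // number:3.5
    }
}
```

With a generic return type, the visit methods can also return null-checked results (as the `@Nullable State` visitor in this PR does) instead of writing into a captured array.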


@@ -12,30 +12,37 @@
 */
package org.openhab.persistence.dynamodb.internal;

import java.lang.reflect.InvocationTargetException;
import java.net.URI;
import java.time.Duration;
import java.time.Instant;
import java.time.ZonedDateTime;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.stream.Collectors;

import javax.measure.Unit;

import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.openhab.core.common.ThreadPoolManager;
import org.openhab.core.config.core.ConfigurableService;
import org.openhab.core.items.GenericItem;
import org.openhab.core.items.GroupItem;
import org.openhab.core.items.Item;
import org.openhab.core.items.ItemNotFoundException;
import org.openhab.core.items.ItemRegistry;
import org.openhab.core.library.items.NumberItem;
import org.openhab.core.library.types.QuantityType;
import org.openhab.core.persistence.FilterCriteria;
import org.openhab.core.persistence.HistoricItem;
import org.openhab.core.persistence.PersistenceItemInfo;
@@ -43,31 +50,31 @@
import org.openhab.core.persistence.PersistenceService;
import org.openhab.core.persistence.QueryablePersistenceService;
import org.openhab.core.persistence.strategy.PersistenceStrategy;
import org.openhab.core.types.State;
import org.openhab.core.types.UnDefType;
import org.osgi.framework.BundleContext;
import org.osgi.framework.Constants;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Deactivate;
import org.osgi.service.component.annotations.Reference;
import org.reactivestreams.Subscriber;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.awscore.AwsRequestOverrideConfiguration;
import software.amazon.awssdk.core.async.SdkPublisher;
import software.amazon.awssdk.core.client.config.ClientAsyncConfiguration;
import software.amazon.awssdk.core.client.config.ClientOverrideConfiguration;
import software.amazon.awssdk.core.client.config.SdkAdvancedAsyncClientOption;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbAsyncTable;
import com.amazonaws.services.dynamodbv2.datamodeling.PaginatedQueryList; import software.amazon.awssdk.enhanced.dynamodb.DynamoDbEnhancedAsyncClient;
import com.amazonaws.services.dynamodbv2.document.BatchWriteItemOutcome; import software.amazon.awssdk.enhanced.dynamodb.TableSchema;
import com.amazonaws.services.dynamodbv2.model.CreateTableRequest; import software.amazon.awssdk.enhanced.dynamodb.model.QueryEnhancedRequest;
import com.amazonaws.services.dynamodbv2.model.GlobalSecondaryIndex; import software.amazon.awssdk.http.nio.netty.NettyNioAsyncHttpClient;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput; import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException; import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClientBuilder;
import com.amazonaws.services.dynamodbv2.model.TableDescription; import software.amazon.awssdk.services.dynamodb.model.ResourceNotFoundException;
import com.amazonaws.services.dynamodbv2.model.TableStatus;
import com.amazonaws.services.dynamodbv2.model.WriteRequest;
/**
 * This is the implementation of the DynamoDB {@link PersistenceService}. It persists item values
        QueryablePersistenceService.class }, configurationPid = "org.openhab.dynamodb", //
        property = Constants.SERVICE_PID + "=org.openhab.dynamodb")
@ConfigurableService(category = "persistence", label = "DynamoDB Persistence Service", description_uri = DynamoDBPersistenceService.CONFIG_URI)
public class DynamoDBPersistenceService implements QueryablePersistenceService {

    private static final int MAX_CONCURRENCY = 100;

    protected static final String CONFIG_URI = "persistence:dynamodb";
    private static final String DYNAMODB_THREADPOOL_NAME = "dynamodbPersistenceService";
    private static final Duration TIMEOUT_API_CALL = Duration.ofSeconds(60);
    private static final Duration TIMEOUT_API_CALL_ATTEMPT = Duration.ofSeconds(5);
    private static final Logger logger = LoggerFactory.getLogger(DynamoDBPersistenceService.class);

    private ItemRegistry itemRegistry;
    private @Nullable DynamoDbEnhancedAsyncClient client;
    private @Nullable DynamoDbAsyncClient lowLevelClient;
    private boolean isProperlyConfigured;
    private @Nullable DynamoDBConfig dbConfig;
    private @Nullable DynamoDBTableNameResolver tableNameResolver;
    private final ExecutorService executor = ThreadPoolManager.getPool(DYNAMODB_THREADPOOL_NAME);
    private Map<Class<? extends DynamoDBItem<?>>, DynamoDbAsyncTable<? extends DynamoDBItem<?>>> tableCache = new ConcurrentHashMap<>(
            2);
    private @Nullable URI endpointOverride;
    void overrideConfig(AwsRequestOverrideConfiguration.Builder config) {
        config.apiCallAttemptTimeout(TIMEOUT_API_CALL_ATTEMPT).apiCallTimeout(TIMEOUT_API_CALL);
    }

    void overrideConfig(ClientOverrideConfiguration.Builder config) {
        DynamoDBConfig localDbConfig = dbConfig;
        config.apiCallAttemptTimeout(TIMEOUT_API_CALL_ATTEMPT).apiCallTimeout(TIMEOUT_API_CALL);
        if (localDbConfig != null) {
            config.retryPolicy(localDbConfig.getRetryPolicy());
        }
    }
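    // A note on the two timeouts configured above (general AWS SDK v2 semantics, sketched here for
    // the reader, not openHAB-specific behavior): apiCallAttemptTimeout bounds a single HTTP attempt,
    // while apiCallTimeout bounds the whole API call including the SDK's internal retries. With the
    // values above, any single slow attempt is abandoned after 5 seconds, and the overall call
    // (across however many retries the retry policy allows) gives up after 60 seconds at the latest.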
    @Activate
    public DynamoDBPersistenceService(final @Reference ItemRegistry itemRegistry) {
        this.itemRegistry = itemRegistry;
    }
    /**
     * For tests
     */
    DynamoDBPersistenceService(final ItemRegistry itemRegistry, @Nullable URI endpointOverride) {
        this.itemRegistry = itemRegistry;
        this.endpointOverride = endpointOverride;
    }

    /**
     * For tests
     */
    @Nullable
    URI getEndpointOverride() {
        return endpointOverride;
    }

    @Nullable
    DynamoDbAsyncClient getLowLevelClient() {
        return lowLevelClient;
    }

    ExecutorService getExecutor() {
        return executor;
    }

    @Nullable
    DynamoDBTableNameResolver getTableNameResolver() {
        return tableNameResolver;
    }

    @Nullable
    DynamoDBConfig getDbConfig() {
        return dbConfig;
    }
    @Activate
    public void activate(final @Nullable BundleContext bundleContext, final Map<String, Object> config) {
        disconnect();
        DynamoDBConfig localDbConfig = dbConfig = DynamoDBConfig.fromConfig(config);
        if (localDbConfig == null) {
            // Configuration was invalid. Abort service activation.
            // Error is already logged in fromConfig.
            return;
        }
        tableNameResolver = new DynamoDBTableNameResolver(localDbConfig.getTableRevision(), localDbConfig.getTable(),
                localDbConfig.getTablePrefixLegacy());
        try {
            if (!ensureClient()) {
                logger.error("Error creating dynamodb database client. Aborting service activation.");
                return;
            }
        isProperlyConfigured = true;
        logger.debug("dynamodb persistence service activated");
    }
    @Deactivate
    public void deactivate() {
        logger.debug("dynamodb persistence service deactivated");
        logIfManyQueuedTasks();
        disconnect();
    }
    /**
     * Initializes the DynamoDB client and determines the table schema.
     *
     * If construction of the client fails, the error is logged and false is returned.
     *
     * @return whether initialization was successful.
     */
    private boolean ensureClient() {
        DynamoDBConfig localDbConfig = dbConfig;
        if (localDbConfig == null) {
            return false;
        }
        if (client == null) {
            try {
                synchronized (this) {
                    if (this.client != null) {
                        return true;
                    }
                    DynamoDbAsyncClientBuilder lowlevelClientBuilder = DynamoDbAsyncClient.builder()
                            .credentialsProvider(StaticCredentialsProvider.create(localDbConfig.getCredentials()))
                            .httpClient(NettyNioAsyncHttpClient.builder().maxConcurrency(MAX_CONCURRENCY).build())
                            .asyncConfiguration(ClientAsyncConfiguration.builder()
                                    .advancedOption(SdkAdvancedAsyncClientOption.FUTURE_COMPLETION_EXECUTOR, executor)
                                    .build())
                            .overrideConfiguration(this::overrideConfig).region(localDbConfig.getRegion());
                    if (endpointOverride != null) {
                        logger.debug("DynamoDB endpoint has been overridden to {}", endpointOverride);
                        lowlevelClientBuilder.endpointOverride(endpointOverride);
                    }
                    DynamoDbAsyncClient lowlevelClient = lowlevelClientBuilder.build();
                    client = DynamoDbEnhancedAsyncClient.builder().dynamoDbClient(lowlevelClient).build();
                    this.lowLevelClient = lowlevelClient;
                }
            } catch (Exception e) {
                logger.error("Error constructing dynamodb client", e);
                return false;
            }
        }
        return true;
    }
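    // The synchronized block in ensureClient() above is a double-checked locking pattern: the
    // unsynchronized 'client == null' check keeps the common already-initialized path cheap, while
    // the re-check of 'this.client' inside the lock guarantees that concurrent callers do not each
    // build (and leak) their own Netty HTTP client and DynamoDB client instance.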
    private CompletableFuture<Boolean> resolveTableSchema() {
        DynamoDBTableNameResolver localTableNameResolver = tableNameResolver;
        DynamoDbAsyncClient localLowLevelClient = lowLevelClient;
        if (localTableNameResolver == null || localLowLevelClient == null) {
            throw new IllegalStateException("tableNameResolver or localLowLevelClient not available");
        }
        if (localTableNameResolver.isFullyResolved()) {
            return CompletableFuture.completedFuture(true);
        } else {
            synchronized (localTableNameResolver) {
                if (localTableNameResolver.isFullyResolved()) {
                    return CompletableFuture.completedFuture(true);
                }
                return localTableNameResolver.resolveSchema(localLowLevelClient,
                        b -> b.overrideConfiguration(this::overrideConfig), executor).thenApplyAsync(resolved -> {
                            if (resolved && localTableNameResolver.getTableSchema() == ExpectedTableSchema.LEGACY) {
                                logger.warn(
                                        "Using legacy table format. It is recommended to migrate to the new table format: specify the 'table' parameter and unset the old 'tablePrefix' parameter.");
                            }
                            return resolved;
                        }, executor);
            }
        }
    }
    private <T extends DynamoDBItem<?>> DynamoDbAsyncTable<T> getTable(Class<T> dtoClass) {
        DynamoDbEnhancedAsyncClient localClient = client;
        DynamoDBTableNameResolver localTableNameResolver = tableNameResolver;
        if (!ensureClient() || localClient == null || localTableNameResolver == null) {
            throw new IllegalStateException("Client not ready");
        }
        ExpectedTableSchema expectedTableSchemaRevision = localTableNameResolver.getTableSchema();
        String tableName = localTableNameResolver.fromClass(dtoClass);
        final TableSchema<T> schema = getDynamoDBTableSchema(dtoClass, expectedTableSchemaRevision);
        @SuppressWarnings("unchecked") // OK since this is the only place tableCache is populated
        DynamoDbAsyncTable<T> table = (DynamoDbAsyncTable<T>) tableCache.computeIfAbsent(dtoClass, clz -> {
            return localClient.table(tableName, schema);
        });
        if (table == null) {
            // Invariant. To make the null checker happy
            throw new IllegalStateException();
        }
        return table;
    }

    private static <T extends DynamoDBItem<?>> TableSchema<T> getDynamoDBTableSchema(Class<T> dtoClass,
            ExpectedTableSchema expectedTableSchemaRevision) {
        if (dtoClass.equals(DynamoDBBigDecimalItem.class)) {
            @SuppressWarnings("unchecked") // OK thanks to the conditional above
            TableSchema<T> schema = (TableSchema<T>) (expectedTableSchemaRevision == ExpectedTableSchema.NEW
                    ? DynamoDBBigDecimalItem.TABLE_SCHEMA_NEW
                    : DynamoDBBigDecimalItem.TABLE_SCHEMA_LEGACY);
            return schema;
        } else if (dtoClass.equals(DynamoDBStringItem.class)) {
            @SuppressWarnings("unchecked") // OK thanks to the conditional above
            TableSchema<T> schema = (TableSchema<T>) (expectedTableSchemaRevision == ExpectedTableSchema.NEW
                    ? DynamoDBStringItem.TABLE_SCHEMA_NEW
                    : DynamoDBStringItem.TABLE_SCHEMA_LEGACY);
            return schema;
        } else {
            throw new IllegalStateException("Unknown DTO class. Bug");
        }
    }
    private void disconnect() {
        DynamoDbAsyncClient localLowLevelClient = lowLevelClient;
        if (client == null || localLowLevelClient == null) {
            return;
        }
        localLowLevelClient.close();
        lowLevelClient = null;
        client = null;
        dbConfig = null;
        tableNameResolver = null;
        isProperlyConfigured = false;
        tableCache.clear();
    }
    protected boolean isReadyToStore() {
        return isProperlyConfigured && ensureClient();
    }
        return Collections.emptySet();
    }
    @Override
    public Iterable<HistoricItem> query(FilterCriteria filter) {
        logIfManyQueuedTasks();
        Instant start = Instant.now();
        String filterDescription = filterToString(filter);
        logger.trace("Got a query with filter {}", filterDescription);
        DynamoDbEnhancedAsyncClient localClient = client;
        DynamoDBTableNameResolver localTableNameResolver = tableNameResolver;
        if (!isProperlyConfigured) {
            logger.debug("Configuration for dynamodb not yet loaded or broken. Returning empty query results.");
            return Collections.emptyList();
        }
        if (!ensureClient() || localClient == null || localTableNameResolver == null) {
            logger.warn("DynamoDB not connected. Returning empty query results.");
            return Collections.emptyList();
        }

        //
        // Resolve unclear table schema if needed
        //
        try {
            Boolean resolved = resolveTableSchema().get();
            if (!resolved) {
                logger.warn("Table schema not resolved, cannot query data.");
                return Collections.emptyList();
            }
        } catch (InterruptedException e) {
            logger.warn("Table schema resolution interrupted, cannot query data");
            return Collections.emptyList();
        } catch (ExecutionException e) {
            Throwable cause = e.getCause();
            logger.warn("Table schema resolution errored, cannot query data: {} {}",
                    cause == null ? e.getClass().getSimpleName() : cause.getClass().getSimpleName(),
                    cause == null ? e.getMessage() : cause.getMessage());
            return Collections.emptyList();
        }

        try {
            //
            // Proceed with query
            //
            String itemName = filter.getItemName();
            Item item = getItemFromRegistry(itemName);
            if (item == null) {
                logger.warn("Could not get item {} from registry! Returning empty query results.", itemName);
                return Collections.emptyList();
            }
            if (item instanceof GroupItem) {
                item = ((GroupItem) item).getBaseItem();
                logger.debug("Item is instanceof GroupItem '{}'", itemName);
                if (item == null) {
                    logger.debug("BaseItem of GroupItem is null. Ignore and give up!");
                    return Collections.emptyList();
                }
                if (item instanceof GroupItem) {
                    logger.debug("BaseItem of GroupItem is a GroupItem too. Ignore and give up!");
                    return Collections.emptyList();
                }
            }
            boolean legacy = localTableNameResolver.getTableSchema() == ExpectedTableSchema.LEGACY;
            Class<? extends DynamoDBItem<?>> dtoClass = AbstractDynamoDBItem.getDynamoItemClass(item.getClass(),
                    legacy);
            String tableName = localTableNameResolver.fromClass(dtoClass);
            DynamoDbAsyncTable<? extends DynamoDBItem<?>> table = getTable(dtoClass);
            logger.debug("Item {} (of type {}) will be queried using DTO class {} from table {}", itemName,
                    item.getClass().getSimpleName(), dtoClass.getSimpleName(), tableName);
            QueryEnhancedRequest queryExpression = DynamoDBQueryUtils.createQueryExpression(dtoClass,
                    localTableNameResolver.getTableSchema(), item, filter);

            CompletableFuture<List<DynamoDBItem<?>>> itemsFuture = new CompletableFuture<>();
            final SdkPublisher<? extends DynamoDBItem<?>> itemPublisher = table.query(queryExpression).items();
            Subscriber<DynamoDBItem<?>> pageSubscriber = new PageOfInterestSubscriber<DynamoDBItem<?>>(itemsFuture,
                    filter.getPageNumber(), filter.getPageSize());
            itemPublisher.subscribe(pageSubscriber);
            // NumberItem.getUnit() is expensive, so we avoid calling it in the loop
            // by fetching the unit here.
            final Item localItem = item;
            final Unit<?> itemUnit = localItem instanceof NumberItem ? ((NumberItem) localItem).getUnit() : null;
            try {
                @SuppressWarnings("null")
                List<HistoricItem> results = itemsFuture.get().stream().map(dynamoItem -> {
                    HistoricItem historicItem = dynamoItem.asHistoricItem(localItem, itemUnit);
                    if (historicItem == null) {
                        logger.warn(
                                "Dynamo item {} serialized state '{}' cannot be converted to item {} {}. Item type changed since persistence. Ignoring",
                                dynamoItem.getClass().getSimpleName(), dynamoItem.getState(),
                                localItem.getClass().getSimpleName(), localItem.getName());
                        return null;
                    }
                    logger.trace("Dynamo item {} converted to historic item: {}", localItem, historicItem);
                    return historicItem;
                }).filter(value -> value != null).collect(Collectors.toList());
                logger.debug("Query completed in {} ms. Filter was {}",
                        Duration.between(start, Instant.now()).toMillis(), filterDescription);
                return results;
            } catch (InterruptedException e) {
                logger.warn("Query interrupted. Filter was {}", filterDescription);
                return Collections.emptyList();
            } catch (ExecutionException e) {
                Throwable cause = e.getCause();
                if (cause instanceof ResourceNotFoundException) {
                    logger.trace("Query failed since the DynamoDB table '{}' does not exist. Filter was {}", tableName,
                            filterDescription);
                } else if (logger.isTraceEnabled()) {
                    logger.trace("Query failed. Filter was {}", filterDescription, e);
                } else {
                    logger.warn("Query failed {} {}. Filter was {}",
                            cause == null ? e.getClass().getSimpleName() : cause.getClass().getSimpleName(),
                            cause == null ? e.getMessage() : cause.getMessage(), filterDescription);
                }
                return Collections.emptyList();
            }
        } catch (Exception e) {
            logger.error("Unexpected error with query having filter {}: {} {}. Returning empty query results.",
                    filterDescription, e.getClass().getSimpleName(), e.getMessage());
            return Collections.emptyList();
        }
    }
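    // Sketch of the paging flow in query() above (illustrative numbers; the exact semantics live in
    // PageOfInterestSubscriber): with pageNumber=2 and pageSize=10, the subscriber is expected to
    // consume the reactive stream from table.query(...).items(), skip the first 20 matching items,
    // complete itemsFuture with (at most) items 20..29, and then stop requesting further items, so
    // only the page of interest is materialized on the openHAB side.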
    /**
     * @return item with the given name, or null if no such item exists in item registry.
     */
    private @Nullable Item getItemFromRegistry(String itemName) {
        try {
            return itemRegistry.getItem(itemName);
        } catch (ItemNotFoundException e1) {
            return null;
        }
    }
@Override @Override
public List<PersistenceStrategy> getDefaultStrategies() { public List<PersistenceStrategy> getDefaultStrategies() {
return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE); return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE);
} }
@Override
public void store(Item item) {
store(item, null);
}
@Override
public void store(Item item, @Nullable String alias) {
// Timestamp and capture state immediately as rest of the store is asynchronous (state might change in between)
ZonedDateTime time = ZonedDateTime.now();
logIfManyQueuedTasks();
if (!(item instanceof GenericItem)) {
return;
}
if (item.getState() instanceof UnDefType) {
logger.debug("Undefined item state received. Not storing item {}.", item.getName());
return;
}
if (!isReadyToStore()) {
logger.warn("Not ready to store (config error?), not storing item {}.", item.getName());
return;
}
// Get Item describing the real type of data
// With non-group items this is same as the argument item. With Group items, this is item describing the type of
// state stored in the group.
final Item itemTemplate;
try {
itemTemplate = getEffectiveItem(item);
} catch (IllegalStateException e) {
// Exception is raised when underlying item type cannot be determined with Group item
// Logged already
return;
}
String effectiveName = (alias != null) ? alias : item.getName();
        // We do not want to rely on item.getState() since the async context below can execute much later.
// We 'copy' the item for local use. copyItem also normalizes the unit with NumberItems.
final GenericItem copiedItem = copyItem(itemTemplate, item, effectiveName, null);
resolveTableSchema().thenAcceptAsync(resolved -> {
if (!resolved) {
logger.warn("Table schema not resolved, not storing item {}.", copiedItem.getName());
return;
}
DynamoDbEnhancedAsyncClient localClient = client;
DynamoDbAsyncClient localLowlevelClient = lowLevelClient;
DynamoDBConfig localConfig = dbConfig;
DynamoDBTableNameResolver localTableNameResolver = tableNameResolver;
if (!isProperlyConfigured || localClient == null || localLowlevelClient == null || localConfig == null
|| localTableNameResolver == null) {
logger.warn("Not ready to store (config error?), not storing item {}.", item.getName());
return;
}
Integer expireDays = localConfig.getExpireDays();
final DynamoDBItem<?> dto;
switch (localTableNameResolver.getTableSchema()) {
case NEW:
dto = AbstractDynamoDBItem.fromStateNew(copiedItem, time, expireDays);
break;
case LEGACY:
dto = AbstractDynamoDBItem.fromStateLegacy(copiedItem, time);
break;
                default:
                    throw new IllegalStateException("Unexpected table schema. This is a bug");
}
logger.trace("store() called with item {} {} '{}', which was converted to DTO {}",
copiedItem.getClass().getSimpleName(), effectiveName, copiedItem.getState(), dto);
dto.accept(new DynamoDBItemVisitor<TableCreatingPutItem<? extends DynamoDBItem<?>>>() {
@Override
public TableCreatingPutItem<? extends DynamoDBItem<?>> visit(
DynamoDBBigDecimalItem dynamoBigDecimalItem) {
return new TableCreatingPutItem<DynamoDBBigDecimalItem>(DynamoDBPersistenceService.this,
dynamoBigDecimalItem, getTable(DynamoDBBigDecimalItem.class));
}
@Override
public TableCreatingPutItem<? extends DynamoDBItem<?>> visit(DynamoDBStringItem dynamoStringItem) {
return new TableCreatingPutItem<DynamoDBStringItem>(DynamoDBPersistenceService.this,
dynamoStringItem, getTable(DynamoDBStringItem.class));
}
}).putItemAsync();
}, executor).exceptionally(e -> {
            logger.error("Unexpected error", e);
return null;
});
}
private Item getEffectiveItem(Item item) {
final Item effectiveItem;
if (item instanceof GroupItem) {
Item baseItem = ((GroupItem) item).getBaseItem();
if (baseItem == null) {
                // GroupItem:<ItemType> base type is not defined in *.items,
                // try to deduce the type from the first member item
                logger.debug(
                        "Cannot detect ItemType for {} because the GroupItem's base type isn't set in the *.items file.",
                        item.getName());
Iterator<Item> firstGroupMemberItem = ((GroupItem) item).getMembers().iterator();
if (firstGroupMemberItem.hasNext()) {
effectiveItem = firstGroupMemberItem.next();
} else {
throw new IllegalStateException("GroupItem " + item.getName()
+ " does not have children nor base item set, cannot determine underlying item type. Aborting!");
}
} else {
effectiveItem = baseItem;
}
} else {
effectiveItem = item;
}
return effectiveItem;
}
/**
* Copy item and optionally override name and state
*
* State is normalized to source item's unit with Quantity NumberItems and QuantityTypes
*
* @param itemTemplate 'template item' to be used to construct the new copy. It is also used to determine UoM unit
* and get GenericItem.type
* @param item item that is used to acquire name and state
* @param nameOverride name override for the resulting copy
* @param stateOverride state override for the resulting copy
* @throws IllegalArgumentException when state is QuantityType and not compatible with item
*/
static GenericItem copyItem(Item itemTemplate, Item item, @Nullable String nameOverride,
@Nullable State stateOverride) {
final GenericItem copiedItem;
try {
if (itemTemplate instanceof NumberItem) {
copiedItem = (GenericItem) itemTemplate.getClass().getDeclaredConstructor(String.class, String.class)
.newInstance(itemTemplate.getType(), nameOverride == null ? item.getName() : nameOverride);
} else {
copiedItem = (GenericItem) itemTemplate.getClass().getDeclaredConstructor(String.class)
.newInstance(nameOverride == null ? item.getName() : nameOverride);
}
} catch (InstantiationException | IllegalAccessException | IllegalArgumentException | InvocationTargetException
| NoSuchMethodException | SecurityException e) {
throw new IllegalArgumentException(item.toString(), e);
}
State state = stateOverride == null ? item.getState() : stateOverride;
if (state instanceof QuantityType<?> && itemTemplate instanceof NumberItem) {
Unit<?> itemUnit = ((NumberItem) itemTemplate).getUnit();
if (itemUnit != null) {
State convertedState = ((QuantityType<?>) state).toUnit(itemUnit);
if (convertedState == null) {
logger.error("Unexpected unit conversion failure: {} to item unit {}", state, itemUnit);
throw new IllegalArgumentException(
String.format("Unexpected unit conversion failure: %s to item unit %s", state, itemUnit));
}
state = convertedState;
}
}
copiedItem.setState(state);
return copiedItem;
}
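The unit normalization in copyItem can be illustrated in isolation. This is a toy analogue, not the addon's code: the real conversion goes through openHAB's QuantityType.toUnit(), and the unit strings below are purely illustrative.

```java
public class UnitNormalizationDemo {
    /**
     * Simplified analogue of copyItem's QuantityType handling: persisted states are
     * normalized to the item's configured unit so all stored rows share one unit.
     * Only one hypothetical conversion (kW -> W) is implemented for illustration.
     */
    public static double normalize(double value, String fromUnit, String itemUnit) {
        if (fromUnit.equals(itemUnit)) {
            return value; // already in the item's unit
        }
        if (fromUnit.equals("kW") && itemUnit.equals("W")) {
            return value * 1000.0;
        }
        // Mirrors copyItem: incompatible states are rejected, not stored as-is
        throw new IllegalArgumentException("Unexpected unit conversion failure: " + fromUnit + " to " + itemUnit);
    }

    public static void main(String[] args) {
        System.out.println(normalize(1.5, "kW", "W")); // 1500.0
    }
}
```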
    private void logIfManyQueuedTasks() {
        if (executor instanceof ThreadPoolExecutor) {
            ThreadPoolExecutor localExecutor = (ThreadPoolExecutor) executor;
            // Check the larger threshold first, otherwise the warning branch is unreachable
            if (localExecutor.getQueue().size() >= 50) {
                logger.warn(
                        "Many ({}) tasks queued in executor! This might be a sign of bad design or a bug in the addon code.",
                        localExecutor.getQueue().size());
            } else if (localExecutor.getQueue().size() >= 5) {
                logger.trace("executor queue size: {}, remaining space {}. Active threads {}",
                        localExecutor.getQueue().size(), localExecutor.getQueue().remainingCapacity(),
                        localExecutor.getActiveCount());
            }
        }
    }
private String filterToString(FilterCriteria filter) {
return String.format(
"FilterCriteria@%s(item=%s, pageNumber=%d, pageSize=%d, time=[%s, %s, %s], state=[%s, %s of %s] )",
System.identityHashCode(filter), filter.getItemName(), filter.getPageNumber(), filter.getPageSize(),
filter.getBeginDate(), filter.getEndDate(), filter.getOrdering(), filter.getOperator(),
filter.getState(), filter.getState() == null ? "null" : filter.getState().getClass().getSimpleName());
}
}


@@ -12,19 +12,23 @@
 */
package org.openhab.persistence.dynamodb.internal;

import java.lang.reflect.InvocationTargetException;
import java.time.ZonedDateTime;

import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.openhab.core.items.GenericItem;
import org.openhab.core.items.Item;
import org.openhab.core.persistence.FilterCriteria;
import org.openhab.core.persistence.FilterCriteria.Operator;
import org.openhab.core.persistence.FilterCriteria.Ordering;

import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.Expression;
import software.amazon.awssdk.enhanced.dynamodb.Expression.Builder;
import software.amazon.awssdk.enhanced.dynamodb.model.QueryConditional;
import software.amazon.awssdk.enhanced.dynamodb.model.QueryEnhancedRequest;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

/**
 * Utility class

@@ -36,88 +40,141 @@ public class DynamoDBQueryUtils {
    /**
     * Construct DynamoDB query from filter
     *
     * @param dtoClass dto class
     * @param expectedTableSchema table schema to query against
     * @param item item corresponding to filter
     * @param filter filter for the query
     * @return QueryEnhancedRequest corresponding to the given FilterCriteria
     * @throws IllegalArgumentException when schema is not fully resolved
     */
    public static QueryEnhancedRequest createQueryExpression(Class<? extends DynamoDBItem<?>> dtoClass,
            ExpectedTableSchema expectedTableSchema, Item item, FilterCriteria filter) {
        if (!expectedTableSchema.isFullyResolved()) {
            throw new IllegalArgumentException("Schema not resolved");
        }
        QueryEnhancedRequest.Builder queryBuilder = QueryEnhancedRequest.builder()
                .scanIndexForward(filter.getOrdering() == Ordering.ASCENDING);
        addFilterbyItemAndTimeFilter(queryBuilder, expectedTableSchema, filter.getItemName(), filter);
        addStateFilter(queryBuilder, expectedTableSchema, item, dtoClass, filter);
        addProjection(dtoClass, expectedTableSchema, queryBuilder);
        return queryBuilder.build();
    }

    /**
     * Add projection for key attributes only, not the expiry date
     */
    private static void addProjection(Class<? extends DynamoDBItem<?>> dtoClass,
            ExpectedTableSchema expectedTableSchema, QueryEnhancedRequest.Builder queryBuilder) {
        boolean legacy = expectedTableSchema == ExpectedTableSchema.LEGACY;
        if (legacy) {
            queryBuilder.attributesToProject(DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME_LEGACY,
                    DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC_LEGACY, DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY);
        } else {
            acceptAsEmptyDTO(dtoClass, new DynamoDBItemVisitor<@Nullable Void>() {
                @Override
                public @Nullable Void visit(DynamoDBStringItem dynamoStringItem) {
                    queryBuilder.attributesToProject(DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME,
                            DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC, DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_STRING);
                    return null;
                }

                @Override
                public @Nullable Void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
                    queryBuilder.attributesToProject(DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME,
                            DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC, DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_NUMBER);
                    return null;
                }
            });
        }
    }
    private static void addStateFilter(QueryEnhancedRequest.Builder queryBuilder,
            ExpectedTableSchema expectedTableSchema, Item item, Class<? extends DynamoDBItem<?>> dtoClass,
            FilterCriteria filter) {
        final Expression expression;
        Builder itemStateTypeExpressionBuilder = Expression.builder().expression("attribute_exists(#attr)");
        boolean legacy = expectedTableSchema == ExpectedTableSchema.LEGACY;
        acceptAsEmptyDTO(dtoClass, new DynamoDBItemVisitor<@Nullable Void>() {
            @Override
            public @Nullable Void visit(DynamoDBStringItem dynamoStringItem) {
                itemStateTypeExpressionBuilder.putExpressionName("#attr",
                        legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY
                                : DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_STRING);
                return null;
            }

            @Override
            public @Nullable Void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
                itemStateTypeExpressionBuilder.putExpressionName("#attr",
                        legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY
                                : DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_NUMBER);
                return null;
            }
        });
        if (filter.getOperator() != null && filter.getState() != null) {
            // Convert filter's state to DynamoDBItem in order to get a suitable string representation for the state
            Expression.Builder stateFilterExpressionBuilder = Expression.builder()
                    .expression(String.format("#attr %s :value", operatorAsString(filter.getOperator())));
            // The following will throw IllegalArgumentException when filter state is not compatible with
            // the item. This is acceptable.
            GenericItem stateToFind = DynamoDBPersistenceService.copyItem(item, item, filter.getItemName(),
                    filter.getState());
            acceptAsDTO(stateToFind, legacy, new DynamoDBItemVisitor<@Nullable Void>() {
                @Override
                public @Nullable Void visit(DynamoDBStringItem serialized) {
                    stateFilterExpressionBuilder.putExpressionName("#attr",
                            legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY
                                    : DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_STRING);
                    stateFilterExpressionBuilder.putExpressionValue(":value",
                            AttributeValue.builder().s(serialized.getState()).build());
                    return null;
                }

                @SuppressWarnings("null")
                @Override
                public @Nullable Void visit(DynamoDBBigDecimalItem serialized) {
                    stateFilterExpressionBuilder.putExpressionName("#attr",
                            legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY
                                    : DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_NUMBER);
                    stateFilterExpressionBuilder.putExpressionValue(":value",
                            AttributeValue.builder().n(serialized.getState().toPlainString()).build());
                    return null;
                }
            });
            expression = Expression.join(stateFilterExpressionBuilder.build(), itemStateTypeExpressionBuilder.build(),
                    "AND");
        } else {
            expression = itemStateTypeExpressionBuilder.build();
        }
        queryBuilder.filterExpression(expression);
    }

    private static void addFilterbyItemAndTimeFilter(QueryEnhancedRequest.Builder queryBuilder,
            ExpectedTableSchema expectedTableSchema, String partition, final FilterCriteria filter) {
        boolean hasBegin = filter.getBeginDate() != null;
        boolean hasEnd = filter.getEndDate() != null;
        boolean legacy = expectedTableSchema == ExpectedTableSchema.LEGACY;
        AttributeConverter<ZonedDateTime> timeConverter = AbstractDynamoDBItem.getTimestampConverter(legacy);
        if (!hasBegin && !hasEnd) {
            // No need to apply a time filter, but we still filter by partition
            queryBuilder.queryConditional(QueryConditional.keyEqualTo(k -> k.partitionValue(partition)));
        } else if (hasBegin && !hasEnd) {
            queryBuilder.queryConditional(QueryConditional.sortGreaterThan(
                    k -> k.partitionValue(partition).sortValue(timeConverter.transformFrom(filter.getBeginDate()))));
        } else if (!hasBegin && hasEnd) {
            queryBuilder.queryConditional(QueryConditional.sortLessThan(
                    k -> k.partitionValue(partition).sortValue(timeConverter.transformFrom(filter.getEndDate()))));
        } else {
            assert hasBegin && hasEnd; // invariant
            queryBuilder.queryConditional(QueryConditional.sortBetween(
                    k -> k.partitionValue(partition).sortValue(timeConverter.transformFrom(filter.getBeginDate())),
                    k -> k.partitionValue(partition).sortValue(timeConverter.transformFrom(filter.getEndDate()))));
        }
    }
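The four-way dispatch above reduces to a small decision table: which key condition the presence of begin/end dates selects. A condensed sketch (condition names only, no AWS SDK involved):

```java
public class TimeFilterDemo {
    /**
     * Condensed view of addFilterbyItemAndTimeFilter: which DynamoDB key
     * condition the begin/end dates of a FilterCriteria map to.
     */
    public static String keyCondition(boolean hasBegin, boolean hasEnd) {
        if (!hasBegin && !hasEnd) {
            return "keyEqualTo"; // partition filter only, no time constraint
        } else if (hasBegin && !hasEnd) {
            return "sortGreaterThan"; // open-ended query from begin date
        } else if (!hasBegin) {
            return "sortLessThan"; // everything up to end date
        }
        return "sortBetween"; // both bounds present
    }

    public static void main(String[] args) {
        System.out.println(keyCondition(true, true)); // sortBetween
    }
}
```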
    /**

@@ -145,4 +202,23 @@ public class DynamoDBQueryUtils {
            throw new IllegalStateException("Unknown operator " + op);
        }
    }
private static <T> void acceptAsDTO(Item item, boolean legacy, DynamoDBItemVisitor<T> visitor) {
ZonedDateTime dummyTimestamp = ZonedDateTime.now();
if (legacy) {
AbstractDynamoDBItem.fromStateLegacy(item, dummyTimestamp).accept(visitor);
} else {
AbstractDynamoDBItem.fromStateNew(item, dummyTimestamp, null).accept(visitor);
}
}
private static <T> void acceptAsEmptyDTO(Class<? extends DynamoDBItem<?>> dtoClass,
DynamoDBItemVisitor<T> visitor) {
try {
dtoClass.getDeclaredConstructor().newInstance().accept(visitor);
} catch (InstantiationException | IllegalAccessException | IllegalArgumentException | InvocationTargetException
| NoSuchMethodException | SecurityException e) {
throw new IllegalStateException(e);
}
}
}
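A recurring theme in this refactor is the switch from void visitors (collecting results into a one-element array) to visitors with a return value, as in DynamoDBItemVisitor&lt;T&gt; above. A minimal self-contained sketch of the same pattern, with hypothetical node types standing in for the DTO classes:

```java
public class VisitorDemo {
    // Visitor with a type parameter for the result, like DynamoDBItemVisitor<T>
    public interface Visitor<T> {
        T visitString(String s);

        T visitNumber(double d);
    }

    public interface Node {
        <T> T accept(Visitor<T> visitor);
    }

    // Each concrete type dispatches to its own visit method
    public record StringNode(String value) implements Node {
        public <T> T accept(Visitor<T> v) {
            return v.visitString(value);
        }
    }

    public record NumberNode(double value) implements Node {
        public <T> T accept(Visitor<T> v) {
            return v.visitNumber(value);
        }
    }

    public static String describe(Node n) {
        // The result is returned directly, no mutable holder array needed
        return n.accept(new Visitor<String>() {
            public String visitString(String s) {
                return "string:" + s;
            }

            public String visitNumber(double d) {
                return "number:" + d;
            }
        });
    }

    public static void main(String[] args) {
        System.out.println(describe(new StringNode("ON"))); // string:ON
    }
}
```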


@@ -14,64 +14,59 @@ package org.openhab.persistence.dynamodb.internal;

import java.time.ZonedDateTime;

import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;

import software.amazon.awssdk.enhanced.dynamodb.mapper.StaticTableSchema;

/**
 * DynamoDBItem for items that can be serialized as DynamoDB string
 *
 * @author Sami Salonen - Initial contribution
 */
@NonNullByDefault
public class DynamoDBStringItem extends AbstractDynamoDBItem<String> {

    private static Class<@Nullable String> NULLABLE_STRING = (Class<@Nullable String>) String.class;

    public static final StaticTableSchema<DynamoDBStringItem> TABLE_SCHEMA_LEGACY = getBaseSchemaBuilder(
            DynamoDBStringItem.class, true)
                    .newItemSupplier(DynamoDBStringItem::new)
                    .addAttribute(NULLABLE_STRING, a -> a.name(DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY)
                            .getter(DynamoDBStringItem::getState).setter(DynamoDBStringItem::setState))
                    .build();

    public static final StaticTableSchema<DynamoDBStringItem> TABLE_SCHEMA_NEW = getBaseSchemaBuilder(
            DynamoDBStringItem.class, false)
                    .newItemSupplier(DynamoDBStringItem::new)
                    .addAttribute(NULLABLE_STRING,
                            a -> a.name(DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_STRING)
                                    .getter(DynamoDBStringItem::getState).setter(DynamoDBStringItem::setState))
                    .addAttribute(NULLABLE_LONG, a -> a.name(ATTRIBUTE_NAME_EXPIRY)
                            .getter(AbstractDynamoDBItem::getExpiryDate).setter(AbstractDynamoDBItem::setExpiry))
                    .build();

    public DynamoDBStringItem() {
        this("", null, ZonedDateTime.now(), null);
    }

    public DynamoDBStringItem(String name, @Nullable String state, ZonedDateTime time, @Nullable Integer expireDays) {
        super(name, state, time, expireDays);
    }

    @Override
    public @Nullable String getState() {
        return state;
    }

    @Override
    public void setState(@Nullable String state) {
        this.state = state;
    }

    @Override
    public <T> T accept(DynamoDBItemVisitor<T> visitor) {
        return visitor.visit(this);
    }
}


@@ -12,37 +12,111 @@
 */
package org.openhab.persistence.dynamodb.internal;

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.function.Consumer;

import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.model.DescribeTableRequest;
import software.amazon.awssdk.services.dynamodb.model.ResourceNotFoundException;
import software.amazon.awssdk.services.dynamodb.model.TableStatus;

/**
 * The DynamoDBTableNameResolver resolves the DynamoDB table name for a given item.
 *
 * @author Sami Salonen - Initial contribution
 */
@NonNullByDefault
public class DynamoDBTableNameResolver {

    private final Logger logger = LoggerFactory.getLogger(DynamoDBTableNameResolver.class);

    private final String tablePrefix;
    private ExpectedTableSchema tableRevision;
    private String table;

    public DynamoDBTableNameResolver(ExpectedTableSchema tableRevision, String table, String tablePrefix) {
        this.tableRevision = tableRevision;
        this.table = table;
        this.tablePrefix = tablePrefix;
        switch (tableRevision) {
            case NEW:
                if (table.isBlank()) {
                    throw new IllegalArgumentException("table should be specified with NEW schema");
                }
                break;
            case MAYBE_LEGACY:
                if (table.isBlank()) {
                    throw new IllegalArgumentException("table should be specified with MAYBE_LEGACY schema");
                }
                // fall-through
            case LEGACY:
                if (tablePrefix.isBlank()) {
                    throw new IllegalArgumentException("tablePrefix should be specified with LEGACY schema");
                }
                break;
            default:
                throw new IllegalArgumentException("Bug");
        }
    }

    /**
     * Resolve the table name using the given DynamoDBItem. The item's class is used to determine the table name.
     *
     * @param item dto to use to determine table name
     * @return table name
     * @throws IllegalStateException when table schema is not determined
     */
    public String fromItem(DynamoDBItem<?> item) {
        if (!isFullyResolved()) {
            throw new IllegalStateException();
        }
        switch (tableRevision) {
            case NEW:
                return getTableNameAccordingToNewSchema();
            case LEGACY:
                return getTableNameAccordingToLegacySchema(item);
            default:
                throw new IllegalArgumentException("Bug");
        }
    }

    /**
     * Get table name according to the new schema. This instance does not have to have a fully determined schema.
     *
     * @return table name
     */
    private String getTableNameAccordingToNewSchema() {
        return table;
    }

    /**
     * Get table name according to the legacy schema. This instance does not have to have a fully determined schema.
     *
     * @param item dto to use to determine table name
     * @return table name
     */
    private String getTableNameAccordingToLegacySchema(DynamoDBItem<?> item) {
        // Use the visitor pattern to deduce the table name
        return item.accept(new DynamoDBItemVisitor<String>() {
            @Override
            public String visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
                return tablePrefix + "bigdecimal";
            }

            @Override
            public String visit(DynamoDBStringItem dynamoStringItem) {
                return tablePrefix + "string";
            }
        });
    }
    /**

@@ -62,4 +136,84 @@ public class DynamoDBTableNameResolver {
        }
        return this.fromItem(dummy);
    }
/**
* Whether we have determined the schema and table names to use
*
* @return true when schema revision is clearly specified
*/
public boolean isFullyResolved() {
return tableRevision.isFullyResolved();
}
public CompletableFuture<Boolean> resolveSchema(DynamoDbAsyncClient lowLevelClient,
Consumer<DescribeTableRequest.Builder> describeTableRequestMutator, ExecutorService executor) {
CompletableFuture<Boolean> resolved = new CompletableFuture<>();
        if (isFullyResolved()) {
            resolved.complete(true);
            return resolved;
        }
String numberTableLegacy = getTableNameAccordingToLegacySchema(new DynamoDBBigDecimalItem());
String stringTableLegacy = getTableNameAccordingToLegacySchema(new DynamoDBStringItem());
CompletableFuture<@Nullable Boolean> tableSchemaNumbers = tableIsPresent(lowLevelClient,
describeTableRequestMutator, executor, numberTableLegacy);
CompletableFuture<@Nullable Boolean> tableSchemaStrings = tableIsPresent(lowLevelClient,
describeTableRequestMutator, executor, stringTableLegacy);
tableSchemaNumbers.thenAcceptBothAsync(tableSchemaStrings, (table1Present, table2Present) -> {
if (table1Present != null && table2Present != null) {
// Since the Booleans are not null, we know for sure whether table is present or not
// If old tables do not exist, we default to new table layout/schema
tableRevision = (!table1Present && !table2Present) ? ExpectedTableSchema.NEW
: ExpectedTableSchema.LEGACY;
}
resolved.complete(table1Present != null && table2Present != null);
}, executor).exceptionally(e -> {
// should not happen as individual futures have exceptions handled
logger.error("Unexpected error. BUG", e);
resolved.complete(false);
return null;
});
return resolved;
}
/**
*
* @return whether table exists, or null when state is unknown
*/
private CompletableFuture<@Nullable Boolean> tableIsPresent(DynamoDbAsyncClient lowLevelClient,
Consumer<DescribeTableRequest.Builder> describeTableRequestMutator, ExecutorService executor,
String tableName) {
CompletableFuture<@Nullable Boolean> tableSchema = new CompletableFuture<>();
lowLevelClient.describeTable(b -> b.tableName(tableName).applyMutation(describeTableRequestMutator))
.thenApplyAsync(r -> r.table().tableStatus(), executor)
.thenApplyAsync(tableStatus -> tableIsBeingRemoved(tableStatus) ? false : true)
.thenAccept(r -> tableSchema.complete(r)).exceptionally(exception -> {
Throwable cause = exception.getCause();
if (cause instanceof ResourceNotFoundException) {
tableSchema.complete(false);
} else {
logger.warn(
"Could not verify whether table {} is present: {} {}. Cannot determine table schema.",
tableName,
cause == null ? exception.getClass().getSimpleName() : cause.getClass().getSimpleName(),
cause == null ? exception.getMessage() : cause.getMessage());
// Other error, we could not resolve schema...
tableSchema.complete(null);
}
return null;
});
return tableSchema;
}
private boolean tableIsBeingRemoved(TableStatus tableStatus) {
return (tableStatus == TableStatus.ARCHIVING || tableStatus == TableStatus.DELETING
|| tableStatus == TableStatus.ARCHIVED);
}
public ExpectedTableSchema getTableSchema() {
return tableRevision;
}
}
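The fallback decision inside resolveSchema() is easy to see in isolation. A condensed sketch of that decision, with a local enum standing in for ExpectedTableSchema (the real code also keeps the current revision when presence could not be determined):

```java
public class SchemaResolutionDemo {
    public enum Schema {
        NEW,
        LEGACY,
        MAYBE_LEGACY
    }

    /**
     * Mirrors the decision in DynamoDBTableNameResolver.resolveSchema(): when both
     * legacy-table presence checks returned a definite answer, default to NEW only
     * if neither legacy table exists, otherwise use LEGACY. A null argument means
     * "presence could not be determined", leaving the schema unresolved.
     */
    public static Schema resolve(Boolean numberTablePresent, Boolean stringTablePresent) {
        if (numberTablePresent != null && stringTablePresent != null) {
            return (!numberTablePresent && !stringTablePresent) ? Schema.NEW : Schema.LEGACY;
        }
        return Schema.MAYBE_LEGACY; // unknown, stays unresolved
    }

    public static void main(String[] args) {
        System.out.println(resolve(false, false)); // NEW
        System.out.println(resolve(true, false)); // LEGACY
        System.out.println(resolve(null, true)); // MAYBE_LEGACY
    }
}
```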


@ -0,0 +1,35 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
* Expected revision of the DynamoDB schema
*
 * NEW: Read and create data using the new schema
 * LEGACY: Read and create data using the old schemas, compatible with the first versions of the DynamoDB persistence addon
 * MAYBE_LEGACY: Try to read and create data using the old schemas, but fall back to NEW if the old tables do not exist.
*
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public enum ExpectedTableSchema {
NEW,
LEGACY,
MAYBE_LEGACY;
public boolean isFullyResolved() {
return this != ExpectedTableSchema.MAYBE_LEGACY;
}
}


@ -0,0 +1,98 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.atomic.AtomicInteger;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.reactivestreams.Subscriber;
import org.reactivestreams.Subscription;
/**
 * Subscriber that reads only the page of interest from a reactive stream
*
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public class PageOfInterestSubscriber<T> implements Subscriber<T> {
private AtomicInteger skipped = new AtomicInteger();
private int skip;
private @Nullable Subscription subscription;
private int pageIndex;
private int pageSize;
private List<T> page;
private CompletableFuture<List<T>> future;
    /**
     * Create new PageOfInterestSubscriber
     *
     * @param future future that is completed with the page of interest
     * @param pageIndex index of the page that we want to subscribe to
     * @param pageSize page size
     */
protected PageOfInterestSubscriber(CompletableFuture<List<T>> future, int pageIndex, int pageSize) {
this.future = future;
this.pageIndex = pageIndex;
this.pageSize = pageSize;
this.page = new ArrayList<>();
this.skip = pageIndex * pageSize;
}
@Override
public void onSubscribe(@Nullable Subscription subscription) {
this.subscription = subscription;
if (subscription != null) {
subscription.request(pageSize * (pageIndex + 1));
}
}
@Override
public void onNext(T t) {
Subscription localSubscription = subscription;
if (localSubscription == null) {
            throw new IllegalStateException(
                    "Subscriber API contract violated: expecting a non-null subscription");
}
if (future.isCancelled()) {
localSubscription.cancel();
onError(new InterruptedException());
} else if (skipped.getAndIncrement() >= skip && page.size() < pageSize) {
// We have skipped enough, start accumulating
page.add(t);
if (page.size() == pageSize) {
// We have the full page read
localSubscription.cancel();
onComplete();
}
}
}
@Override
public void onError(@NonNullByDefault({}) Throwable t) {
if (!future.isDone()) {
future.completeExceptionally(t);
}
}
@Override
public void onComplete() {
if (!future.isDone()) {
future.complete(page);
}
}
}
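The same skip-and-accumulate paging idea can be demonstrated without the `org.reactivestreams` dependency, using the JDK's `java.util.concurrent.Flow` API. A hedged sketch (`PagingDemo` and `pageOf` are illustrative names, not from the add-on): the subscriber requests only as many elements as the page needs, skips the preceding pages, and cancels the subscription as soon as the page is full.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class PagingDemo {
    /** Completes the returned future with items [pageIndex*pageSize, pageIndex*pageSize + pageSize). */
    public static <T> CompletableFuture<List<T>> pageOf(Flow.Publisher<T> publisher, int pageIndex, int pageSize) {
        CompletableFuture<List<T>> future = new CompletableFuture<>();
        publisher.subscribe(new Flow.Subscriber<T>() {
            private final List<T> page = new ArrayList<>();
            private final int skip = pageIndex * pageSize;
            private int seen;
            private Flow.Subscription subscription;

            @Override
            public void onSubscribe(Flow.Subscription s) {
                subscription = s;
                // Request only what is needed to reach the end of the page
                s.request((long) pageSize * (pageIndex + 1));
            }

            @Override
            public void onNext(T t) {
                if (seen++ >= skip && page.size() < pageSize) {
                    page.add(t);
                    if (page.size() == pageSize) {
                        subscription.cancel(); // full page read, stop the stream early
                        future.complete(page);
                    }
                }
            }

            @Override
            public void onError(Throwable t) {
                future.completeExceptionally(t);
            }

            @Override
            public void onComplete() {
                future.complete(page); // stream ended early: short (or empty) final page
            }
        });
        return future;
    }

    public static void main(String[] args) {
        try (SubmissionPublisher<Integer> pub = new SubmissionPublisher<>()) {
            CompletableFuture<List<Integer>> secondPage = pageOf(pub, 1, 3);
            for (int i = 0; i < 10; i++) {
                pub.submit(i);
            }
            System.out.println(secondPage.join()); // [3, 4, 5]
        }
    }
}
```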


@ -0,0 +1,255 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import java.time.Duration;
import java.time.Instant;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.concurrent.ExecutorService;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import software.amazon.awssdk.core.internal.waiters.ResponseOrException;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbAsyncTable;
import software.amazon.awssdk.enhanced.dynamodb.model.CreateTableEnhancedRequest;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.model.DescribeTableResponse;
import software.amazon.awssdk.services.dynamodb.model.ProvisionedThroughput;
import software.amazon.awssdk.services.dynamodb.model.ResourceInUseException;
import software.amazon.awssdk.services.dynamodb.model.ResourceNotFoundException;
/**
* PutItem request which creates table if needed.
*
* Designed such that competing PutItem requests should complete successfully, only one of them
* 'winning the race' and creating the table.
*
*
* PutItem
* . |
* . \ (ERR: ResourceNotFoundException) (1)
* ....|
* ....CreateTable
* ....|.........\
* .... \ (OK)....\ (ERR: ResourceInUseException) (2)
* ......|..................|
* ..... |..................|
* ..... |...........Wait for table to become active
* ..... |......................\
* ..... |......................| (OK)
* ..... |......................|
* ..... |......................PutItem
* ..... |
* ..... |
* ..... Wait for table to become active
* ......|
* .......\
* ........| (OK)
* ........|
* ........\
* ....... Configure TTL (no-op with legacy schema)
* ..........|
* ...........\ (OK)
* ...........|
* ...........PutItem
*
*
* (1) Most likely table does not exist yet
 * (2) Raised when the table was already created by someone else
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class TableCreatingPutItem<T extends DynamoDBItem<?>> {
private final Logger logger = LoggerFactory.getLogger(TableCreatingPutItem.class);
private final DynamoDBPersistenceService service;
private T dto;
private DynamoDbAsyncTable<T> table;
private CompletableFuture<Void> aggregateFuture = new CompletableFuture<Void>();
private Instant start = Instant.now();
private ExecutorService executor;
private DynamoDbAsyncClient lowLevelClient;
private DynamoDBConfig dbConfig;
private DynamoDBTableNameResolver tableNameResolver;
public TableCreatingPutItem(DynamoDBPersistenceService service, T dto, DynamoDbAsyncTable<T> table) {
this.service = service;
this.dto = dto;
this.table = table;
this.executor = this.service.getExecutor();
DynamoDbAsyncClient localLowLevelClient = this.service.getLowLevelClient();
DynamoDBConfig localDbConfig = this.service.getDbConfig();
DynamoDBTableNameResolver localTableNameResolver = this.service.getTableNameResolver();
if (localLowLevelClient == null || localDbConfig == null || localTableNameResolver == null) {
throw new IllegalStateException("Service is not ready");
}
lowLevelClient = localLowLevelClient;
dbConfig = localDbConfig;
tableNameResolver = localTableNameResolver;
}
public CompletableFuture<Void> putItemAsync() {
start = Instant.now();
return internalPutItemAsync(false, true);
}
private CompletableFuture<Void> internalPutItemAsync(boolean createTable, boolean recursionAllowed) {
if (createTable) {
// Try again, first creating the table
Instant tableCreationStart = Instant.now();
table.createTable(CreateTableEnhancedRequest.builder()
.provisionedThroughput(
ProvisionedThroughput.builder().readCapacityUnits(dbConfig.getReadCapacityUnits())
.writeCapacityUnits(dbConfig.getWriteCapacityUnits()).build())
.build())//
.whenCompleteAsync((resultTableCreation, exceptionTableCreation) -> {
if (exceptionTableCreation == null) {
logger.trace("PutItem: Table created in {} ms. Proceeding to TTL creation.",
Duration.between(tableCreationStart, Instant.now()).toMillis());
//
// Table creation OK. Configure TTL
//
boolean legacy = tableNameResolver.getTableSchema() == ExpectedTableSchema.LEGACY;
waitForTableToBeActive().thenComposeAsync(_void -> {
if (legacy) {
// We have legacy table schema. TTL configuration is skipped
return CompletableFuture.completedFuture(null);
} else {
// We have the new table schema -> configure TTL
// for the newly created table
return lowLevelClient.updateTimeToLive(req -> req
.overrideConfiguration(this.service::overrideConfig)
.tableName(table.tableName()).timeToLiveSpecification(spec -> spec
.attributeName(DynamoDBItem.ATTRIBUTE_NAME_EXPIRY).enabled(true)));
}
}, executor)
//
// Table is ready and TTL configured (possibly with error)
//
.whenCompleteAsync((resultTTL, exceptionTTL) -> {
if (exceptionTTL == null) {
//
// TTL configuration OK, continue with PutItem
//
logger.trace("PutItem: TTL configured successfully");
internalPutItemAsync(false, false);
} else {
//
// TTL configuration failed, abort
//
logger.trace("PutItem: TTL configuration failed");
Throwable exceptionTTLCause = exceptionTTL.getCause();
aggregateFuture.completeExceptionally(
exceptionTTLCause == null ? exceptionTTL : exceptionTTLCause);
}
}, executor);
} else {
// Table creation failed. We give up and complete the aggregate
// future -- unless the error was ResourceInUseException, in which case wait for
// table to become active and try again
Throwable cause = exceptionTableCreation.getCause();
if (cause instanceof ResourceInUseException) {
logger.trace(
"PutItem: table creation failed (will be retried) with {} {}. Perhaps tried to create table that already exists. Trying one more time without creating table.",
cause.getClass().getSimpleName(), cause.getMessage());
                        // Wait for the table to become active, then retry PutItem
waitForTableToBeActive().whenCompleteAsync((_tableWaitResponse, tableWaitException) -> {
if (tableWaitException != null) {
// error when waiting for table to become active
Throwable tableWaitExceptionCause = tableWaitException.getCause();
logger.warn(
"PutItem: failed (final) with {} {} when waiting to become active. Aborting.",
tableWaitExceptionCause == null
? tableWaitException.getClass().getSimpleName()
: tableWaitExceptionCause.getClass().getSimpleName(),
tableWaitExceptionCause == null ? tableWaitException.getMessage()
: tableWaitExceptionCause.getMessage());
aggregateFuture.completeExceptionally(
tableWaitExceptionCause == null ? tableWaitException
: tableWaitExceptionCause);
}
}, executor)
// table wait OK, retry PutItem
.thenRunAsync(() -> internalPutItemAsync(false, false), executor);
} else {
logger.warn("PutItem: failed (final) with {} {}. Aborting.",
cause == null ? exceptionTableCreation.getClass().getSimpleName()
: cause.getClass().getSimpleName(),
cause == null ? exceptionTableCreation.getMessage() : cause.getMessage());
aggregateFuture.completeExceptionally(cause == null ? exceptionTableCreation : cause);
}
}
}, executor);
} else {
// First try, optimistically assuming that table exists
table.putItem(dto).whenCompleteAsync((result, exception) -> {
if (exception == null) {
logger.trace("PutItem: DTO {} was successfully written in {} ms.", dto,
Duration.between(start, Instant.now()).toMillis());
aggregateFuture.complete(result);
} else {
                    // PutItem failed. We retry if the failure was due to a non-existing table; the retry is
                    // triggered by calling this method again with createTable=true.
                    // With other errors, we abort.
if (!(exception instanceof CompletionException)) {
logger.error("PutItem: Expecting only CompletionException, got {} {}. BUG",
exception.getClass().getName(), exception.getMessage());
aggregateFuture.completeExceptionally(new IllegalStateException("unexpected exception"));
}
Throwable cause = exception.getCause();
if (cause instanceof ResourceNotFoundException && recursionAllowed) {
logger.trace(
"PutItem: Table '{}' was not present. Retrying, this time creating the table first",
table.tableName());
internalPutItemAsync(true, true);
} else {
logger.warn("PutItem: failed (final) with {} {}. Aborting.",
cause == null ? exception.getClass().getSimpleName() : cause.getClass().getSimpleName(),
cause == null ? exception.getMessage() : cause.getMessage());
aggregateFuture.completeExceptionally(cause == null ? exception : cause);
}
}
}, executor);
}
return aggregateFuture;
}
private CompletableFuture<Void> waitForTableToBeActive() {
return lowLevelClient.waiter()
.waitUntilTableExists(
req -> req.tableName(table.tableName()).overrideConfiguration(this.service::overrideConfig))
.thenAcceptAsync(tableWaitResponse -> {
// if waiter fails, the future is completed exceptionally (not entering this step)
ResponseOrException<DescribeTableResponse> responseOrException = tableWaitResponse.matched();
                    logger.trace("PutItem: Table wait completed successfully with {} attempts: {}",
tableWaitResponse.attemptsExecuted(), toString(responseOrException));
}, executor);
}
private String toString(ResponseOrException<?> responseOrException) {
if (responseOrException.response().isPresent()) {
return String.format("response=%s", responseOrException.response().get());
} else if (responseOrException.exception().isPresent()) {
Throwable exception = responseOrException.exception().get();
return String.format("exception=%s %s", exception.getClass().getSimpleName(), exception.getMessage());
} else {
            return "<N/A>";
}
}
}
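The retry flow above boils down to a generic "optimistic put, create on ResourceNotFound, retry once" pattern. A minimal sketch without the AWS SDK, using `CompletableFuture` against an in-memory stand-in for DynamoDB (all names here are illustrative, not the add-on's API; the real implementation additionally waits for the table to become active and configures TTL):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.CompletableFuture;

public class CreateOnDemandSketch {
    // Stand-in for DynamoDB: table name -> items
    private final Map<String, Map<String, String>> tables = new HashMap<>();

    public CompletableFuture<Void> putItem(String table, String key, String value) {
        Map<String, String> t = tables.get(table);
        if (t == null) {
            // Analogous to ResourceNotFoundException, branch (1) in the diagram above
            return CompletableFuture.failedFuture(new IllegalStateException("ResourceNotFound: " + table));
        }
        t.put(key, value);
        return CompletableFuture.completedFuture(null);
    }

    public CompletableFuture<Void> createTable(String table) {
        tables.putIfAbsent(table, new HashMap<>());
        return CompletableFuture.completedFuture(null);
    }

    /** Optimistic put; on 'table missing' create the table and retry exactly once. */
    public CompletableFuture<Void> putItemCreatingTable(String table, String key, String value) {
        return putItem(table, key, value).handle((ok, err) -> {
            if (err == null) {
                return CompletableFuture.<Void> completedFuture(null);
            }
            // First attempt failed: assume the table does not exist yet, create it, retry once
            return createTable(table).thenCompose(ignored -> putItem(table, key, value));
        }).thenCompose(f -> f); // flatten the nested future
    }

    public String get(String table, String key) {
        Map<String, String> t = tables.get(table);
        return t == null ? null : t.get(key);
    }

    public static void main(String[] args) {
        CreateOnDemandSketch db = new CreateOnDemandSketch();
        db.putItemCreatingTable("openhab", "item1", "42").join();
        System.out.println(db.get("openhab", "item1")); // 42
    }
}
```

The second retry is not allowed to create the table again, which mirrors the `recursionAllowed` flag above and bounds the recursion.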


@ -12,9 +12,17 @@
 #
 # The following parameters are used to configure Amazon DynamoDB Persistence.
 #
-# Further details at https://docs.openhab.org/addons/persistence/dynamodb/readme.html
+# Further details at https://www.openhab.org/addons/persistence/dynamodb/
 #
+# PID SETTING
+#
+# When configuring the persistence using a file (instead of the UI),
+# make sure the first line in the configuration file is the
+# pid definition (remove the comment prefix #)
+#pid:pid:org.openhab.dynamodb
 #
 # CONNECTION SETTINGS (follow OPTION 1 or OPTION 2)
 #
@ -26,50 +34,58 @@
 # OPTION 2 (using profilesConfigFile and profile)
 # where profilesConfigFile points to AWS credentials file
+# Please note that the user that runs openHAB must have appropriate read rights to the credentials file.
+# See below for an example of how the credentials file should look.
 #profilesConfigFile=/etc/openhab2/aws_creds
 #profile=fooprofile
 #region=eu-west-1
+# UNCOMMENT THE LINE BELOW ALWAYS (otherwise the legacy table schema with 'tablePrefix' is used)
+#table=openhab
 # Credentials file example:
 #
 # [fooprofile]
 # aws_access_key_id=AKIAIOSFODNN7EXAMPLE
 # aws_secret_access_key=3+AAAAABBBbbbCCCCCCdddddd+7mnbIOLH
 #
 # ADVANCED CONFIGURATION (OPTIONAL)
 #
+# Expire time for data in days (relative to the stored timestamp).
+# Data older than this is removed automatically using the DynamoDB Time to Live (TTL)
+# feature.
+#expireDays=
 # read capacity for the created tables
 #readCapacityUnits=1
 # write capacity for the created tables
 #writeCapacityUnits=1
-# table prefix used in the name of created tables
+# LEGACY SCHEMA: table prefix used in the name of created tables
 #tablePrefix=openhab-
 -->
 		<parameter name="region" type="text" required="true">
 			<label>AWS region ID</label>
-			<description><![CDATA[AWS region ID as described in Step 2 in Setting up Amazon account.<br />
+			<description><![CDATA[AWS region ID<br />
 			The region needs to match the region of the AWS user that will access Amazon DynamoDB.<br />
 			For example, eu-west-1.]]></description>
 		</parameter>
 		<parameter name="accessKey" type="text" required="false">
 			<label>AWS access key</label>
-			<description><![CDATA[AWS access key of the AWS user that will access Amazon DynamoDB.
-			<br />
+			<description><![CDATA[AWS access key<br />
 			Give either 1) access key and secret key, or 2) credentials file and profile name.
 			]]></description>
 		</parameter>
 		<parameter name="secretKey" type="text" required="false">
 			<label>AWS secret key</label>
-			<description><![CDATA[AWS secret key of the AWS user that will access Amazon DynamoDB.
-			<br />
+			<description><![CDATA[AWS secret key<br />
 			Give either 1) access key and secret key, or 2) credentials file and profile name.
 			]]></description>
 		</parameter>
@ -78,39 +94,54 @@
 		<parameter name="profilesConfigFile" type="text" required="false">
 			<label>AWS credentials file</label>
 			<description><![CDATA[Path to the AWS credentials file. <br />
-			For example, /etc/openhab2/aws_creds.
-			Please note that the user that runs openHAB must have approriate read rights to the credential file.
-			<br />
+			For example, /etc/openhab/aws_creds. Please note that the user that runs openHAB must have appropriate read rights to the credentials file. <br />
 			Give either 1) access key and secret key, or 2) credentials file and profile name.
 			]]></description>
 		</parameter>
 		<parameter name="profile" type="text" required="false">
 			<label>Profile name</label>
-			<description><![CDATA[Name of the profile to use in AWS credentials file
-			<br />
+			<description><![CDATA[Profile name in AWS credentials file. <br />
 			Give either 1) access key and secret key, or 2) credentials file and profile name.
 			]]></description>
 		</parameter>
+		<parameter name="table" type="text" required="false">
+			<label>Table</label>
+			<description><![CDATA[Table name. <br />
+			Specify this parameter over Table Prefix to use the new optimized table format.]]></description>
+			<default>openhab</default> <!-- set by default, preferring new schema format -->
+		</parameter>
 		<parameter name="readCapacityUnits" type="integer" required="false" min="1">
-			<description>Read capacity for the created tables. Default is 1.</description>
-			<label>Read capacity</label>
+			<description><![CDATA[Provisioned read capacity.<br />
+			Default is 1.]]></description>
+			<label>Read Capacity</label>
 			<advanced>true</advanced>
 		</parameter>
 		<parameter name="writeCapacityUnits" type="integer" required="false" min="1">
-			<label>Write capacity</label>
-			<description>Write capacity for the created tables. Default is 1.</description>
+			<label>Write Capacity</label>
+			<description><![CDATA[Provisioned write capacity.<br />
+			Default is 1.]]></description>
 			<advanced>true</advanced>
 		</parameter>
-		<parameter name="tablePrefix" type="text" required="false">
-			<label>Table prefix</label>
-			<description>Table prefix used in the name of created tables. Default is openhab-</description>
-			<advanced>true</advanced>
-		</parameter>
+		<parameter name="expireDays" type="integer" required="false" min="1">
+			<label>Data Expiry, in Days</label>
+			<description><![CDATA[Expire time for data.<br />
+			Data older than this is automatically removed by the DynamoDB Time to Live (TTL) feature. Use empty value to disable data expiration.
+			]]></description>
+			<advanced>true</advanced>
+			<default></default> <!-- empty by default, giving preference to new table schema -->
+		</parameter>
+		<parameter name="tablePrefix" type="text" required="false">
+			<label>Table Prefix</label>
+			<description><![CDATA[Legacy: Table prefix used in the name of created tables. <br />
+			Default is "openhab-"]]></description>
+			<advanced>true</advanced>
+			<default></default> <!-- empty by default, giving preference to new table schema -->
+		</parameter>
 	</config-description>
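Putting the parameters above together, a file-based configuration using the new single-table schema might look like this. The values are illustrative placeholders (the credentials are the AWS documentation's well-known sample values), and the pid definition line described in the comments above goes first:

```
# Example services/dynamodb.cfg (illustrative values)
region=eu-west-1
accessKey=AKIAIOSFODNN7EXAMPLE
secretKey=3+AAAAABBBbbbCCCCCCdddddd+7mnbIOLH
# New schema: single table for all items
table=openhab
# Optional: expire data after roughly one year via DynamoDB TTL
expireDays=365
```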


@ -17,7 +17,8 @@ import static org.junit.jupiter.api.Assertions.assertEquals;
 import java.io.IOException;

 import org.eclipse.jdt.annotation.NonNullByDefault;
-import org.junit.jupiter.api.Test;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.CsvSource;
 import org.openhab.core.library.items.CallItem;
 import org.openhab.core.library.items.ColorItem;
 import org.openhab.core.library.items.ContactItem;
@ -38,58 +39,71 @@ import org.openhab.core.library.items.SwitchItem;
 @NonNullByDefault
 public class AbstractDynamoDBItemGetDynamoItemClassTest {

-    @Test
-    public void testCallItem() throws IOException {
-        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(CallItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testCallItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(CallItem.class, legacy));
     }

-    @Test
-    public void testContactItem() throws IOException {
-        assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(ContactItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testContactItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(ContactItem.class, legacy));
     }

-    @Test
-    public void testDateTimeItem() throws IOException {
-        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(DateTimeItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testDateTimeItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(DateTimeItem.class, legacy));
     }

-    @Test
-    public void testStringItem() throws IOException {
-        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(StringItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testStringItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(StringItem.class, legacy));
     }

-    @Test
-    public void testLocationItem() throws IOException {
-        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(LocationItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testLocationItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(LocationItem.class, legacy));
     }

-    @Test
-    public void testNumberItem() throws IOException {
-        assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(NumberItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testNumberItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(NumberItem.class, legacy));
     }

-    @Test
-    public void testColorItem() throws IOException {
-        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(ColorItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testColorItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(ColorItem.class, legacy));
     }

-    @Test
-    public void testDimmerItem() throws IOException {
-        assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(DimmerItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testDimmerItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(DimmerItem.class, legacy));
     }

-    @Test
-    public void testPlayerItem() throws IOException {
-        assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(PlayerItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testPlayerItem(boolean legacy) throws IOException {
+        assertEquals(legacy ? DynamoDBStringItem.class : DynamoDBBigDecimalItem.class,
+                AbstractDynamoDBItem.getDynamoItemClass(PlayerItem.class, legacy));
     }

-    @Test
-    public void testRollershutterItem() throws IOException {
-        assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(RollershutterItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testRollershutterItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBBigDecimalItem.class,
+                AbstractDynamoDBItem.getDynamoItemClass(RollershutterItem.class, legacy));
     }

-    @Test
-    public void testOnOffTypeWithSwitchItem() throws IOException {
-        assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(SwitchItem.class));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testOnOffTypeWithSwitchItem(boolean legacy) throws IOException {
+        assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(SwitchItem.class, legacy));
     }
 }


@ -23,8 +23,9 @@ import java.util.Objects;
 import java.util.TimeZone;

 import org.eclipse.jdt.annotation.NonNullByDefault;
-import org.junit.jupiter.api.Test;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.CsvSource;
-import org.openhab.core.items.Item;
+import org.openhab.core.items.GenericItem;
 import org.openhab.core.library.items.CallItem;
 import org.openhab.core.library.items.ColorItem;
 import org.openhab.core.library.items.ContactItem;
@ -47,7 +48,6 @@ import org.openhab.core.library.types.StringType;
 import org.openhab.core.library.types.UpDownType;
 import org.openhab.core.persistence.HistoricItem;
 import org.openhab.core.types.State;
-import org.openhab.core.types.UnDefType;

 /**
  * Test for AbstractDynamoDBItem.fromState and AbstractDynamoDBItem.asHistoricItem for all kinds of states
@ -60,18 +60,24 @@ public class AbstractDynamoDBItemSerializationTest {
     private final ZonedDateTime date = ZonedDateTime.ofInstant(Instant.ofEpochSecond(400), ZoneId.systemDefault());

     /**
-     * Generic function testing serialization of item state to internal format in DB. In other words, conversion of
-     * Item with state to DynamoDBItem
+     * Generic function testing serialization of GenericItem state to internal format in DB. In other words,
+     * conversion of GenericItem with state to DynamoDBItem
      *
-     * @param state item state
-     * @param expectedState internal format in DB representing the item state
+     * @param legacy whether we use the legacy schema
+     * @param item GenericItem
+     * @param stateOverride state to set on the item before serialization
+     * @param expectedState internal format in DB representing the GenericItem state
      * @return dynamo db item
      * @throws IOException
      */
-    public DynamoDBItem<?> testStateGeneric(State state, Object expectedState) throws IOException {
-        DynamoDBItem<?> dbItem = AbstractDynamoDBItem.fromState("item1", state, date);
+    public DynamoDBItem<?> testSerializationToDTO(boolean legacy, GenericItem item, State stateOverride,
+            Object expectedState) throws IOException {
+        item.setState(stateOverride);
+        DynamoDBItem<?> dbItem = legacy ? AbstractDynamoDBItem.fromStateLegacy(item, date)
+                : AbstractDynamoDBItem.fromStateNew(item, date, null);

-        assertEquals("item1", dbItem.getName());
+        assertEquals("foo", dbItem.getName());
         assertEquals(date, dbItem.getTime());
         Object actualState = dbItem.getState();
         assertNotNull(actualState);
@ -91,19 +97,20 @@ public class AbstractDynamoDBItemSerializationTest {
      * Test state deserialization, that is DynamoDBItem conversion to HistoricItem
      *
      * @param dbItem dynamo db item
-     * @param item parameter for DynamoDBItem.asHistoricItem
+     * @param item GenericItem parameter for DynamoDBItem.asHistoricItem
      * @param expectedState Expected state of the historic item. DecimalTypes are compared with reduced accuracy
      * @return
      * @throws IOException
      */
-    public HistoricItem testAsHistoricGeneric(DynamoDBItem<?> dbItem, Item item, Object expectedState)
+    public HistoricItem testAsHistoricGeneric(DynamoDBItem<?> dbItem, GenericItem item, Object expectedState)
             throws IOException {
         HistoricItem historicItem = dbItem.asHistoricItem(item);
-        assertNotNull(historicItem);
-        assertEquals("item1", historicItem.getName());
+        assert historicItem != null; // getting rid of null pointer access warning
+        assertEquals("foo", historicItem.getName());
         assertEquals(date, historicItem.getTimestamp());
         assertEquals(expectedState.getClass(), historicItem.getState().getClass());
-        if (expectedState instanceof DecimalType) {
+        if (expectedState.getClass() == DecimalType.class) {
             // serialization loses accuracy, take this into consideration
             BigDecimal expectedRounded = DynamoDBBigDecimalItem
                     .loseDigits(((DecimalType) expectedState).toBigDecimal());
@ -117,148 +124,189 @@ public class AbstractDynamoDBItemSerializationTest {
         return historicItem;
     }

-    @Test
-    public void testUndefWithNumberItem() throws IOException {
-        final DynamoDBItem<?> dbitem = testStateGeneric(UnDefType.UNDEF, "<org.openhab.core.types.UnDefType.UNDEF>");
-        assertTrue(dbitem instanceof DynamoDBStringItem);
-        testAsHistoricGeneric(dbitem, new NumberItem("foo"), UnDefType.UNDEF);
-    }
-
-    @Test
-    public void testCallTypeWithCallItem() throws IOException {
-        final DynamoDBItem<?> dbitem = testStateGeneric(new StringListType("origNum", "destNum"), "origNum,destNum");
-        testAsHistoricGeneric(dbitem, new CallItem("foo"), new StringListType("origNum", "destNum"));
-    }
-
-    @Test
-    public void testOpenClosedTypeWithContactItem() throws IOException {
-        final DynamoDBItem<?> dbitemOpen = testStateGeneric(OpenClosedType.CLOSED, BigDecimal.ZERO);
-        testAsHistoricGeneric(dbitemOpen, new ContactItem("foo"), OpenClosedType.CLOSED);
-        final DynamoDBItem<?> dbitemClosed = testStateGeneric(OpenClosedType.OPEN, BigDecimal.ONE);
-        testAsHistoricGeneric(dbitemClosed, new ContactItem("foo"), OpenClosedType.OPEN);
-    }
-
-    @Test
-    public void testDateTimeTypeWithDateTimeItem() throws IOException {
-        ZonedDateTime zdt = ZonedDateTime.parse("2016-05-01T13:46:00.050Z");
-        DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(zdt.toString()), "2016-05-01T13:46:00.050Z");
-        testAsHistoricGeneric(dbitem, new DateTimeItem("foo"),
-                new DateTimeType(zdt.withZoneSameInstant(ZoneId.systemDefault())));
-    }
-
-    @Test
-    public void testDateTimeTypeWithStringItem() throws IOException {
-        DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(ZonedDateTime.parse("2016-05-01T13:46:00.050Z")),
-                "2016-05-01T13:46:00.050Z");
-        testAsHistoricGeneric(dbitem, new StringItem("foo"), new StringType("2016-05-01T13:46:00.050Z"));
-    }
-
-    @Test
-    public void testDateTimeTypeLocalWithDateTimeItem() throws IOException {
-        DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType("2016-07-17T19:38:07.050+0300"),
-                "2016-07-17T16:38:07.050Z");
-        ZonedDateTime expectedZdt = Instant.ofEpochMilli(1468773487050L).atZone(ZoneId.systemDefault());
-        testAsHistoricGeneric(dbitem, new DateTimeItem("foo"), new DateTimeType(expectedZdt));
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testCallTypeWithCallItemLegacy(boolean legacy) throws IOException {
+        GenericItem item = new CallItem("foo");
+        final DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, new StringListType("origNum", "destNum"),
+                "origNum,destNum");
+        testAsHistoricGeneric(dbitem, item, new StringListType("origNum", "destNum"));
+    }
+
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testOpenClosedTypeWithContactItem(boolean legacy) throws IOException {
+        GenericItem item = new ContactItem("foo");
+        final DynamoDBItem<?> dbitemOpen = testSerializationToDTO(legacy, item, OpenClosedType.CLOSED, BigDecimal.ZERO);
+        testAsHistoricGeneric(dbitemOpen, item, OpenClosedType.CLOSED);
+        final DynamoDBItem<?> dbitemClosed = testSerializationToDTO(legacy, item, OpenClosedType.OPEN, BigDecimal.ONE);
+        testAsHistoricGeneric(dbitemClosed, item, OpenClosedType.OPEN);
+    }
+
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testDateTimeTypeWithDateTimeItem(boolean legacy) throws IOException {
+        GenericItem item = new DateTimeItem("foo");
+        ZonedDateTime zdt = ZonedDateTime.parse("2016-05-01T13:46:00.050Z");
+        DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, new DateTimeType(zdt.toString()),
+                "2016-05-01T13:46:00.050Z");
+        testAsHistoricGeneric(dbitem, item, new DateTimeType(zdt.withZoneSameInstant(ZoneId.systemDefault())));
+    }
+
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testDateTimeTypeWithStringItem(boolean legacy) throws IOException {
+        GenericItem item = new StringItem("foo");
+        DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item,
+                new DateTimeType(ZonedDateTime.parse("2016-05-01T13:46:00.050Z")), "2016-05-01T13:46:00.050Z");
+        testAsHistoricGeneric(dbitem, item, new StringType("2016-05-01T13:46:00.050Z"));
+    }
+
+    @ParameterizedTest
+    @CsvSource({ "true", "false" })
+    public void testDateTimeTypeLocalWithDateTimeItem(boolean legacy) throws IOException {
+        GenericItem item = new DateTimeItem("foo");
+        ZonedDateTime expectedZdt = Instant.ofEpochMilli(1468773487050L).atZone(ZoneId.systemDefault());
+        DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, new DateTimeType("2016-07-17T19:38:07.050+0300"),
+                "2016-07-17T16:38:07.050Z");
+        testAsHistoricGeneric(dbitem, item, new DateTimeType(expectedZdt));
} }
@Test @ParameterizedTest
public void testDateTimeTypeLocalWithStringItem() throws IOException { @CsvSource({ "true", "false" })
public void testDateTimeTypeLocalWithStringItem(boolean legacy) throws IOException {
GenericItem item = new StringItem("foo");
Instant instant = Instant.ofEpochMilli(1468773487050L); // GMT: Sun, 17 Jul 2016 16:38:07.050 GMT Instant instant = Instant.ofEpochMilli(1468773487050L); // GMT: Sun, 17 Jul 2016 16:38:07.050 GMT
ZonedDateTime zdt = instant.atZone(TimeZone.getTimeZone("GMT+03:00").toZoneId()); ZonedDateTime zdt = instant.atZone(TimeZone.getTimeZone("GMT+03:00").toZoneId());
DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(zdt), "2016-07-17T16:38:07.050Z"); DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, new DateTimeType(zdt),
testAsHistoricGeneric(dbitem, new StringItem("foo"), new StringType("2016-07-17T16:38:07.050Z")); "2016-07-17T16:38:07.050Z");
testAsHistoricGeneric(dbitem, item, new StringType("2016-07-17T16:38:07.050Z"));
} }
@Test @ParameterizedTest
public void testPointTypeWithLocationItem() throws IOException { @CsvSource({ "true", "false" })
public void testPointTypeWithLocationItem(boolean legacy) throws IOException {
GenericItem item = new LocationItem("foo");
final PointType point = new PointType(new DecimalType(60.3), new DecimalType(30.2), new DecimalType(510.90)); final PointType point = new PointType(new DecimalType(60.3), new DecimalType(30.2), new DecimalType(510.90));
String expected = point.getLatitude().toBigDecimal().toString() + "," String expected = point.getLatitude().toBigDecimal().toString() + ","
+ point.getLongitude().toBigDecimal().toString() + "," + point.getAltitude().toBigDecimal().toString(); + point.getLongitude().toBigDecimal().toString() + "," + point.getAltitude().toBigDecimal().toString();
DynamoDBItem<?> dbitem = testStateGeneric(point, expected); DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, point, expected);
testAsHistoricGeneric(dbitem, new LocationItem("foo"), point); testAsHistoricGeneric(dbitem, item, point);
} }
@Test @ParameterizedTest
public void testDecimalTypeWithNumberItem() throws IOException { @CsvSource({ "true", "false" })
DynamoDBItem<?> dbitem = testStateGeneric(new DecimalType("3.2"), new BigDecimal("3.2")); public void testDecimalTypeWithNumberItem(boolean legacy) throws IOException {
testAsHistoricGeneric(dbitem, new NumberItem("foo"), new DecimalType("3.2")); GenericItem item = new NumberItem("foo");
DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, new DecimalType("3.2"), new BigDecimal("3.2"));
testAsHistoricGeneric(dbitem, item, new DecimalType("3.2"));
} }
@Test @ParameterizedTest
public void testPercentTypeWithColorItem() throws IOException { @CsvSource({ "true", "false" })
DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2")); public void testPercentTypeWithColorItem(boolean legacy) throws IOException {
testAsHistoricGeneric(dbitem, new ColorItem("foo"), new PercentType(new BigDecimal("3.2"))); GenericItem item = new ColorItem("foo");
DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, new PercentType(new BigDecimal("3.2")),
"0,0,3.2");
testAsHistoricGeneric(dbitem, item, new HSBType(DecimalType.ZERO, PercentType.ZERO, new PercentType("3.2")));
} }
@Test @ParameterizedTest
public void testPercentTypeWithDimmerItem() throws IOException { @CsvSource({ "true", "false" })
DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2")); public void testPercentTypeWithDimmerItem(boolean legacy) throws IOException {
testAsHistoricGeneric(dbitem, new DimmerItem("foo"), new PercentType(new BigDecimal("3.2"))); GenericItem item = new DimmerItem("foo");
DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, new PercentType(new BigDecimal("3.2")),
new BigDecimal("3.2"));
testAsHistoricGeneric(dbitem, item, new PercentType(new BigDecimal("3.2")));
} }
@Test @ParameterizedTest
public void testPercentTypeWithRollerShutterItem() throws IOException { @CsvSource({ "true", "false" })
DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2")); public void testPercentTypeWithRollerShutterItem(boolean legacy) throws IOException {
testAsHistoricGeneric(dbitem, new RollershutterItem("foo"), new PercentType(new BigDecimal("3.2"))); GenericItem item = new RollershutterItem("foo");
DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, new PercentType(81), new BigDecimal("81"));
testAsHistoricGeneric(dbitem, item, new PercentType(81));
} }
@Test @ParameterizedTest
public void testPercentTypeWithNumberItem() throws IOException { @CsvSource({ "true", "false" })
DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2")); public void testUpDownTypeWithRollershutterItem(boolean legacy) throws IOException {
// note: comes back as DecimalType instead of the original PercentType GenericItem item = new RollershutterItem("foo");
testAsHistoricGeneric(dbitem, new NumberItem("foo"), new DecimalType(new BigDecimal("3.2")));
}
@Test
public void testUpDownTypeWithRollershutterItem() throws IOException {
// note: comes back as PercentType instead of the original UpDownType // note: comes back as PercentType instead of the original UpDownType
DynamoDBItem<?> dbItemDown = testStateGeneric(UpDownType.DOWN, BigDecimal.ZERO); {
testAsHistoricGeneric(dbItemDown, new RollershutterItem("foo"), new PercentType(BigDecimal.ZERO)); // down == 1.0 = 100%
State expectedDeserializedState = PercentType.HUNDRED;
DynamoDBItem<?> dbItemDown = testSerializationToDTO(legacy, item, UpDownType.DOWN, new BigDecimal(100));
testAsHistoricGeneric(dbItemDown, item, expectedDeserializedState);
assertEquals(UpDownType.DOWN, expectedDeserializedState.as(UpDownType.class));
}
DynamoDBItem<?> dbItemUp = testStateGeneric(UpDownType.UP, BigDecimal.ONE); {
testAsHistoricGeneric(dbItemUp, new RollershutterItem("foo"), new PercentType(BigDecimal.ONE)); // up == 0
State expectedDeserializedState = PercentType.ZERO;
DynamoDBItem<?> dbItemUp = testSerializationToDTO(legacy, item, UpDownType.UP, BigDecimal.ZERO);
testAsHistoricGeneric(dbItemUp, item, expectedDeserializedState);
assertEquals(UpDownType.UP, expectedDeserializedState.as(UpDownType.class));
}
} }
@Test @ParameterizedTest
public void testStringTypeWithStringItem() throws IOException { @CsvSource({ "true", "false" })
DynamoDBItem<?> dbitem = testStateGeneric(new StringType("foo bar"), "foo bar"); public void testStringTypeWithStringItem(boolean legacy) throws IOException {
testAsHistoricGeneric(dbitem, new StringItem("foo"), new StringType("foo bar")); GenericItem item = new StringItem("foo");
DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, new StringType("foo bar"), "foo bar");
testAsHistoricGeneric(dbitem, item, new StringType("foo bar"));
} }
@Test @ParameterizedTest
public void testOnOffTypeWithColorItem() throws IOException { @CsvSource({ "true", "false" })
DynamoDBItem<?> dbitemOff = testStateGeneric(OnOffType.OFF, BigDecimal.ZERO); public void testOnOffTypeWithColorItem(boolean legacy) throws IOException {
testAsHistoricGeneric(dbitemOff, new ColorItem("foo"), new PercentType(BigDecimal.ZERO)); GenericItem item = new ColorItem("foo");
DynamoDBItem<?> dbitemOff = testSerializationToDTO(legacy, item, OnOffType.OFF, "0,0,0");
testAsHistoricGeneric(dbitemOff, item, HSBType.BLACK);
DynamoDBItem<?> dbitemOn = testStateGeneric(OnOffType.ON, BigDecimal.ONE); DynamoDBItem<?> dbitemOn = testSerializationToDTO(legacy, item, OnOffType.ON, "0,0,100");
testAsHistoricGeneric(dbitemOn, new ColorItem("foo"), new PercentType(BigDecimal.ONE)); testAsHistoricGeneric(dbitemOn, item, HSBType.WHITE);
} }
@Test @ParameterizedTest
public void testOnOffTypeWithDimmerItem() throws IOException { @CsvSource({ "true", "false" })
DynamoDBItem<?> dbitemOff = testStateGeneric(OnOffType.OFF, BigDecimal.ZERO); public void testOnOffTypeWithDimmerItem(boolean legacy) throws IOException {
testAsHistoricGeneric(dbitemOff, new DimmerItem("foo"), new PercentType(BigDecimal.ZERO)); GenericItem item = new DimmerItem("foo");
{
State expectedDeserializedState = PercentType.ZERO;
DynamoDBItem<?> dbitemOff = testSerializationToDTO(legacy, item, OnOffType.OFF, BigDecimal.ZERO);
testAsHistoricGeneric(dbitemOff, item, expectedDeserializedState);
assertEquals(OnOffType.OFF, expectedDeserializedState.as(OnOffType.class));
}
DynamoDBItem<?> dbitemOn = testStateGeneric(OnOffType.ON, BigDecimal.ONE); {
testAsHistoricGeneric(dbitemOn, new DimmerItem("foo"), new PercentType(BigDecimal.ONE)); State expectedDeserializedState = PercentType.HUNDRED;
DynamoDBItem<?> dbitemOn = testSerializationToDTO(legacy, item, OnOffType.ON, new BigDecimal(100));
testAsHistoricGeneric(dbitemOn, item, expectedDeserializedState);
assertEquals(OnOffType.ON, expectedDeserializedState.as(OnOffType.class));
}
} }
@Test @ParameterizedTest
public void testOnOffTypeWithSwitchItem() throws IOException { @CsvSource({ "true", "false" })
DynamoDBItem<?> dbitemOff = testStateGeneric(OnOffType.OFF, BigDecimal.ZERO); public void testOnOffTypeWithSwitchItem(boolean legacy) throws IOException {
testAsHistoricGeneric(dbitemOff, new SwitchItem("foo"), OnOffType.OFF); GenericItem item = new SwitchItem("foo");
DynamoDBItem<?> dbitemOff = testSerializationToDTO(legacy, item, OnOffType.OFF, BigDecimal.ZERO);
testAsHistoricGeneric(dbitemOff, item, OnOffType.OFF);
DynamoDBItem<?> dbitemOn = testStateGeneric(OnOffType.ON, BigDecimal.ONE); DynamoDBItem<?> dbitemOn = testSerializationToDTO(legacy, item, OnOffType.ON, BigDecimal.ONE);
testAsHistoricGeneric(dbitemOn, new SwitchItem("foo"), OnOffType.ON); testAsHistoricGeneric(dbitemOn, item, OnOffType.ON);
} }
@Test @ParameterizedTest
public void testHSBTypeWithColorItem() throws IOException { @CsvSource({ "true", "false" })
public void testHSBTypeWithColorItem(boolean legacy) throws IOException {
GenericItem item = new ColorItem("foo");
HSBType hsb = new HSBType(new DecimalType(1.5), new PercentType(new BigDecimal(2.5)), HSBType hsb = new HSBType(new DecimalType(1.5), new PercentType(new BigDecimal(2.5)),
new PercentType(new BigDecimal(3.5))); new PercentType(new BigDecimal(3.5)));
DynamoDBItem<?> dbitem = testStateGeneric(hsb, "1.5,2.5,3.5"); DynamoDBItem<?> dbitem = testSerializationToDTO(legacy, item, hsb, "1.5,2.5,3.5");
testAsHistoricGeneric(dbitem, new ColorItem("foo"), hsb); testAsHistoricGeneric(dbitem, item, hsb);
} }
} }
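The rollershutter and dimmer tests above pin down a notable encoding convention in the new schema: `UpDownType.DOWN` serializes as 100 (fully closed) and `UP` as 0, whereas the old tests stored `DOWN` as 0 and `UP` as 1. A stand-alone sketch of that round-trip, using plain-Java stand-ins rather than openHAB types (`UpDown`, `toPercent`, and `fromPercent` are illustrative names, not openHAB API):

```java
public class Main {
    // Hypothetical stand-in for the UpDownType <-> PercentType mapping the
    // tests exercise: DOWN (fully closed) is 100 %, UP (fully open) is 0 %.
    enum UpDown { UP, DOWN }

    static int toPercent(UpDown command) {
        return command == UpDown.DOWN ? 100 : 0;
    }

    static UpDown fromPercent(int percent) {
        // Deserialization comes back as a percentage; 100 maps to DOWN, 0 to UP
        return percent == 100 ? UpDown.DOWN : UpDown.UP;
    }

    public static void main(String[] args) {
        // Round-trip both states, mirroring testUpDownTypeWithRollershutterItem
        if (fromPercent(toPercent(UpDown.DOWN)) != UpDown.DOWN) throw new AssertionError();
        if (fromPercent(toPercent(UpDown.UP)) != UpDown.UP) throw new AssertionError();
        System.out.println("round-trip ok");
    }
}
```

The same 0/100 convention shows up for `OnOffType` with dimmer items (`OFF` → 0, `ON` → 100), so a deserialized `PercentType` converts back to the original command via `State.as(...)`.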

View File

@@ -21,7 +21,6 @@ import java.util.Iterator;

import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.junit.jupiter.api.Test;
import org.openhab.core.persistence.FilterCriteria;
import org.openhab.core.persistence.FilterCriteria.Operator;

@@ -85,23 +84,15 @@ public abstract class AbstractTwoItemIntegrationTest extends BaseIntegrationTest

        assertEquals(expected, actual);
    }

    /**
     * Asserts that iterable contains correct items and nothing else
     */
    protected void assertIterableContainsItems(Iterable<HistoricItem> iterable, boolean ascending) {
        Iterator<HistoricItem> iterator = iterable.iterator();
        HistoricItem actual1 = iterator.next();
        HistoricItem actual2 = iterator.next();
        assertFalse(iterator.hasNext());
@ -129,228 +120,287 @@ public abstract class AbstractTwoItemIntegrationTest extends BaseIntegrationTest
@Test @Test
public void testQueryUsingName() { public void testQueryUsingName() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOrdering(Ordering.ASCENDING); FilterCriteria criteria = new FilterCriteria();
criteria.setItemName(getItemName()); criteria.setOrdering(Ordering.ASCENDING);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setItemName(getItemName());
assertIterableContainsItems(iterable, true); @SuppressWarnings("null")
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertIterableContainsItems(iterable, true);
});
} }
@Test @Test
public void testQueryUsingNameAndStart() { public void testQueryUsingNameAndStart() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOrdering(Ordering.ASCENDING); FilterCriteria criteria = new FilterCriteria();
criteria.setItemName(getItemName()); criteria.setOrdering(Ordering.ASCENDING);
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setBeginDate(beforeStore);
assertIterableContainsItems(iterable, true); @SuppressWarnings("null")
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertIterableContainsItems(iterable, true);
});
} }
@Test @Test
public void testQueryUsingNameAndStartNoMatch() { public void testQueryUsingNameAndStartNoMatch() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setItemName(getItemName());
criteria.setBeginDate(afterStore2); FilterCriteria criteria = new FilterCriteria();
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setItemName(getItemName());
assertFalse(iterable.iterator().hasNext()); criteria.setBeginDate(afterStore2);
@SuppressWarnings("null")
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertFalse(iterable.iterator().hasNext());
});
} }
@Test @Test
public void testQueryUsingNameAndEnd() { public void testQueryUsingNameAndEnd() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOrdering(Ordering.ASCENDING); FilterCriteria criteria = new FilterCriteria();
criteria.setItemName(getItemName()); criteria.setOrdering(Ordering.ASCENDING);
criteria.setEndDate(afterStore2); criteria.setItemName(getItemName());
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
assertIterableContainsItems(iterable, true); @SuppressWarnings("null")
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertIterableContainsItems(iterable, true);
});
} }
@Test @Test
public void testQueryUsingNameAndEndNoMatch() { public void testQueryUsingNameAndEndNoMatch() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setItemName(getItemName()); FilterCriteria criteria = new FilterCriteria();
criteria.setEndDate(beforeStore); criteria.setItemName(getItemName());
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(beforeStore);
assertFalse(iterable.iterator().hasNext()); @SuppressWarnings("null")
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertFalse(iterable.iterator().hasNext());
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEnd() { public void testQueryUsingNameAndStartAndEnd() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOrdering(Ordering.ASCENDING); FilterCriteria criteria = new FilterCriteria();
criteria.setItemName(getItemName()); criteria.setOrdering(Ordering.ASCENDING);
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
criteria.setEndDate(afterStore2); criteria.setBeginDate(beforeStore);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
assertIterableContainsItems(iterable, true); @SuppressWarnings("null")
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertIterableContainsItems(iterable, true);
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEndDesc() { public void testQueryUsingNameAndStartAndEndDesc() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOrdering(Ordering.DESCENDING); FilterCriteria criteria = new FilterCriteria();
criteria.setItemName(getItemName()); criteria.setOrdering(Ordering.DESCENDING);
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
criteria.setEndDate(afterStore2); criteria.setBeginDate(beforeStore);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
assertIterableContainsItems(iterable, false); @SuppressWarnings("null")
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertIterableContainsItems(iterable, false);
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEndWithNEQOperator() { public void testQueryUsingNameAndStartAndEndWithNEQOperator() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOperator(Operator.NEQ); FilterCriteria criteria = new FilterCriteria();
criteria.setState(getSecondItemState()); criteria.setOperator(Operator.NEQ);
criteria.setItemName(getItemName()); criteria.setState(getSecondItemState());
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
criteria.setEndDate(afterStore2); criteria.setBeginDate(beforeStore);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
Iterator<HistoricItem> iterator = iterable.iterator(); @SuppressWarnings("null")
HistoricItem actual1 = iterator.next(); Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertFalse(iterator.hasNext()); Iterator<HistoricItem> iterator = iterable.iterator();
assertStateEquals(getFirstItemState(), actual1.getState()); assertTrue(iterator.hasNext());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant())); HistoricItem actual1 = iterator.next();
assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant())); assertFalse(iterator.hasNext());
assertStateEquals(getFirstItemState(), actual1.getState());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEndWithEQOperator() { public void testQueryUsingNameAndStartAndEndWithEQOperator() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOperator(Operator.EQ); FilterCriteria criteria = new FilterCriteria();
criteria.setState(getFirstItemState()); criteria.setOperator(Operator.EQ);
criteria.setItemName(getItemName()); criteria.setState(getFirstItemState());
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
criteria.setEndDate(afterStore2); criteria.setBeginDate(beforeStore);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
Iterator<HistoricItem> iterator = iterable.iterator(); @SuppressWarnings("null")
HistoricItem actual1 = iterator.next(); Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertFalse(iterator.hasNext()); Iterator<HistoricItem> iterator = iterable.iterator();
assertStateEquals(getFirstItemState(), actual1.getState()); assertTrue(iterator.hasNext());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant())); HistoricItem actual1 = iterator.next();
assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant())); assertFalse(iterator.hasNext());
assertStateEquals(getFirstItemState(), actual1.getState());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEndWithLTOperator() { public void testQueryUsingNameAndStartAndEndWithLTOperator() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOperator(Operator.LT); FilterCriteria criteria = new FilterCriteria();
criteria.setState(getSecondItemState()); criteria.setOperator(Operator.LT);
criteria.setItemName(getItemName()); criteria.setState(getSecondItemState());
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
criteria.setEndDate(afterStore2); criteria.setBeginDate(beforeStore);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
Iterator<HistoricItem> iterator = iterable.iterator(); @SuppressWarnings("null")
HistoricItem actual1 = iterator.next(); Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertFalse(iterator.hasNext()); Iterator<HistoricItem> iterator = iterable.iterator();
assertStateEquals(getFirstItemState(), actual1.getState()); assertTrue(iterator.hasNext());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant())); HistoricItem actual1 = iterator.next();
assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant())); assertFalse(iterator.hasNext());
assertStateEquals(getFirstItemState(), actual1.getState());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEndWithLTOperatorNoMatch() { public void testQueryUsingNameAndStartAndEndWithLTOperatorNoMatch() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOperator(Operator.LT); FilterCriteria criteria = new FilterCriteria();
criteria.setState(getFirstItemState()); criteria.setOperator(Operator.LT);
criteria.setItemName(getItemName()); criteria.setState(getFirstItemState());
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
criteria.setEndDate(afterStore2); criteria.setBeginDate(beforeStore);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
Iterator<HistoricItem> iterator = iterable.iterator(); @SuppressWarnings("null")
assertFalse(iterator.hasNext()); Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
Iterator<HistoricItem> iterator = iterable.iterator();
assertFalse(iterator.hasNext());
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEndWithLTEOperator() { public void testQueryUsingNameAndStartAndEndWithLTEOperator() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOperator(Operator.LTE); FilterCriteria criteria = new FilterCriteria();
criteria.setState(getFirstItemState()); criteria.setOperator(Operator.LTE);
criteria.setItemName(getItemName()); criteria.setState(getFirstItemState());
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
criteria.setEndDate(afterStore2); criteria.setBeginDate(beforeStore);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
Iterator<HistoricItem> iterator = iterable.iterator(); @SuppressWarnings("null")
HistoricItem actual1 = iterator.next(); Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertFalse(iterator.hasNext()); Iterator<HistoricItem> iterator = iterable.iterator();
assertStateEquals(getFirstItemState(), actual1.getState()); assertTrue(iterator.hasNext());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant())); HistoricItem actual1 = iterator.next();
assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant())); assertFalse(iterator.hasNext());
assertStateEquals(getFirstItemState(), actual1.getState());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEndWithGTOperator() { public void testQueryUsingNameAndStartAndEndWithGTOperator() {
// Skip for subclasses which have null "state between" waitForAssert(() -> {
assumeTrue(getQueryItemStateBetween() != null); // Skip for subclasses which have null "state between"
assumeTrue(getQueryItemStateBetween() != null);
FilterCriteria criteria = new FilterCriteria(); FilterCriteria criteria = new FilterCriteria();
criteria.setOperator(Operator.GT); criteria.setOperator(Operator.GT);
criteria.setState(getQueryItemStateBetween()); criteria.setState(getQueryItemStateBetween());
criteria.setItemName(getItemName()); criteria.setItemName(getItemName());
criteria.setBeginDate(beforeStore); criteria.setBeginDate(beforeStore);
criteria.setEndDate(afterStore2); criteria.setEndDate(afterStore2);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); @SuppressWarnings("null")
Iterator<HistoricItem> iterator = iterable.iterator(); Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
HistoricItem actual1 = iterator.next(); Iterator<HistoricItem> iterator = iterable.iterator();
assertFalse(iterator.hasNext()); assertTrue(iterator.hasNext());
assertStateEquals(getSecondItemState(), actual1.getState()); HistoricItem actual1 = iterator.next();
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore2.toInstant())); assertFalse(iterator.hasNext());
assertTrue(actual1.getTimestamp().toInstant().isAfter(afterStore1.toInstant())); assertStateEquals(getSecondItemState(), actual1.getState());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore2.toInstant()));
assertTrue(actual1.getTimestamp().toInstant().isAfter(afterStore1.toInstant()));
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEndWithGTOperatorNoMatch() { public void testQueryUsingNameAndStartAndEndWithGTOperatorNoMatch() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOperator(Operator.GT); FilterCriteria criteria = new FilterCriteria();
criteria.setState(getSecondItemState()); criteria.setOperator(Operator.GT);
criteria.setItemName(getItemName()); criteria.setState(getSecondItemState());
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
criteria.setEndDate(afterStore2); criteria.setBeginDate(beforeStore);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
Iterator<HistoricItem> iterator = iterable.iterator(); @SuppressWarnings("null")
assertFalse(iterator.hasNext()); Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
Iterator<HistoricItem> iterator = iterable.iterator();
assertFalse(iterator.hasNext());
});
} }
@Test @Test
public void testQueryUsingNameAndStartAndEndWithGTEOperator() { public void testQueryUsingNameAndStartAndEndWithGTEOperator() {
FilterCriteria criteria = new FilterCriteria(); waitForAssert(() -> {
criteria.setOperator(Operator.GTE); FilterCriteria criteria = new FilterCriteria();
criteria.setState(getSecondItemState()); criteria.setOperator(Operator.GTE);
criteria.setItemName(getItemName()); criteria.setState(getSecondItemState());
criteria.setBeginDate(beforeStore); criteria.setItemName(getItemName());
criteria.setEndDate(afterStore2); criteria.setBeginDate(beforeStore);
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria); criteria.setEndDate(afterStore2);
Iterator<HistoricItem> iterator = iterable.iterator(); @SuppressWarnings("null")
HistoricItem actual1 = iterator.next(); Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
assertFalse(iterator.hasNext()); Iterator<HistoricItem> iterator = iterable.iterator();
assertStateEquals(getSecondItemState(), actual1.getState()); assertTrue(iterator.hasNext());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore2.toInstant())); HistoricItem actual1 = iterator.next();
assertTrue(actual1.getTimestamp().toInstant().isAfter(afterStore1.toInstant())); assertFalse(iterator.hasNext());
assertStateEquals(getSecondItemState(), actual1.getState());
assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore2.toInstant()));
assertTrue(actual1.getTimestamp().toInstant().isAfter(afterStore1.toInstant()));
});
} }
    @Test
    public void testQueryUsingNameAndStartAndEndFirst() {
        waitForAssert(() -> {
            FilterCriteria criteria = new FilterCriteria();
            criteria.setOrdering(Ordering.ASCENDING);
            criteria.setItemName(getItemName());
            criteria.setBeginDate(beforeStore);
            criteria.setEndDate(afterStore1);
            @SuppressWarnings("null")
            Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
            Iterator<HistoricItem> iterator = iterable.iterator();
            assertTrue(iterator.hasNext());
            HistoricItem actual1 = iterator.next();
            assertFalse(iterator.hasNext());
            assertStateEquals(getFirstItemState(), actual1.getState());
            assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
            assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
        });
    }
    @Test
    public void testQueryUsingNameAndStartAndEndNoMatch() {
        waitForAssert(() -> {
            FilterCriteria criteria = new FilterCriteria();
            criteria.setItemName(getItemName());
            criteria.setBeginDate(beforeStore);
            criteria.setEndDate(beforeStore); // sic
            @SuppressWarnings("null")
            Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
            assertFalse(iterable.iterator().hasNext());
        });
    }
}
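The tests above wrap their assertions in `waitForAssert`, which re-runs the assertion block until it passes or a timeout elapses (with longer timeouts against real DynamoDB). A minimal, dependency-free sketch of that poll-until-timeout pattern — illustrative names only, not the openHAB `JavaTest` implementation:

```java
public class PollingAssert {
    /** Re-runs the assertion until it stops throwing, or the timeout elapses. */
    public static void waitForAssert(Runnable assertion, long timeoutMillis, long pollIntervalMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        AssertionError lastError;
        do {
            try {
                assertion.run();
                return; // assertion passed
            } catch (AssertionError e) {
                lastError = e; // remember the latest failure for reporting
            }
            try {
                Thread.sleep(pollIntervalMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                throw new AssertionError("interrupted while waiting", e);
            }
        } while (System.currentTimeMillis() < deadline);
        throw lastError; // report the most recent assertion failure
    }

    public static void main(String[] args) {
        long start = System.currentTimeMillis();
        // Fails for the first ~50 ms, then passes; polled every 10 ms.
        waitForAssert(() -> {
            if (System.currentTimeMillis() - start < 50) {
                throw new AssertionError("not yet");
            }
        }, 1_000, 10);
        System.out.println("assertion eventually passed");
    }
}
```

This tolerates eventual consistency: a query against DynamoDB may not see freshly stored data immediately, so a single immediate assertion would be flaky.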

View File

@ -12,16 +12,34 @@
 */
package org.openhab.persistence.dynamodb.internal;

import static org.junit.jupiter.api.Assertions.*;
import static org.mockito.Mockito.when;

import java.net.ServerSocket;
import java.net.URI;
import java.util.Collection;
import java.util.HashMap;
import java.util.Hashtable;
import java.util.Map;
import java.util.Map.Entry;
import java.util.concurrent.ExecutionException;
import java.util.stream.Stream;

import javax.measure.Unit;
import javax.measure.quantity.Dimensionless;
import javax.measure.quantity.Temperature;

import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.TestInfo;
import org.mockito.Mockito;
import org.openhab.core.common.registry.RegistryChangeListener;
import org.openhab.core.i18n.UnitProvider;
import org.openhab.core.internal.i18n.I18nProviderImpl;
import org.openhab.core.items.GenericItem;
import org.openhab.core.items.GroupItem;
import org.openhab.core.items.Item;
import org.openhab.core.items.ItemNotFoundException;
import org.openhab.core.items.ItemNotUniqueException;
@ -38,10 +56,21 @@ import org.openhab.core.library.items.PlayerItem;
import org.openhab.core.library.items.RollershutterItem;
import org.openhab.core.library.items.StringItem;
import org.openhab.core.library.items.SwitchItem;
import org.openhab.core.library.unit.SIUnits;
import org.openhab.core.library.unit.Units;
import org.openhab.core.test.java.JavaTest;
import org.osgi.framework.BundleContext;
import org.osgi.service.component.ComponentContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.amazonaws.services.dynamodbv2.local.main.ServerRunner;
import com.amazonaws.services.dynamodbv2.local.server.DynamoDBProxyServer;

import software.amazon.awssdk.core.waiters.WaiterResponse;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.model.DescribeTableResponse;
import software.amazon.awssdk.services.dynamodb.model.ResourceNotFoundException;
/**
 *
@ -49,19 +78,87 @@ import com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException;
 *
 */
@NonNullByDefault
public class BaseIntegrationTest extends JavaTest {
    protected static final String TABLE = "dynamodb-integration-tests";
    protected static final String TABLE_PREFIX = "dynamodb-integration-tests-";
    protected static final Logger LOGGER = LoggerFactory.getLogger(DynamoDBPersistenceService.class);
    protected static @Nullable DynamoDBPersistenceService service;
    protected static final Map<String, Item> ITEMS = new HashMap<>();
    protected static @Nullable DynamoDBProxyServer embeddedServer;

    /*
     * SI system has Celsius as temperature unit
     */
    protected static final Unit<Temperature> TEMP_ITEM_UNIT = SIUnits.CELSIUS;
    protected static final Unit<Dimensionless> DIMENSIONLESS_ITEM_UNIT = Units.ONE;

    private static @Nullable URI endpointOverride;

    protected static UnitProvider UNIT_PROVIDER;
    static {
        ComponentContext context = Mockito.mock(ComponentContext.class);
        BundleContext bundleContext = Mockito.mock(BundleContext.class);
        Hashtable<String, Object> properties = new Hashtable<>();
        properties.put("measurementSystem", SIUnits.MEASUREMENT_SYSTEM_NAME);
        when(context.getProperties()).thenReturn(properties);
        when(context.getBundleContext()).thenReturn(bundleContext);
        UNIT_PROVIDER = new I18nProviderImpl(context);
    }
    /**
     * Whether tests are run in a Continuous Integration environment, i.e. Jenkins or Travis CI.
     *
     * Travis CI is detected using the CI environment variable, see https://docs.travis-ci.com/user/environment-variables/
     * Jenkins CI is detected using the JENKINS_HOME environment variable.
     *
     * @return true when running in a CI environment
     */
    protected static boolean isRunningInCI() {
        String jenkinsHome = System.getenv("JENKINS_HOME");
        return "true".equals(System.getenv("CI")) || (jenkinsHome != null && !jenkinsHome.isBlank());
    }

    private static boolean credentialsSet() {
        String access = System.getProperty("DYNAMODBTEST_ACCESS");
        String secret = System.getProperty("DYNAMODBTEST_SECRET");
        return access != null && !access.isBlank() && secret != null && !secret.isBlank();
    }

    private static int findFreeTCPPort() {
        try (ServerSocket serverSocket = new ServerSocket(0)) {
            int localPort = serverSocket.getLocalPort();
            assertTrue(localPort > 0);
            return localPort;
        } catch (Exception e) {
            fail("Unable to find free TCP port for embedded DynamoDB server");
            return -1; // Make compiler happy
        }
    }

    @Override
    protected void waitForAssert(Runnable runnable) {
        // Longer timeouts and slower polling with real DynamoDB.
        // Non-CI tests against the local server use a lower timeout.
        waitForAssert(runnable, hasFakeServer() ? (isRunningInCI() ? 30_000L : 10_000L) : 120_000L,
                hasFakeServer() ? 500L : 1000L);
    }
    @BeforeAll
    protected static void populateItems() {
        ITEMS.put("dimmer", new DimmerItem("dimmer"));
        ITEMS.put("number", new NumberItem("number"));
        NumberItem temperatureItem = new NumberItem("Number:Temperature", "numberTemperature");
        ITEMS.put("numberTemperature", temperatureItem);
        GroupItem groupTemperature = new GroupItem("groupNumberTemperature", temperatureItem);
        ITEMS.put("groupNumberTemperature", groupTemperature);
        NumberItem dimensionlessItem = new NumberItem("Number:Dimensionless", "numberDimensionless");
        ITEMS.put("numberDimensionless", dimensionlessItem);
        GroupItem groupDimensionless = new GroupItem("groupNumberDimensionless", dimensionlessItem);
        ITEMS.put("groupNumberDimensionless", groupDimensionless);
        GroupItem groupDummy = new GroupItem("dummyGroup", null);
        ITEMS.put("groupDummy", groupDummy);
        ITEMS.put("string", new StringItem("string"));
        ITEMS.put("switch", new SwitchItem("switch"));
        ITEMS.put("contact", new ContactItem("contact"));
@ -73,6 +170,61 @@ public class BaseIntegrationTest {
        ITEMS.put("player_playpause", new PlayerItem("player_playpause"));
        ITEMS.put("player_rewindfastforward", new PlayerItem("player_rewindfastforward"));

        injectItemServices();
    }

    @BeforeAll
    public static void initService(TestInfo testInfo) throws InterruptedException, IllegalArgumentException,
            IllegalAccessException, NoSuchFieldException, SecurityException {
        service = newService(isLegacyTest(testInfo), true, null, null, null);
        clearData();
    }
    /**
     * Create a new persistence service, either pointing to real DynamoDB (given credentials as Java properties) or to
     * a local in-memory server.
     *
     * @param legacy whether to create config that implies legacy or new schema. Use null for MAYBE_LEGACY
     * @param cleanLocal when creating local DB, whether to create a new DB
     * @param overrideLocalURI URI to use when using local DB
     * @param table table name override
     * @param tablePrefix table prefix override
     * @return new persistence service
     */
    protected synchronized static DynamoDBPersistenceService newService(@Nullable Boolean legacy, boolean cleanLocal,
            @Nullable URI overrideLocalURI, @Nullable String table, @Nullable String tablePrefix) {
        final DynamoDBPersistenceService service;
        Map<String, Object> config = getConfig(legacy, table, tablePrefix);
        if (cleanLocal && overrideLocalURI != null) {
            throw new IllegalArgumentException("cannot specify both cleanLocal=true and overrideLocalURI");
        }
        if (legacy == null && (table != null || tablePrefix != null)) {
            throw new IllegalArgumentException("cannot specify both legacy=null and unambiguous table configuration");
        }
        URI localEndpointOverride = overrideLocalURI == null ? endpointOverride : overrideLocalURI;
        if (overrideLocalURI == null && !credentialsSet() && (cleanLocal || endpointOverride == null)) {
            // Local server not started yet, start it.
            // The endpointOverride static field holds the URI.
            LOGGER.info("Since credentials have not been defined, using embedded local AWS DynamoDB server");
            System.setProperty("sqlite4java.library.path", "src/test/resources/native-libs");
            int port = findFreeTCPPort();
            String endpoint = String.format("http://127.0.0.1:%d", port);
            try {
                localEndpointOverride = new URI(endpoint);
                DynamoDBProxyServer localEmbeddedServer = ServerRunner
                        .createServerFromCommandLineArgs(new String[] { "-inMemory", "-port", String.valueOf(port) });
                localEmbeddedServer.start();
                embeddedServer = localEmbeddedServer;
            } catch (Exception e) {
                fail("Error with embedded DynamoDB server", e);
                throw new IllegalStateException();
            }
        }
        if (endpointOverride == null) {
            endpointOverride = localEndpointOverride;
        }
        service = new DynamoDBPersistenceService(new ItemRegistry() {
            @Override
            public Collection<Item> getItems(String pattern) {
@ -95,6 +247,7 @@ public class BaseIntegrationTest {
                if (item == null) {
                    throw new ItemNotFoundException(name);
                }
                injectItemServices(item);
                return item;
            }
@ -172,44 +325,130 @@ public class BaseIntegrationTest {
            public void removeRegistryHook(RegistryHook<Item> hook) {
                throw new UnsupportedOperationException();
            }
        }, localEndpointOverride);
        service.activate(null, config);
        return service;
    }

    protected static void injectItemServices() {
        ITEMS.values().forEach(BaseIntegrationTest::injectItemServices);
    }

    protected static void injectItemServices(Item item) {
        if (item instanceof GenericItem) {
            GenericItem genericItem = (GenericItem) item;
            genericItem.setUnitProvider(UNIT_PROVIDER);
        }
    }
    private static Map<String, Object> getConfig(@Nullable Boolean legacy, @Nullable String table,
            @Nullable String tablePrefix) {
        Map<String, Object> config = new HashMap<>();
        if (legacy != null) {
            if (legacy.booleanValue()) {
                LOGGER.info("Legacy test");
                config.put("tablePrefix", tablePrefix == null ? TABLE_PREFIX : tablePrefix);
            } else {
                LOGGER.info("Non-legacy test");
                config.put("table", table == null ? TABLE : table);
                config.put("expireDays", "1");
            }
        }
        if (credentialsSet()) {
            LOGGER.info("Since credentials have been defined, using real AWS DynamoDB");
            String value = System.getProperty("DYNAMODBTEST_REGION");
            config.put("region", value != null ? value : "");
            value = System.getProperty("DYNAMODBTEST_ACCESS");
            config.put("accessKey", value != null ? value : "");
            value = System.getProperty("DYNAMODBTEST_SECRET");
            config.put("secretKey", value != null ? value : "");
            for (Entry<String, Object> entry : config.entrySet()) {
                if (((String) entry.getValue()).isEmpty()) {
                    fail("Expecting " + entry.getKey()
                            + " to have value for integration tests. Integration test will fail");
                    throw new IllegalArgumentException();
                }
            }
        } else {
            // Place some values to pass the configuration validation
            config.put("region", "eu-west-1");
            config.put("accessKey", "dummy-access-key");
            config.put("secretKey", "dummy-secret-key");
        }
        return config;
    }

    protected static boolean isLegacyTest(TestInfo testInfo) {
        try {
            return testInfo.getTestClass().get().getDeclaredField("LEGACY_MODE").getBoolean(null);
        } catch (IllegalArgumentException | IllegalAccessException | NoSuchFieldException | SecurityException e) {
            fail("Could not find static boolean LEGACY_MODE from the test class: " + e.getClass().getSimpleName() + " "
                    + e.getMessage());
            throw new IllegalStateException(); // Making compiler happy
        }
    }
    protected boolean hasFakeServer() {
        return embeddedServer != null;
    }

    @AfterAll
    public static void tearDown() {
        try {
            if (embeddedServer != null) {
                embeddedServer.stop();
            }
        } catch (Exception e) {
            fail("Error stopping embedded server", e);
        }
    }
    protected static void clearData() {
        DynamoDBPersistenceService localService = service;
        assert localService != null;
        DynamoDbAsyncClient lowLevelClient = localService.getLowLevelClient();
        assertNotNull(lowLevelClient);
        assert lowLevelClient != null; // To get rid of null warning
        // Clear data
        for (String table : new String[] { "dynamodb-integration-tests-bigdecimal", "dynamodb-integration-tests-string",
                TABLE }) {
            try {
                try {
                    lowLevelClient.describeTable(req -> req.tableName(table)).get();
                } catch (ExecutionException e) {
                    if (e.getCause() instanceof ResourceNotFoundException) {
                        // Table does not exist, this table does not need cleaning, continue to next table
                        continue;
                    }
                }
                lowLevelClient.deleteTable(req -> req.tableName(table)).get();
                final WaiterResponse<DescribeTableResponse> waiterResponse;
                try {
                    waiterResponse = lowLevelClient.waiter().waitUntilTableNotExists(req -> req.tableName(table)).get();
                } catch (ExecutionException e) {
                    // The waiting might fail with SdkClientException: "An exception was thrown and did not match any
                    // waiter acceptors" (the exception being CompletionException of ResourceNotFoundException).
                    // We check if the table has been removed, and continue if it has.
                    try {
                        lowLevelClient.describeTable(req -> req.tableName(table)).get();
                    } catch (ExecutionException e2) {
                        if (e2.getCause() instanceof ResourceNotFoundException) {
                            // Table does not exist, this table does not need cleaning, continue to next table
                            continue;
                        }
                    }
                    throw e;
                }
                assertTrue(waiterResponse.matched().exception().isEmpty());
            } catch (ExecutionException | InterruptedException e) {
                fail("Error cleaning up test (deleting table)", e);
            }
        }
    }
}
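`clearData` above repeatedly unwraps `ExecutionException.getCause()` to recognize a "table already gone" failure, because the SDKv2 async client returns `CompletableFuture`s and `.get()` wraps the real error. The unwrapping pattern in isolation, using a plain failed future and a stand-in exception class in place of real DynamoDB calls:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;

public class CauseUnwrapDemo {
    /** Illustrative stand-in for the AWS SDK's ResourceNotFoundException. */
    public static class ResourceNotFoundException extends RuntimeException {
    }

    /** Returns true when the future failed specifically because the resource is missing. */
    public static boolean failedWithResourceNotFound(CompletableFuture<?> future) {
        try {
            future.get();
            return false; // call succeeded, resource exists
        } catch (ExecutionException e) {
            // .get() wraps the original exception; inspect the cause
            return e.getCause() instanceof ResourceNotFoundException;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        CompletableFuture<Void> missingTable = CompletableFuture.failedFuture(new ResourceNotFoundException());
        System.out.println(failedWithResourceNotFound(missingTable)); // prints "true"
        System.out.println(failedWithResourceNotFound(CompletableFuture.completedFuture(null))); // prints "false"
    }
}
```

The same cause inspection is needed twice in `clearData`: once before deleting (skip tables that never existed) and once when the "table not exists" waiter itself trips over the table disappearing mid-wait.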

View File

@ -0,0 +1,25 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class CallItemIntegrationLegacyTest extends CallItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}
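`BaseIntegrationTest.isLegacyTest` discovers this flag by reflecting on the `public static final boolean LEGACY_MODE` field of the running test class. Stripped of the JUnit plumbing, the reflection call looks like this (the nested `Demo` class is an illustrative stand-in for a concrete `*IntegrationLegacyTest` class):

```java
public class StaticFlagReflection {
    /** Illustrative stand-in for a test class declaring the legacy-mode flag. */
    public static class Demo {
        public static final boolean LEGACY_MODE = true;
    }

    /** Returns the value of the class's static boolean field, or fails loudly. */
    public static boolean readStaticBoolean(Class<?> clazz, String fieldName) {
        try {
            // A null receiver is allowed when reading a static field
            return clazz.getDeclaredField(fieldName).getBoolean(null);
        } catch (IllegalAccessException | NoSuchFieldException e) {
            throw new IllegalStateException(
                    "Could not read static boolean " + fieldName + " from " + clazz.getName(), e);
        }
    }

    public static void main(String[] args) {
        System.out.println(readStaticBoolean(Demo.class, "LEGACY_MODE")); // prints "true"
    }
}
```

Because each legacy subclass only redeclares the flag, the same test methods run twice: once against the new single-table schema and once against the legacy per-item tables.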

View File

@ -31,12 +31,15 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class CallItemIntegrationTest extends AbstractTwoItemIntegrationTest {
    public static final boolean LEGACY_MODE = false;
    private static final String NAME = "call";
    // values are encoded as part1,part2 - ordering goes wrt strings
    private static final StringListType STATE1 = new StringListType("part1", "foo");
    private static final StringListType STATE2 = new StringListType("part3", "bar");
    private static final StringListType STATE_BETWEEN = new StringListType("part2", "zzz");

    @SuppressWarnings("null")
    @BeforeAll
    public static void storeData() throws InterruptedException {
        CallItem item = (CallItem) ITEMS.get(NAME);

View File

@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class ColorItemIntegrationLegacyTest extends ColorItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}

View File

@ -32,6 +32,8 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class ColorItemIntegrationTest extends AbstractTwoItemIntegrationTest {
    public static final boolean LEGACY_MODE = false;

    private static HSBType color(double hue, int saturation, int brightness) {
        return new HSBType(new DecimalType(hue), new PercentType(saturation), new PercentType(brightness));
    }
@ -48,6 +50,7 @@ public class ColorItemIntegrationTest extends AbstractTwoItemIntegrationTest {
    private static final HSBType STATE2 = color(75, 100, 90);
    private static final HSBType STATE_BETWEEN = color(60, 50, 50);

    @SuppressWarnings("null")
    @BeforeAll
    public static void storeData() throws InterruptedException {
        ColorItem item = (ColorItem) ITEMS.get(NAME);

View File

@ -0,0 +1,24 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public class ContactItemIntegrationLegacyTest extends ContactItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}

View File

@ -28,6 +28,8 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class ContactItemIntegrationTest extends AbstractTwoItemIntegrationTest {
    public static final boolean LEGACY_MODE = false;
    private static final String NAME = "contact";
    private static final OpenClosedType STATE1 = OpenClosedType.CLOSED;
    private static final OpenClosedType STATE2 = OpenClosedType.OPEN;
@ -35,6 +37,7 @@ public class ContactItemIntegrationTest extends AbstractTwoItemIntegrationTest {
    // Omit extended query tests in AbstractTwoItemIntegrationTest by setting stateBetween to null.
    private static final @Nullable OnOffType STATE_BETWEEN = null;

    @SuppressWarnings("null")
    @BeforeAll
    public static void storeData() throws InterruptedException {
        ContactItem item = (ContactItem) ITEMS.get(NAME);

View File

@ -0,0 +1,25 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class DateTimeItemIntegrationLegacyTest extends DateTimeItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}

View File

@ -30,6 +30,7 @@ import org.openhab.core.types.State;
 */
@NonNullByDefault
public class DateTimeItemIntegrationTest extends AbstractTwoItemIntegrationTest {
    public static final boolean LEGACY_MODE = false;

    private static final String NAME = "datetime";
    private static final ZonedDateTime ZDT1 = ZonedDateTime.parse("2016-06-15T10:00:00Z");
@ -42,6 +43,7 @@ public class DateTimeItemIntegrationTest extends AbstractTwoItemIntegrationTest
    private static final DateTimeType STATE2 = new DateTimeType(ZDT2.withZoneSameInstant(ZoneOffset.ofHours(5)));
    private static final DateTimeType STATE_BETWEEN = new DateTimeType(ZDT_BETWEEN);

    @SuppressWarnings("null")
    @BeforeAll
    public static void storeData() throws InterruptedException {
        DateTimeItem item = (DateTimeItem) ITEMS.get(NAME);
@ -57,7 +59,6 @@ public class DateTimeItemIntegrationTest extends AbstractTwoItemIntegrationTest
        service.store(item);
        Thread.sleep(10);
        afterStore2 = ZonedDateTime.now();
        LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
                AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
    }

View File

@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class DimmerItemIntegrationLegacyTest extends DimmerItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}

View File

@ -29,11 +29,13 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class DimmerItemIntegrationTest extends AbstractTwoItemIntegrationTest {
    public static final boolean LEGACY_MODE = false;

    private static final String NAME = "dimmer";
    private static final PercentType STATE1 = new PercentType(66);
    private static final PercentType STATE2 = new PercentType(68);
    private static final PercentType STATE_BETWEEN = new PercentType(67);

    @SuppressWarnings("null")
    @BeforeAll
    public static void storeData() throws InterruptedException {
        DimmerItem item = (DimmerItem) ITEMS.get(NAME);
@ -49,7 +51,6 @@ public class DimmerItemIntegrationTest extends AbstractTwoItemIntegrationTest {
        service.store(item);
        Thread.sleep(10);
        afterStore2 = ZonedDateTime.now();
        LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
                AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
    }

View File

@ -27,7 +27,8 @@ import org.eclipse.jdt.annotation.NonNullByDefault;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.io.TempDir;

import software.amazon.awssdk.core.retry.RetryMode;
import software.amazon.awssdk.regions.Region;

/**
 *
@ -69,17 +70,18 @@ public class DynamoDBConfigTest {
    public void testRegionWithAccessKeys() throws Exception {
        DynamoDBConfig fromConfig = DynamoDBConfig
                .fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1", "secretKey", "secret1"));
        assert fromConfig != null;
        assertEquals(Region.EU_WEST_1, fromConfig.getRegion());
        assertEquals("access1", fromConfig.getCredentials().accessKeyId());
        assertEquals("secret1", fromConfig.getCredentials().secretAccessKey());
        assertEquals("openhab-", fromConfig.getTablePrefixLegacy());
        assertEquals(1, fromConfig.getReadCapacityUnits());
        assertEquals(1, fromConfig.getWriteCapacityUnits());
        assertEquals(RetryMode.STANDARD, fromConfig.getRetryPolicy().retryMode());
        assertEquals(ExpectedTableSchema.MAYBE_LEGACY, fromConfig.getTableRevision());
    }

    @SuppressWarnings("null")
    @Test
    public void testRegionWithProfilesConfigFile() throws Exception {
        Path credsFile = Files.createFile(Paths.get(folder.getPath(), "creds"));
@ -90,13 +92,33 @@ public class DynamoDBConfigTest {
        DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "profilesConfigFile",
                credsFile.toAbsolutePath().toString(), "profile", "fooprofile"));
        assertNotNull(fromConfig);
        assertEquals(Region.EU_WEST_1, fromConfig.getRegion());
        assertEquals("openhab-", fromConfig.getTablePrefixLegacy());
        assertEquals(1, fromConfig.getReadCapacityUnits());
        assertEquals(1, fromConfig.getWriteCapacityUnits());
        assertEquals(RetryMode.STANDARD, fromConfig.getRetryPolicy().retryMode());
        assertEquals(ExpectedTableSchema.MAYBE_LEGACY, fromConfig.getTableRevision());
    }

    @SuppressWarnings("null")
    @Test
    public void testProfilesConfigFileRetryMode() throws Exception {
        Path credsFile = Files.createFile(Paths.get(folder.getPath(), "creds"));
        Files.write(credsFile,
                ("[fooprofile]\n" + "aws_access_key_id=testAccessKey\n" + "aws_secret_access_key=testSecretKey\n"
                        + "aws_session_token=testSessionToken\n" + "retry_mode=legacy").getBytes(),
                StandardOpenOption.TRUNCATE_EXISTING);

        DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "profilesConfigFile",
                credsFile.toAbsolutePath().toString(), "profile", "fooprofile"));
        assertNotNull(fromConfig);
        assertEquals(Region.EU_WEST_1, fromConfig.getRegion());
        assertEquals("openhab-", fromConfig.getTablePrefixLegacy());
        assertEquals(1, fromConfig.getReadCapacityUnits());
        assertEquals(1, fromConfig.getWriteCapacityUnits());
        assertEquals(RetryMode.LEGACY, fromConfig.getRetryPolicy().retryMode());
        assertEquals(ExpectedTableSchema.MAYBE_LEGACY, fromConfig.getTableRevision());
    }
@Test @Test
@ -128,94 +150,98 @@ public class DynamoDBConfigTest {
mapFrom("region", "eu-west-1", "profilesConfigFile", credsFile.toAbsolutePath().toString()))); mapFrom("region", "eu-west-1", "profilesConfigFile", credsFile.toAbsolutePath().toString())));
} }
    @SuppressWarnings("null")
    @Test
    public void testRegionWithAccessKeysWithLegacyPrefix() throws Exception {
        DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
                "secretKey", "secret1", "tablePrefix", "foobie-", "expireDays", "105"));
        assertEquals(Region.EU_WEST_1, fromConfig.getRegion());
        assertEquals("access1", fromConfig.getCredentials().accessKeyId());
        assertEquals("secret1", fromConfig.getCredentials().secretAccessKey());
        assertEquals("foobie-", fromConfig.getTablePrefixLegacy());
        assertEquals(1, fromConfig.getReadCapacityUnits());
        assertEquals(1, fromConfig.getWriteCapacityUnits());
        assertEquals(RetryMode.STANDARD, fromConfig.getRetryPolicy().retryMode());
        assertEquals(ExpectedTableSchema.LEGACY, fromConfig.getTableRevision());
        assertNull(fromConfig.getExpireDays()); // not supported with legacy
    }

    @SuppressWarnings("null")
    @Test
    public void testRegionWithAccessKeysWithTable() throws Exception {
        DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
                "secretKey", "secret1", "table", "mytable", "expireDays", "105"));
        assertEquals(Region.EU_WEST_1, fromConfig.getRegion());
        assertEquals("access1", fromConfig.getCredentials().accessKeyId());
        assertEquals("secret1", fromConfig.getCredentials().secretAccessKey());
        assertEquals("mytable", fromConfig.getTable());
        assertEquals(1, fromConfig.getReadCapacityUnits());
        assertEquals(1, fromConfig.getWriteCapacityUnits());
        assertEquals(RetryMode.STANDARD, fromConfig.getRetryPolicy().retryMode());
        assertEquals(ExpectedTableSchema.NEW, fromConfig.getTableRevision());
        assertEquals(105, fromConfig.getExpireDays());
    }

    @SuppressWarnings("null")
    @Test
    public void testRegionWithAccessKeysWithoutPrefixWithReadCapacityUnits() throws Exception {
        DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
                "secretKey", "secret1", "readCapacityUnits", "5", "expireDays", "105"));
        assertEquals(Region.EU_WEST_1, fromConfig.getRegion());
        assertEquals("access1", fromConfig.getCredentials().accessKeyId());
        assertEquals("secret1", fromConfig.getCredentials().secretAccessKey());
        assertEquals("openhab-", fromConfig.getTablePrefixLegacy());
        assertEquals(5, fromConfig.getReadCapacityUnits());
        assertEquals(1, fromConfig.getWriteCapacityUnits());
        assertEquals(RetryMode.STANDARD, fromConfig.getRetryPolicy().retryMode());
        assertEquals(ExpectedTableSchema.MAYBE_LEGACY, fromConfig.getTableRevision());
        assertEquals(105, fromConfig.getExpireDays());
    }

    @SuppressWarnings("null")
    @Test
    public void testRegionWithAccessKeysWithoutPrefixWithWriteCapacityUnits() throws Exception {
        DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
                "secretKey", "secret1", "writeCapacityUnits", "5"));
        assertEquals(Region.EU_WEST_1, fromConfig.getRegion());
        assertEquals("access1", fromConfig.getCredentials().accessKeyId());
        assertEquals("secret1", fromConfig.getCredentials().secretAccessKey());
        assertEquals("openhab-", fromConfig.getTablePrefixLegacy());
        assertEquals(1, fromConfig.getReadCapacityUnits());
        assertEquals(5, fromConfig.getWriteCapacityUnits());
        assertEquals(RetryMode.STANDARD, fromConfig.getRetryPolicy().retryMode());
        assertEquals(ExpectedTableSchema.MAYBE_LEGACY, fromConfig.getTableRevision());
        assertNull(fromConfig.getExpireDays()); // default is null
    }

    @SuppressWarnings("null")
    @Test
    public void testRegionWithAccessKeysWithoutPrefixWithReadWriteCapacityUnits() throws Exception {
        DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
                "secretKey", "secret1", "readCapacityUnits", "3", "writeCapacityUnits", "5", "expireDays", "105"));
        assertEquals(Region.EU_WEST_1, fromConfig.getRegion());
        assertEquals("access1", fromConfig.getCredentials().accessKeyId());
        assertEquals("secret1", fromConfig.getCredentials().secretAccessKey());
        assertEquals("openhab-", fromConfig.getTablePrefixLegacy());
        assertEquals(3, fromConfig.getReadCapacityUnits());
        assertEquals(5, fromConfig.getWriteCapacityUnits());
        assertEquals(RetryMode.STANDARD, fromConfig.getRetryPolicy().retryMode());
        assertEquals(ExpectedTableSchema.MAYBE_LEGACY, fromConfig.getTableRevision());
    }

    @SuppressWarnings("null")
    @Test
    public void testRegionWithAccessKeysWithPrefixWithReadWriteCapacityUnitsWithBufferSettings() throws Exception {
        DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(
                mapFrom("region", "eu-west-1", "accessKey", "access1", "secretKey", "secret1", "readCapacityUnits", "3",
                        "writeCapacityUnits", "5", "bufferCommitIntervalMillis", "501", "bufferSize", "112"));
        assertEquals(Region.EU_WEST_1, fromConfig.getRegion());
        assertEquals("access1", fromConfig.getCredentials().accessKeyId());
        assertEquals("secret1", fromConfig.getCredentials().secretAccessKey());
        assertEquals("openhab-", fromConfig.getTablePrefixLegacy());
        assertEquals(3, fromConfig.getReadCapacityUnits());
        assertEquals(5, fromConfig.getWriteCapacityUnits());
        assertEquals(RetryMode.STANDARD, fromConfig.getRetryPolicy().retryMode());
        assertEquals(ExpectedTableSchema.MAYBE_LEGACY, fromConfig.getTableRevision());
    }
}
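The config tests above encode a simple selection rule: an explicit `table` parameter selects the new single-table schema, an explicit `tablePrefix` selects the legacy two-table schema, and neither leaves the schema undetermined (`MAYBE_LEGACY`) until it is resolved against DynamoDB. A minimal sketch of that rule, with names mirroring `ExpectedTableSchema` but written as an illustrative assumption rather than the binding's actual implementation:

```java
import java.util.Map;

// Illustrative sketch of the schema-selection rule the tests above assert.
// The real logic lives in DynamoDBConfig.fromConfig(); this is a simplified stand-in.
public class TableSchemaSketch {
    public enum ExpectedTableSchema {
        NEW, LEGACY, MAYBE_LEGACY
    }

    // "table" set -> new single-table schema; "tablePrefix" set -> legacy
    // per-type tables; neither -> undetermined until resolved at runtime.
    public static ExpectedTableSchema fromConfig(Map<String, ?> config) {
        if (config.containsKey("table")) {
            return ExpectedTableSchema.NEW;
        } else if (config.containsKey("tablePrefix")) {
            return ExpectedTableSchema.LEGACY;
        }
        return ExpectedTableSchema.MAYBE_LEGACY;
    }

    public static void main(String[] args) {
        System.out.println(fromConfig(Map.of("table", "mytable"))); // NEW
        System.out.println(fromConfig(Map.of("tablePrefix", "foobie-"))); // LEGACY
        System.out.println(fromConfig(Map.of())); // MAYBE_LEGACY
    }
}
```

This also explains why `expireDays` is asserted as null in the legacy test: TTL is only meaningful for the new schema.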


@@ -12,10 +12,23 @@
 */
package org.openhab.persistence.dynamodb.internal;

import static org.junit.jupiter.api.Assertions.*;
import static org.junit.jupiter.api.Assumptions.assumeTrue;

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.eclipse.jdt.annotation.NonNull;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.junit.jupiter.api.Test;
import org.openhab.core.library.items.NumberItem;
import org.openhab.core.library.types.DecimalType;
import org.openhab.core.persistence.FilterCriteria;

import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;

/**
 *
@@ -23,16 +36,143 @@ import org.junit.jupiter.api.Test;
 *
 */
@NonNullByDefault
public class DynamoDBTableNameResolverTest extends BaseIntegrationTest {

    public static final boolean LEGACY_MODE = false; // not relevant for these tests but required by BaseIntegrationTest

    @Test
    public void testLegacyWithDynamoDBBigDecimalItem() {
        assertEquals("integration-tests-bigdecimal",
                new DynamoDBTableNameResolver(ExpectedTableSchema.LEGACY, "", "integration-tests-")
                        .fromItem(new DynamoDBBigDecimalItem()));
    }

    @Test
    public void testLegacyWithDynamoDBStringItem() {
        assertEquals("integration-tests-string",
                new DynamoDBTableNameResolver(ExpectedTableSchema.LEGACY, "", "integration-tests-")
                        .fromItem(new DynamoDBStringItem()));
    }

    @Test
    public void testWithDynamoDBBigDecimalItem() {
        assertEquals("integration-tests",
                new DynamoDBTableNameResolver(ExpectedTableSchema.NEW, "integration-tests", "")
                        .fromItem(new DynamoDBBigDecimalItem()));
    }

    @Test
    public void testWithDynamoDBStringItem() {
        assertEquals("integration-tests",
                new DynamoDBTableNameResolver(ExpectedTableSchema.NEW, "integration-tests", "")
                        .fromItem(new DynamoDBStringItem()));
    }

    @Test
    public void testBothLegacyAndNewParametersNeedToBeSpecifiedWithUnclearTableSchema() {
        assertThrows(IllegalArgumentException.class, () -> {
            assertEquals("integration-tests",
                    new DynamoDBTableNameResolver(ExpectedTableSchema.MAYBE_LEGACY, "integration-tests", "")
                            .fromItem(new DynamoDBStringItem()));
        });
        assertThrows(IllegalArgumentException.class, () -> {
            assertEquals("integration-tests", new DynamoDBTableNameResolver(ExpectedTableSchema.MAYBE_LEGACY, "", "bb")
                    .fromItem(new DynamoDBStringItem()));
        });
    }

    @Test
    public void testResolveLegacyTablesPresent() throws InterruptedException {
        // Run test only with embedded server. Otherwise there is risk of writing data using default table names
        assumeTrue(embeddedServer != null);
        ExecutorService executor = Executors.newFixedThreadPool(2);
        DynamoDBPersistenceService maybeLegacyService = null;
        final DynamoDBPersistenceService legacyService = newService(true, true, null, DynamoDBConfig.DEFAULT_TABLE_NAME,
                DynamoDBConfig.DEFAULT_TABLE_PREFIX);
        DynamoDBTableNameResolver tableNameResolver = legacyService.getTableNameResolver();
        assertNotNull(tableNameResolver);
        assert tableNameResolver != null; // to get rid of null warning...
        assertEquals(ExpectedTableSchema.LEGACY, tableNameResolver.getTableSchema());
        NumberItem item = (@NonNull NumberItem) ITEMS.get("number");
        final FilterCriteria criteria = new FilterCriteria();
        criteria.setItemName(item.getName());
        try {
            // Old tables do not exist yet --> resolves to new schema
            assertEquals(ExpectedTableSchema.NEW, resolveMaybeLegacy(legacyService, executor));
            // Write data using legacy tables
            item.setState(new DecimalType(0));
            legacyService.store(item);
            // Since the tables exist now, DynamoDBTableNameResolver should resolve
            waitForAssert(() -> {
                // Old tables are now there --> should resolve to old schema
                assertEquals(ExpectedTableSchema.LEGACY, resolveMaybeLegacy(legacyService, executor));
            });
            // Create a new service with unknown schema (MAYBE_LEGACY), pointing to the same database
            maybeLegacyService = newService(null, false, legacyService.getEndpointOverride(), null, null);
            DynamoDBTableNameResolver maybeLegacyServiceTableNameResolver = maybeLegacyService.getTableNameResolver();
            assertNotNull(maybeLegacyServiceTableNameResolver);
            assert maybeLegacyServiceTableNameResolver != null; // to get rid of null warning...
            assertEquals(ExpectedTableSchema.MAYBE_LEGACY, maybeLegacyServiceTableNameResolver.getTableSchema());
            assertEquals(legacyService.getEndpointOverride(), maybeLegacyService.getEndpointOverride());
            // maybeLegacyService still does not know the schema
            assertEquals(ExpectedTableSchema.MAYBE_LEGACY, maybeLegacyServiceTableNameResolver.getTableSchema());
            // ... but it will be resolved automatically on query
            final DynamoDBPersistenceService maybeLegacyServiceFinal = maybeLegacyService;
            waitForAssert(() -> {
                assertEquals(1, asList(maybeLegacyServiceFinal.query(criteria)).size());
                // also the schema gets resolved
                assertEquals(ExpectedTableSchema.LEGACY, maybeLegacyServiceTableNameResolver.getTableSchema());
            });
        } finally {
            executor.shutdown();
            if (maybeLegacyService != null) {
                maybeLegacyService.deactivate();
            }
            legacyService.deactivate();
        }
    }

    /**
     * Resolve table schema, starting from the undetermined MAYBE_LEGACY state
     *
     * @param legacyService service that has the client to use
     * @param executor executor used for the asynchronous schema resolution
     * @return the resolved table schema
     */
    private ExpectedTableSchema resolveMaybeLegacy(DynamoDBPersistenceService legacyService, ExecutorService executor) {
        DynamoDBTableNameResolver resolver = new DynamoDBTableNameResolver(ExpectedTableSchema.MAYBE_LEGACY,
                DynamoDBConfig.DEFAULT_TABLE_NAME, DynamoDBConfig.DEFAULT_TABLE_PREFIX);
        assertFalse(resolver.isFullyResolved());
        try {
            DynamoDbAsyncClient localClient = legacyService.getLowLevelClient();
            if (localClient == null) {
                fail("local client is null");
                throw new RuntimeException();
            }
            boolean resolved = resolver
                    .resolveSchema(localClient, b -> b.overrideConfiguration(legacyService::overrideConfig), executor)
                    .get();
            assertTrue(resolved);
            return resolver.getTableSchema();
        } catch (InterruptedException | ExecutionException e) {
            fail(e.getMessage());
            throw new IllegalStateException(); // Make compiler happy
        }
    }

    private static <T> List<T> asList(Iterable<T> iterable) {
        var items = new ArrayList<T>();
        for (T item : iterable) {
            items.add(item);
        }
        return items;
    }
}
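The resolver tests above exercise a single naming rule: the legacy schema maps each DTO class to `<prefix><type>` (e.g. `integration-tests-bigdecimal`), the new schema uses one shared table name, and an unresolved `MAYBE_LEGACY` schema cannot name tables at all. A compact sketch of that rule; the class and method names are simplified stand-ins for the binding's `DynamoDBTableNameResolver`, not its actual code:

```java
// Illustrative sketch of the table-name resolution exercised by the tests above.
public class TableNameSketch {
    public enum Schema { NEW, LEGACY, MAYBE_LEGACY }

    private final Schema schema;
    private final String table;       // used by the new single-table schema
    private final String tablePrefix; // used by the legacy per-type schema

    public TableNameSketch(Schema schema, String table, String tablePrefix) {
        this.schema = schema;
        this.table = table;
        this.tablePrefix = tablePrefix;
    }

    // itemSuffix stands for the per-type suffix, e.g. "bigdecimal" or "string"
    public String fromItemSuffix(String itemSuffix) {
        switch (schema) {
            case NEW:
                return table; // all item types share one table
            case LEGACY:
                return tablePrefix + itemSuffix; // one table per data type
            default:
                // An unresolved schema cannot name tables yet
                throw new IllegalArgumentException("Schema not resolved");
        }
    }

    public static void main(String[] args) {
        System.out.println(new TableNameSketch(Schema.LEGACY, "", "integration-tests-")
                .fromItemSuffix("bigdecimal")); // integration-tests-bigdecimal
        System.out.println(new TableNameSketch(Schema.NEW, "integration-tests", "")
                .fromItemSuffix("string")); // integration-tests
    }
}
```

The `MAYBE_LEGACY` case is why `testResolveLegacyTablesPresent` probes DynamoDB for existing tables before the first read or write.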


@@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class LocationItemIntegrationLegacyTest extends LocationItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}


@@ -30,6 +30,7 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class LocationItemIntegrationTest extends AbstractTwoItemIntegrationTest {

    public static final boolean LEGACY_MODE = false;

    private static final String NAME = "location";
    // values are encoded as lat,lon[,alt] , ordering goes wrt strings
    private static final PointType STATE1 = new PointType(
@@ -38,6 +39,7 @@ public class LocationItemIntegrationTest extends AbstractTwoItemIntegrationTest
    private static final PointType STATE2 = new PointType(new DecimalType(61.0), new DecimalType(30.));
    private static final PointType STATE_BETWEEN = new PointType(new DecimalType(60.5), new DecimalType(30.));

    @SuppressWarnings("null")
    @BeforeAll
    public static void storeData() throws InterruptedException {
        LocationItem item = (LocationItem) ITEMS.get(NAME);

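The comment in the test above notes that location states are stored as `lat,lon[,alt]` strings and ordered as strings rather than numerically. A tiny illustration of why that matters (the encoding helper here is an assumption mirroring the comment, not the binding's serializer):

```java
// Shows how string ordering of "lat,lon" encodings differs from numeric ordering.
public class PointEncodingSketch {
    // Assumed encoding per the test's comment: "lat,lon"
    public static String encode(double lat, double lon) {
        return lat + "," + lon;
    }

    public static void main(String[] args) {
        String a = encode(9.0, 30.0);
        String b = encode(60.5, 30.0);
        // Numerically 9 < 60.5, but as strings "9.0,..." sorts after "60.5,..."
        System.out.println(a.compareTo(b) > 0); // true
    }
}
```

This is why the test picks STATE1/STATE2/STATE_BETWEEN values whose string order matches the intended order.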

@@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class NumberItemIntegrationLegacyTest extends NumberItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}


@@ -32,6 +32,7 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class NumberItemIntegrationTest extends AbstractTwoItemIntegrationTest {

    public static final boolean LEGACY_MODE = false;

    private static final String NAME = "number";
    // On purpose we have super accurate number here (testing limits of aws)
    private static final DecimalType STATE1 = new DecimalType(new BigDecimal(
@@ -39,6 +40,7 @@ public class NumberItemIntegrationTest extends AbstractTwoItemIntegrationTest {
    private static final DecimalType STATE2 = new DecimalType(600.9123);
    private static final DecimalType STATE_BETWEEN = new DecimalType(500);

    @SuppressWarnings("null")
    @BeforeAll
    public static void storeData() throws InterruptedException {
        NumberItem item = (NumberItem) ITEMS.get(NAME);


@@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class PagingIntegrationLegacyTest extends PagingIntegrationTest {
public static final boolean LEGACY_MODE = true;
}


@@ -13,13 +13,11 @@
package org.openhab.persistence.dynamodb.internal;

import static org.junit.jupiter.api.Assertions.*;

import java.math.BigDecimal;
import java.time.ZonedDateTime;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

import org.eclipse.jdt.annotation.NonNullByDefault;
@@ -41,125 +39,142 @@ import org.openhab.core.persistence.HistoricItem;
@NonNullByDefault
public class PagingIntegrationTest extends BaseIntegrationTest {

    public static final boolean LEGACY_MODE = false;

    private static final String NAME = "number";

    private static final int STATE_COUNT = 10;

    private static @Nullable ZonedDateTime storeStart;

    @BeforeAll
    public static void populateData() {
        storeStart = ZonedDateTime.now();
        NumberItem item = (NumberItem) ITEMS.get(NAME);
        for (int i = 0; i < STATE_COUNT; i++) {
            item.setState(new DecimalType(i));
            try {
                // Add some delay to enforce different timestamps in ms accuracy
                Thread.sleep(5);
            } catch (InterruptedException e) {
                fail("Interrupted");
                return;
            }
            service.store(item);
        }
    }

    @SuppressWarnings("null")
    @Test
    public void testPagingFirstPage() {
        waitForAssert(() -> {
            FilterCriteria criteria = new FilterCriteria();
            criteria.setItemName(NAME);
            criteria.setBeginDate(storeStart);
            criteria.setOrdering(Ordering.ASCENDING);
            criteria.setPageNumber(0);
            criteria.setPageSize(3);
            assertItemStates(BaseIntegrationTest.service.query(criteria), 0, 1, 2);
        });
    }

    @SuppressWarnings("null")
    @Test
    public void testPagingSecondPage() {
        waitForAssert(() -> {
            FilterCriteria criteria = new FilterCriteria();
            criteria.setItemName(NAME);
            criteria.setBeginDate(storeStart);
            criteria.setOrdering(Ordering.ASCENDING);
            criteria.setPageNumber(1);
            criteria.setPageSize(3);
            assertItemStates(BaseIntegrationTest.service.query(criteria), 3, 4, 5);
        });
    }

    @SuppressWarnings("null")
    @Test
    public void testPagingPagePartialPage() {
        waitForAssert(() -> {
            FilterCriteria criteria = new FilterCriteria();
            criteria.setItemName(NAME);
            criteria.setBeginDate(storeStart);
            criteria.setOrdering(Ordering.ASCENDING);
            criteria.setPageNumber(3);
            criteria.setPageSize(3);
            assertItemStates(BaseIntegrationTest.service.query(criteria), 9);
        });
    }

    @SuppressWarnings("null")
    @Test
    public void testPagingPageOutOfBounds() {
        waitForAssert(() -> {
            FilterCriteria criteria = new FilterCriteria();
            criteria.setItemName(NAME);
            criteria.setBeginDate(storeStart);
            criteria.setOrdering(Ordering.ASCENDING);
            criteria.setPageNumber(4);
            criteria.setPageSize(3);
            assertItemStates(BaseIntegrationTest.service.query(criteria)); // no results
        });
    }

    @SuppressWarnings("null")
    @Test
    public void testPagingPage0Descending() {
        waitForAssert(() -> {
            FilterCriteria criteria = new FilterCriteria();
            criteria.setItemName(NAME);
            criteria.setBeginDate(storeStart);
            criteria.setOrdering(Ordering.DESCENDING);
            criteria.setPageNumber(0);
            criteria.setPageSize(3);
            assertItemStates(BaseIntegrationTest.service.query(criteria), 9, 8, 7);
        });
    }

    @SuppressWarnings("null")
    @Test
    public void testPagingPage0HugePageSize() {
        waitForAssert(() -> {
            FilterCriteria criteria = new FilterCriteria();
            criteria.setItemName(NAME);
            criteria.setBeginDate(storeStart);
            criteria.setOrdering(Ordering.ASCENDING);
            criteria.setPageNumber(0);
            criteria.setPageSize(900);
            assertItemStates(BaseIntegrationTest.service.query(criteria), 0, 1, 2, 3, 4, 5, 6, 7, 8, 9);
        });
    }

    @SuppressWarnings("null")
    @Test
    public void testPagingFirstPageWithFilter() {
        waitForAssert(() -> {
            FilterCriteria criteria = new FilterCriteria();
            criteria.setItemName(NAME);
            criteria.setBeginDate(storeStart);
            criteria.setOrdering(Ordering.ASCENDING);
            criteria.setPageNumber(0);
            criteria.setPageSize(3);
            criteria.setOperator(Operator.GT);
            criteria.setState(new DecimalType(new BigDecimal(3)));
            assertItemStates(BaseIntegrationTest.service.query(criteria), 4, 5, 6);
        });
    }

    private void assertItemStates(Iterable<HistoricItem> actualIterable, int... expected) {
        Iterator<HistoricItem> actualIterator = actualIterable.iterator();
        List<DecimalType> expectedStates = new ArrayList<>();
        List<DecimalType> actualStates = new ArrayList<>();
        for (int expectedState : expected) {
            assertTrue(actualIterator.hasNext());
            HistoricItem actual = actualIterator.next();
            expectedStates.add(new DecimalType(expectedState));
            actualStates.add((DecimalType) actual.getState());
        }
        assertEquals(expectedStates, actualStates);
        assertFalse(actualIterator.hasNext());
    }
}
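The paging tests assert a plain offset semantics: page N of size S covers results [N*S, N*S+S), clamped to the result count, so with ten stored states page 1 of size 3 yields states 3-5, page 3 yields only state 9, and page 4 is empty. A minimal sketch of that arithmetic (illustrative only; the service implements paging inside its DynamoDB query, not on an in-memory list):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

// Demonstrates the page-window arithmetic the tests above assert.
public class PagingSketch {
    public static List<Integer> page(List<Integer> results, int pageNumber, int pageSize) {
        int from = Math.min(pageNumber * pageSize, results.size());
        int to = Math.min(from + pageSize, results.size());
        return results.subList(from, to);
    }

    public static void main(String[] args) {
        List<Integer> all = IntStream.range(0, 10).boxed().collect(Collectors.toList());
        System.out.println(page(all, 1, 3)); // [3, 4, 5]
        System.out.println(page(all, 3, 3)); // [9]
        System.out.println(page(all, 4, 3)); // []
    }
}
```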


@@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class PlayerItemPlayPauseIntegrationLegacyTest extends PlayerItemPlayPauseIntegrationTest {
public static final boolean LEGACY_MODE = true;
}


@@ -29,11 +29,13 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class PlayerItemPlayPauseIntegrationTest extends AbstractTwoItemIntegrationTest {

    public static final boolean LEGACY_MODE = false;

    private static final String NAME = "player_playpause";
    private static final PlayPauseType STATE1 = PlayPauseType.PAUSE;
    private static final PlayPauseType STATE2 = PlayPauseType.PLAY;
    private static final @Nullable PlayPauseType STATE_BETWEEN = null;

    @SuppressWarnings("null")
    @BeforeAll
    public static void storeData() throws InterruptedException {
        PlayerItem item = (PlayerItem) ITEMS.get(NAME);


@@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class PlayerItemRewindFastForwardIntegrationLegacyTest extends PlayerItemRewindFastForwardIntegrationTest {
public static final boolean LEGACY_MODE = true;
}


@@ -14,11 +14,12 @@ package org.openhab.persistence.dynamodb.internal;
import java.time.ZonedDateTime;
import org.eclipse.jdt.annotation.NonNull;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.TestInfo;
import org.openhab.core.library.items.PlayerItem;
import org.openhab.core.library.types.RewindFastforwardType;
import org.openhab.core.types.State;
@@ -30,23 +31,41 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class PlayerItemRewindFastForwardIntegrationTest extends AbstractTwoItemIntegrationTest {
public static final boolean LEGACY_MODE = false;
private static final String NAME = "player_rewindfastforward";
private static @Nullable RewindFastforwardType STATE1, STATE2;
private static final @Nullable RewindFastforwardType STATE_BETWEEN = null;
@SuppressWarnings("null")
@BeforeAll
public static void storeData(TestInfo testInfo) throws InterruptedException {
@NonNull
RewindFastforwardType localState1, localState2;
if (isLegacyTest(testInfo)) {
// In legacy, FASTFORWARD < REWIND
STATE1 = RewindFastforwardType.FASTFORWARD;
STATE2 = RewindFastforwardType.REWIND;
} else {
// In non-legacy, FASTFORWARD (serialized as 1) > REWIND (-1)
STATE1 = RewindFastforwardType.REWIND;
STATE2 = RewindFastforwardType.FASTFORWARD;
}
localState1 = (@NonNull RewindFastforwardType) STATE1;
localState2 = (@NonNull RewindFastforwardType) STATE2;
assert localState1 != null;
assert localState2 != null;
PlayerItem item = (PlayerItem) ITEMS.get(NAME);
item.setState(localState1);
beforeStore = ZonedDateTime.now();
Thread.sleep(10);
service.store(item);
afterStore1 = ZonedDateTime.now();
Thread.sleep(10);
item.setState(localState2);
service.store(item);
Thread.sleep(10);
afterStore2 = ZonedDateTime.now();
@@ -62,12 +81,12 @@ public class PlayerItemRewindFastForwardIntegrationTest extends AbstractTwoItemI
@Override
protected State getFirstItemState() {
return (@NonNull RewindFastforwardType) STATE1;
}
@Override
protected State getSecondItemState() {
return (@NonNull RewindFastforwardType) STATE2;
}
@Override
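The branch comments in `storeData` explain why `STATE1`/`STATE2` are swapped between the two test modes: the new schema serializes `REWIND` as -1 and `FASTFORWARD` as 1, while the legacy schema stores the string names, which sort the other way. A small sketch of that ordering flip, with the encoding values assumed from the test comments rather than from the serialization code itself:

```java
import java.util.Arrays;

public class RewindFastforwardOrderingSketch {

    // Assumed numeric encoding in the new schema (taken from the test
    // comments): REWIND -> -1, FASTFORWARD -> 1.
    static int encode(String state) {
        switch (state) {
            case "REWIND":
                return -1;
            case "FASTFORWARD":
                return 1;
            default:
                throw new IllegalArgumentException(state);
        }
    }

    public static void main(String[] args) {
        // New schema: REWIND (-1) sorts before FASTFORWARD (1).
        if (!(encode("REWIND") < encode("FASTFORWARD"))) {
            throw new AssertionError();
        }
        // Legacy schema compares the string names, where "FASTFORWARD" < "REWIND"
        // lexicographically, hence the swapped STATE1/STATE2 in legacy mode.
        String[] legacy = { "REWIND", "FASTFORWARD" };
        Arrays.sort(legacy);
        System.out.println(legacy[0]); // FASTFORWARD
    }
}
```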


@@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class RollershutterItemIntegrationLegacyTest extends RollershutterItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}


@@ -33,11 +33,13 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class RollershutterItemIntegrationTest extends AbstractTwoItemIntegrationTest {
public static final boolean LEGACY_MODE = false;
private static final String NAME = "rollershutter";
private static final PercentType STATE1 = PercentType.ZERO;
private static final PercentType STATE2 = new PercentType("72.938289428989489389329834898929892439842399483498");
private static final PercentType STATE_BETWEEN = new PercentType(66); // no such that exists
@SuppressWarnings("null")
@BeforeAll
public static void storeData() throws InterruptedException {
RollershutterItem item = (RollershutterItem) ITEMS.get(NAME);


@@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class StringItemIntegrationLegacyTest extends StringItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}


@@ -29,11 +29,13 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class StringItemIntegrationTest extends AbstractTwoItemIntegrationTest {
public static final boolean LEGACY_MODE = false;
private static final String NAME = "string";
private static final StringType STATE1 = new StringType("b001");
private static final StringType STATE2 = new StringType("c002");
private static final StringType STATE_BETWEEN = new StringType("b001");
@SuppressWarnings("null")
@BeforeAll
public static void storeData() throws InterruptedException {
StringItem item = (StringItem) ITEMS.get(NAME);


@@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class SwitchItemIntegrationLegacyTest extends SwitchItemIntegrationTest {
public static final boolean LEGACY_MODE = true;
}


@@ -29,6 +29,7 @@ import org.openhab.core.types.State;
@NonNullByDefault
public class SwitchItemIntegrationTest extends AbstractTwoItemIntegrationTest {
public static final boolean LEGACY_MODE = false;
private static final String NAME = "switch";
private static final OnOffType STATE1 = OnOffType.OFF;
private static final OnOffType STATE2 = OnOffType.ON;
@@ -36,6 +37,7 @@ public class SwitchItemIntegrationTest extends AbstractTwoItemIntegrationTest {
// Omit extended query tests AbstractTwoItemIntegrationTest by setting stateBetween to null.
private static final @Nullable OnOffType STATE_BETWEEN = null;
@SuppressWarnings("null")
@BeforeAll
public static void storeData() throws InterruptedException {
SwitchItem item = (SwitchItem) ITEMS.get(NAME);


@@ -0,0 +1,26 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class TestComplexItemsWithDifferentStateTypesLegacyTest extends TestComplexItemsWithDifferentStateTypesTest {
public static final boolean LEGACY_MODE = true;
}


@@ -0,0 +1,349 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import static org.junit.jupiter.api.Assertions.*;
import static org.junit.jupiter.api.Assumptions.*;
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.openhab.core.items.GroupItem;
import org.openhab.core.library.items.ColorItem;
import org.openhab.core.library.items.DimmerItem;
import org.openhab.core.library.items.NumberItem;
import org.openhab.core.library.items.RollershutterItem;
import org.openhab.core.library.items.StringItem;
import org.openhab.core.library.types.DateTimeType;
import org.openhab.core.library.types.DecimalType;
import org.openhab.core.library.types.HSBType;
import org.openhab.core.library.types.OnOffType;
import org.openhab.core.library.types.PercentType;
import org.openhab.core.library.types.QuantityType;
import org.openhab.core.library.types.StringType;
import org.openhab.core.library.types.UpDownType;
import org.openhab.core.persistence.FilterCriteria;
import org.openhab.core.persistence.FilterCriteria.Operator;
import org.openhab.core.persistence.FilterCriteria.Ordering;
import org.openhab.core.persistence.HistoricItem;
import org.openhab.core.types.State;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class TestComplexItemsWithDifferentStateTypesTest extends BaseIntegrationTest {
private static final HSBType HSB_STATE = new HSBType(DecimalType.valueOf("20"), new PercentType(30),
new PercentType(40));
private static final PercentType COLOR_PERCENT_STATE = new PercentType(22);
private static final HSBType HSB_STATE_AFTER_PERCENT = new HSBType(DecimalType.valueOf("20"), new PercentType(30),
COLOR_PERCENT_STATE);
private static final OnOffType COLOR_ONOFF_STATE = OnOffType.ON;
private static final HSBType HSB_STATE_AFTER_ONOFF = new HSBType(DecimalType.valueOf("20"), new PercentType(30),
PercentType.HUNDRED);
public static final boolean LEGACY_MODE = false;
@BeforeAll
@SuppressWarnings("null")
public static void storeColorItemData() {
ColorItem item = (ColorItem) ITEMS.get("color");
try {
item.setState(HSB_STATE);
service.store(item);
Thread.sleep(10);
// percent
item.setState(COLOR_PERCENT_STATE); // changes only the brightness
service.store(item);
Thread.sleep(10);
// on/off
item.setState(COLOR_ONOFF_STATE); // once again, changes the brightness
service.store(item);
Thread.sleep(10);
} catch (InterruptedException e) {
fail("interrupted");
}
}
@BeforeAll
@SuppressWarnings("null")
public static void storeRollershutterItemData() {
RollershutterItem item = (RollershutterItem) ITEMS.get("rollershutter");
try {
item.setState(new PercentType(31));
service.store(item);
Thread.sleep(10);
item.setState(UpDownType.DOWN);
service.store(item);
Thread.sleep(10);
item.setState(new PercentType(32));
service.store(item);
Thread.sleep(10);
item.setState(UpDownType.UP);
service.store(item);
Thread.sleep(10);
} catch (InterruptedException e) {
fail("interrupted");
}
}
@BeforeAll
@SuppressWarnings("null")
public static void storeDimmerItemData() {
DimmerItem item = (DimmerItem) ITEMS.get("dimmer");
try {
Thread.sleep(10);
item.setState(new PercentType(33));
service.store(item);
Thread.sleep(10);
item.setState(OnOffType.OFF);
service.store(item);
Thread.sleep(10);
item.setState(new PercentType(35));
service.store(item);
Thread.sleep(10);
item.setState(OnOffType.ON);
service.store(item);
} catch (InterruptedException e) {
fail("interrupted");
}
}
@BeforeAll
@SuppressWarnings("null")
public static void storeStringItemData() {
StringItem item = (StringItem) ITEMS.get("string");
try {
Thread.sleep(10);
item.setState(new StringType("mystring"));
service.store(item);
Thread.sleep(10);
item.setState(DateTimeType.valueOf("2021-01-17T11:18:00+02:00"));
service.store(item);
} catch (InterruptedException e) {
fail("interrupted");
}
}
@BeforeAll
@SuppressWarnings("null")
public static void storeTemperatureNumberItemData() {
NumberItem item = (NumberItem) ITEMS.get("numberTemperature");
assertEquals(TEMP_ITEM_UNIT, item.getUnit());
try {
Thread.sleep(10);
item.setState(new QuantityType<>("2.0 °C"));
service.store(item);
Thread.sleep(10);
item.setState(new QuantityType<>("2.0 K"));
service.store(item);
Thread.sleep(10);
item.setState(new QuantityType<>("5.1 °F"));
service.store(item);
} catch (InterruptedException e) {
fail("interrupted");
}
}
@BeforeAll
@SuppressWarnings("null")
public static void storeGroupTemperatureNumberItemData() {
GroupItem item = (GroupItem) ITEMS.get("groupNumberTemperature");
try {
Thread.sleep(10);
item.setState(new QuantityType<>("3.0 °C"));
service.store(item);
Thread.sleep(10);
item.setState(new QuantityType<>("3.0 K"));
service.store(item);
Thread.sleep(10);
item.setState(new QuantityType<>("5.1 °F"));
service.store(item);
} catch (InterruptedException e) {
fail("interrupted");
}
}
@BeforeAll
@SuppressWarnings("null")
public static void storeDimensionlessNumberItemData() {
NumberItem item = (NumberItem) ITEMS.get("numberDimensionless");
assertEquals(DIMENSIONLESS_ITEM_UNIT, item.getUnit());
try {
Thread.sleep(10);
item.setState(new QuantityType<>("2 %"));
service.store(item);
Thread.sleep(10);
item.setState(new QuantityType<>("3.5"));
service.store(item);
Thread.sleep(10);
item.setState(new QuantityType<>("510 ppm"));
service.store(item);
} catch (InterruptedException e) {
fail("interrupted");
}
}
@BeforeAll
@SuppressWarnings("null")
public static void storeDummyGroupItemData() {
GroupItem item = (GroupItem) ITEMS.get("groupDummy");
try {
Thread.sleep(10);
item.setState(new QuantityType<>("2 %")); // Will not write anything
service.store(item);
} catch (InterruptedException e) {
fail("interrupted");
}
}
@Test
public void testColorItem() {
waitForAssert(() -> {
assertQueryAll("color", new HSBType[] { HSB_STATE, HSB_STATE_AFTER_PERCENT, HSB_STATE_AFTER_ONOFF });
});
}
@Test
public void testRollershutter() {
// when querying, UP/DOWN are returned as PercentType
assertEquals(PercentType.HUNDRED, UpDownType.DOWN.as(PercentType.class));
assertEquals(PercentType.ZERO, UpDownType.UP.as(PercentType.class));
waitForAssert(() -> {
assertQueryAll("rollershutter",
new State[] { new PercentType(31), PercentType.HUNDRED, new PercentType(32), PercentType.ZERO });
});
}
@Test
public void testDimmer() {
// when querying, ON/OFF are returned as PercentType
assertEquals(PercentType.HUNDRED, OnOffType.ON.as(PercentType.class));
assertEquals(PercentType.ZERO, OnOffType.OFF.as(PercentType.class));
waitForAssert(() -> {
assertQueryAll("dimmer",
new State[] { new PercentType(33), PercentType.ZERO, new PercentType(35), PercentType.HUNDRED });
});
}
@Test
public void testString() {
waitForAssert(() -> {
assertQueryAll("string",
new State[] { new StringType("mystring"), new StringType("2021-01-17T09:18:00.000Z") });
});
}
@Test
public void testTemperatureItemNumber() {
waitForAssert(() -> {
assertQueryAll("numberTemperature",
new State[] { new QuantityType<>("2.0 °C"), /* 2 K = -271.15 °C */ new QuantityType<>("-271.15 °C"),
new QuantityType<>("-14.9444444444444444444444444444444 °C") });
assertQueryFilterByState("numberTemperature", Operator.GT, new QuantityType<>("-20 °C"), new State[] {
new QuantityType<>("2.0 °C"), new QuantityType<>("-14.9444444444444444444444444444444 °C") });
assertQueryFilterByState("numberTemperature", Operator.LT, new QuantityType<>("273.15 K"), new State[] {
new QuantityType<>("-271.15 °C"), new QuantityType<>("-14.9444444444444444444444444444444 °C") });
assertQueryFilterByState("numberTemperature", Operator.EQ, new QuantityType<>("5.1 °F"),
new State[] { new QuantityType<>("-14.9444444444444444444444444444444 °C") });
assertQueryFilterByState("numberTemperature", Operator.EQ, new QuantityType<>("5.1 m/s"), new State[] {});
});
}
@Test
public void testGroupTemperatureItemNumber() {
waitForAssert(() -> {
assertQueryAll("groupNumberTemperature",
new State[] { new QuantityType<>("3.0 °C"), /* 3 K = -270.15 °C */ new QuantityType<>("-270.15 °C"),
new QuantityType<>("-14.9444444444444444444444444444444 °C") });
assertQueryFilterByState("groupNumberTemperature", Operator.GT, new QuantityType<>("-20 °C"), new State[] {
new QuantityType<>("3.0 °C"), new QuantityType<>("-14.9444444444444444444444444444444 °C") });
assertQueryFilterByState("groupNumberTemperature", Operator.LT, new QuantityType<>("273.15 K"),
new State[] { new QuantityType<>("-270.15 °C"),
new QuantityType<>("-14.9444444444444444444444444444444 °C") });
assertQueryFilterByState("groupNumberTemperature", Operator.EQ, new QuantityType<>("5.1 °F"),
new State[] { new QuantityType<>("-14.9444444444444444444444444444444 °C") });
assertQueryFilterByState("groupNumberTemperature", Operator.EQ, new QuantityType<>("5.1 m/s"),
new State[] {});
});
}
@Test
public void testGroupDummyItem() {
// Do not want to slow down CI runs
assumeFalse(isRunningInCI());
// only with the fast local server
assumeTrue(hasFakeServer());
try {
// 5 seconds should be enough that any writes go through
Thread.sleep(5000);
} catch (InterruptedException e) {
fail(e.getMessage());
return;
}
// We expect no results
assertQueryAll("groupDummy", new State[] {});
}
@Test
public void testDimensionlessItemNumber() {
waitForAssert(() -> {
assertQueryAll("numberDimensionless", new State[] { /* 2 % */ new QuantityType<>("0.02"),
new QuantityType<>("3.5"), new QuantityType<>("510").divide(new BigDecimal(1_000_000)) });
});
}
private void assertQueryAll(String item, State[] expectedStates) {
assertQueryFilterByState(item, null, null, expectedStates);
}
private void assertQueryFilterByState(String item, @Nullable Operator operator, @Nullable State state,
State[] expectedStates) {
FilterCriteria criteria = new FilterCriteria();
criteria.setOrdering(Ordering.ASCENDING);
criteria.setItemName(item);
criteria.setOperator(operator);
criteria.setState(state);
@SuppressWarnings("null")
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
List<State> actualStatesList = new ArrayList<>();
iterable.forEach(i -> actualStatesList.add(i.getState()));
State[] actualStates = actualStatesList.toArray(new State[0]);
assertArrayEquals(expectedStates, actualStates, Arrays.toString(actualStates));
}
}
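The temperature expectations in the tests above come down to plain unit arithmetic: 2 K is -271.15 °C and 5.1 °F is roughly -14.944 °C. A small `BigDecimal` sketch of the conversions used to derive those literals, independent of the `QuantityType` API:

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class TemperatureConversionSketch {

    // °C = (°F - 32) * 5/9, computed with enough precision to match the
    // long decimal literals in the assertions above.
    static BigDecimal fahrenheitToCelsius(BigDecimal fahrenheit) {
        return fahrenheit.subtract(new BigDecimal("32"))
                .multiply(new BigDecimal("5"))
                .divide(new BigDecimal("9"), new MathContext(34));
    }

    public static void main(String[] args) {
        // 5.1 °F -> about -14.9444... °C
        System.out.println(fahrenheitToCelsius(new BigDecimal("5.1")));
        // Kelvin is a simple offset: °C = K - 273.15, so 2 K = -271.15 °C.
        System.out.println(new BigDecimal("2").subtract(new BigDecimal("273.15"))); // -271.15
    }
}
```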


@@ -0,0 +1,51 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.openhab.core.library.types.DecimalType;
import org.openhab.core.library.types.OnOffType;
import org.openhab.core.library.types.PlayPauseType;
import org.openhab.core.library.types.StringType;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class TestStoreMixedTypesLegacyTest extends TestStoreMixedTypesTest {
public static final boolean LEGACY_MODE = true;
@Override
protected PlayPauseType[] expectedPlayerItem() {
return new PlayPauseType[] { PlayPauseType.PAUSE };
}
@Override
protected StringType[] expectedStringItem() {
return new StringType[] { StringType.valueOf("a1"), StringType.valueOf("b1"), StringType.valueOf("PAUSE") };
}
@Override
protected OnOffType[] expectedSwitchItem() {
return new OnOffType[] { /* 33.14 */OnOffType.ON, /* 66.28 */ OnOffType.ON, OnOffType.ON, OnOffType.OFF, };
}
@Override
protected DecimalType[] expectedNumberItem() {
return new DecimalType[] { DecimalType.valueOf("33.14"), DecimalType.valueOf("66.28"),
/* on */DecimalType.valueOf("1"), /* off */DecimalType.valueOf("0") };
}
}


@@ -0,0 +1,198 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import static org.junit.jupiter.api.Assertions.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.openhab.core.library.items.NumberItem;
import org.openhab.core.library.items.PlayerItem;
import org.openhab.core.library.items.StringItem;
import org.openhab.core.library.items.SwitchItem;
import org.openhab.core.library.types.DecimalType;
import org.openhab.core.library.types.OnOffType;
import org.openhab.core.library.types.PlayPauseType;
import org.openhab.core.library.types.StringType;
import org.openhab.core.persistence.FilterCriteria;
import org.openhab.core.persistence.FilterCriteria.Ordering;
import org.openhab.core.persistence.HistoricItem;
import org.openhab.core.types.State;
/**
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class TestStoreMixedTypesTest extends BaseIntegrationTest {
public static final boolean LEGACY_MODE = false;
private static final AtomicInteger testCounter = new AtomicInteger();
private int uniqueId;
private String getItemName() {
return "localItem" + uniqueId;
}
@BeforeEach
private void generateUniqueItemId() {
uniqueId = testCounter.getAndIncrement();
}
@AfterEach
private void tearDownLocalItems() {
ITEMS.remove(getItemName());
}
@SuppressWarnings("null")
public void storeItemWithDifferentTypes() {
try {
// First writing two values with string item
{
StringItem item = new StringItem(getItemName());
ITEMS.put(getItemName(), item);
item.setState(StringType.valueOf("a1"));
service.store(item);
Thread.sleep(10);
item.setState(StringType.valueOf("b1"));
service.store(item);
Thread.sleep(10);
}
// then writing with same item but numbers
{
NumberItem item = new NumberItem(getItemName());
assert item != null;
ITEMS.put(getItemName(), item);
item.setState(DecimalType.valueOf("33.14"));
service.store(item);
Thread.sleep(10);
item.setState(DecimalType.valueOf("66.28"));
service.store(item);
Thread.sleep(10);
}
// finally some switch values
{
SwitchItem item = new SwitchItem(getItemName());
assert item != null;
ITEMS.put(getItemName(), item);
item.setState(OnOffType.ON);
service.store(item);
Thread.sleep(10);
item.setState(OnOffType.OFF);
service.store(item);
Thread.sleep(10);
}
// Player
{
PlayerItem item = new PlayerItem(getItemName());
assert item != null;
ITEMS.put(getItemName(), item);
item.setState(PlayPauseType.PAUSE);
service.store(item);
Thread.sleep(10);
}
} catch (InterruptedException e) {
fail("interrupted");
}
}
/**
* Test where first data is stored with various item types, some serialized as numbers and some as strings.
*
* - Querying with NumberItem returns data that have been persisted using DynamoDBBigDecimalItem DTO, that is,
* NumberItem and SwitchItem data
* - Querying with StringItem returns data that have been persisted using DynamoDBStringItem DTO (StringItem and
* PlayerItem)
* - Querying with SwitchItem returns data that have been persisted using DynamoDBBigDecimalItem DTO. All numbers
* are converted to OnOff (ON if nonzero)
* - Querying with PlayerItem returns data that have been persisted using DynamoDBStringItem DTO. However, values
* that are not convertible to PlayPauseType are ignored (a warning is logged).
*
*/
@Test
public void testQueryAllItemTypeChanged() {
storeItemWithDifferentTypes();
{
NumberItem item = new NumberItem(getItemName());
ITEMS.put(getItemName(), item);
waitForAssert(() -> {
assertQueryAll(getItemName(), expectedNumberItem());
});
}
{
SwitchItem item = new SwitchItem(getItemName());
ITEMS.put(getItemName(), item);
waitForAssert(() -> {
assertQueryAll(getItemName(), expectedSwitchItem());
});
}
{
StringItem item = new StringItem(getItemName());
ITEMS.put(getItemName(), item);
waitForAssert(() -> {
assertQueryAll(getItemName(), expectedStringItem());
});
}
{
PlayerItem item = new PlayerItem(getItemName());
assert item != null;
ITEMS.put(getItemName(), item);
waitForAssert(() -> {
assertQueryAll(getItemName(), expectedPlayerItem());
});
}
}
protected PlayPauseType[] expectedPlayerItem() {
return new PlayPauseType[] { /* ON=1=PLAY */PlayPauseType.PLAY, /* OFF=0=PAUSE */PlayPauseType.PAUSE,
PlayPauseType.PAUSE };
}
protected StringType[] expectedStringItem() {
return new StringType[] { StringType.valueOf("a1"), StringType.valueOf("b1") };
}
protected OnOffType[] expectedSwitchItem() {
return new OnOffType[] { /* 33.14 */OnOffType.ON, /* 66.28 */ OnOffType.ON, OnOffType.ON, OnOffType.OFF,
/* pause */ OnOffType.OFF };
}
protected DecimalType[] expectedNumberItem() {
return new DecimalType[] { DecimalType.valueOf("33.14"), DecimalType.valueOf("66.28"),
/* on */DecimalType.valueOf("1"), /* off */DecimalType.valueOf("0"),
/* pause */DecimalType.valueOf("0") };
}
private void assertQueryAll(String item, State[] expectedStates) {
FilterCriteria criteria = new FilterCriteria();
criteria.setOrdering(Ordering.ASCENDING);
criteria.setItemName(item);
@SuppressWarnings("null")
Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
List<State> actualStatesList = new ArrayList<>();
iterable.forEach(i -> actualStatesList.add(i.getState()));
State[] actualStates = actualStatesList.toArray(new State[0]);
assertArrayEquals(expectedStates, actualStates, Arrays.toString(actualStates));
}
}
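The javadoc above describes how history written under one item type is coerced when read back through another. A sketch of the numeric-path rule as stated in the comments (nonzero maps to ON, zero to OFF); the rule is taken from the test expectations, not from the converter implementation:

```java
import java.math.BigDecimal;

public class MixedTypeCoercionSketch {

    // Assumed rule when numeric history is queried through a SwitchItem:
    // any nonzero stored number becomes ON, zero becomes OFF.
    static String numberAsOnOff(BigDecimal stored) {
        return stored.signum() != 0 ? "ON" : "OFF";
    }

    public static void main(String[] args) {
        // Mirrors expectedSwitchItem() above: 33.14, 66.28 and 1 map to ON, 0 to OFF.
        BigDecimal[] stored = { new BigDecimal("33.14"), new BigDecimal("66.28"),
                BigDecimal.ONE, BigDecimal.ZERO };
        for (BigDecimal value : stored) {
            System.out.println(value + " -> " + numberAsOnOff(value));
        }
    }
}
```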


@ -26,10 +26,17 @@
# OPTION 2 (using profilesConfigFile and profile) # OPTION 2 (using profilesConfigFile and profile)
# where profilesConfigFile points to AWS credentials file # where profilesConfigFile points to AWS credentials file
# Please note that the user that runs openHAB must have approriate read rights to the credential file. # Please note that the user that runs openHAB must have approriate read rights to the credential file.
# See below for an example how the credentials file should look like
#profilesConfigFile=/etc/openhab2/aws_creds #profilesConfigFile=/etc/openhab2/aws_creds
#profile=fooprofile #profile=fooprofile
#region=eu-west-1 #region=eu-west-1
# UNCOMMENT THE BELOW ALWAYS (otherwise legacy table schema with 'tablePrefix' is used)
#table=openhab
# Credentials file example: # Credentials file example:
# #
# [fooprofile] # [fooprofile]
@ -41,11 +48,16 @@
# ADVANCED CONFIGURATION (OPTIONAL) # ADVANCED CONFIGURATION (OPTIONAL)
# #
# Expire time for data in days (relative to stored timestamp).
# Data older than this is removed automatically using DynamoDB Time to Live (TTL)
# feature.
#expireDays=
# read capacity for the created tables # read capacity for the created tables
#readCapacityUnits=1 #readCapacityUnits=1
# write capacity for the created tables # write capacity for the created tables
#writeCapacityUnits=1 #writeCapacityUnits=1
# table prefix used in the name of created tables # LEGACY SCHEMA: table prefix used in the name of created tables
#tablePrefix=openhab- #tablePrefix=openhab-
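DynamoDB's TTL feature works by storing an expiry timestamp, in epoch seconds, in an attribute on each item; the service deletes items once that timestamp is in the past. A hypothetical sketch of how an `expireDays` setting translates into such a timestamp (the helper and its name are illustrative, not the add-on's actual code):

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class TtlExpirySketch {

    // Illustrative only: compute the epoch-second value a TTL attribute
    // would hold for an item written at writeTime with the given expireDays.
    static long expiryEpochSeconds(Instant writeTime, int expireDays) {
        return writeTime.plus(expireDays, ChronoUnit.DAYS).getEpochSecond();
    }

    public static void main(String[] args) {
        Instant writeTime = Instant.parse("2021-01-01T00:00:00Z");
        // 30 days after 2021-01-01T00:00:00Z, as epoch seconds
        System.out.println(expiryEpochSeconds(writeTime, 30)); // 1612051200
    }
}
```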