[ChatGPT] Enhance binding (#17320)

Signed-off-by: Artur-Fedjukevits <fedjukevitsh@gmail.com>
Artur-Fedjukevits committed 2024-11-09 23:08:42 +01:00 (committed by GitHub)
parent 7cab153ebf
commit 5929ef821a
18 changed files with 1406 additions and 91 deletions


@@ -1,9 +1,16 @@

# ChatGPT Binding

The openHAB ChatGPT Binding allows openHAB to communicate with the ChatGPT language model provided by OpenAI and to manage the openHAB system via [Function calling](https://platform.openai.com/docs/guides/function-calling).
ChatGPT is a powerful natural language processing (NLP) tool that can be used to understand and respond to a wide range of text-based commands and questions.

With this binding, users can:

- Control openHAB devices: Manage lights, climate systems, media players, and more with natural language commands.
- Use multi-language support: Issue commands in almost any language, enhancing accessibility.
- Engage in conversations: Have casual conversations, ask questions, and receive informative responses.
- Use extended capabilities: Utilize all other functionalities of ChatGPT, from composing creative content to answering complex questions.

This integration significantly enhances the user experience, providing seamless control over smart home environments and access to the full range of ChatGPT's capabilities.

## Supported Things
@@ -11,14 +18,22 @@ The binding supports a single thing type `account`, which corresponds to the Ope

## Thing Configuration

The `account` thing requires the API key that allows accessing the account.
API keys can be created and managed under <https://platform.openai.com/account/api-keys>.
| Name             | Type    | Description                                                                                                                                                                                                                                                                                                                            | Default                                    | Required | Advanced |
|------------------|---------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------|----------|----------|
| apiKey           | text    | The API key to be used for the requests.                                                                                                                                                                                                                                                                                               | N/A                                        | yes      | no       |
| temperature      | decimal | A value between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.                                                                                                                                                                             | 0.5                                        | no       | no       |
| topP             | decimal | A value between 0 and 1. An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend altering this or temperature but not both. | 1.0                                        | no       | yes      |
| apiUrl           | text    | The server API where to reach the AI service.                                                                                                                                                                                                                                                                                          | https://api.openai.com/v1/chat/completions | no       | yes      |
| modelUrl         | text    | The model URL from which to retrieve the available models.                                                                                                                                                                                                                                                                             | https://api.openai.com/v1/models           | no       | yes      |
| model            | text    | The model to be used for the HLI service.                                                                                                                                                                                                                                                                                              | gpt-4o-mini                                | no       | yes      |
| systemMessage    | text    | A description of your openHAB system that helps the AI control your smart home.                                                                                                                                                                                                                                                        | N/A                                        | if HLI   | yes      |
| maxTokens        | decimal | The maximum number of tokens to generate in the completion.                                                                                                                                                                                                                                                                            | 500                                        | no       | yes      |
| keepContext      | decimal | How long the HLI service retains context between requests (in minutes).                                                                                                                                                                                                                                                                | 2                                          | no       | yes      |
| contextThreshold | decimal | Limit on the total tokens included in the context.                                                                                                                                                                                                                                                                                     | 10000                                      | no       | yes      |
| useSemanticModel | boolean | Use the semantic model to determine the location of an item.                                                                                                                                                                                                                                                                           | true                                       | no       | yes      |
The advanced parameters `apiUrl` and `modelUrl` can be used if another ChatGPT-compatible service is used, e.g. a local installation of [LocalAI](https://github.com/go-skynet/LocalAI).
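As an illustrative sketch, a thing configuration pointing at a local ChatGPT-compatible service might look like the following (the host, port, and key value are placeholders and depend on your installation):

```java
Thing chatgpt:account:local [
    apiKey="not-needed-for-local",
    apiUrl="http://localhost:8080/v1/chat/completions",
    modelUrl="http://localhost:8080/v1/models"
]
```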
@@ -33,32 +48,41 @@ It is possible to extend the thing with further channels of type `chat`, so that

Each channel of type `chat` takes the following configuration parameters:
| Name          | Type    | Description                                                                                                                                                                                                                                                                                                                            | Default | Required | Advanced |
|---------------|---------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------|----------|----------|
| model         | text    | The model to be used for the responses.                                                                                                                                                                                                                                                                                                | gpt-4o  | yes      | no       |
| systemMessage | text    | The system message helps set the behavior of the assistant.                                                                                                                                                                                                                                                                            | N/A     | yes      | no       |
| temperature   | decimal | A value between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.                                                                                                                                                                             | 0.5     | no       | yes      |
| topP          | decimal | A value between 0 and 1. An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend altering this or temperature but not both. | 1.0     | no       | yes      |
| maxTokens     | decimal | The maximum number of tokens to generate in the completion.                                                                                                                                                                                                                                                                            | 1000    | no       | yes      |
## Items Configuration

Items to be used by the HLI service must be tagged with the [ "ChatGPT" ] tag.
If no semantic model is set up, you can set the parameter `useSemanticModel` to false.
In this case, the item names must follow the naming convention `<Location>_***`, for example "Kitchen_Light". The labels of the items are expected to briefly describe the item in more detail.
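As a sketch (the item names and labels below are made up for illustration), an items file following these conventions could look like:

```java
// With useSemanticModel=false, the part before the first underscore
// ("Kitchen", "LivingRoom", ...) is interpreted as the location;
// the label briefly describes the item for the AI.
Dimmer  Kitchen_Light        "Kitchen main light"    [ "ChatGPT" ]
Switch  LivingRoom_TV        "Living room TV power"  [ "ChatGPT" ]
Number  Bedroom_Temperature  "Bedroom temperature"   [ "ChatGPT" ]
```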
## Full Example

### Thing Configuration

```java
Thing chatgpt:account:1 [
    apiKey="",
] {
    Channels:
        Type chat : chat "Weather Advice" [
            model="gpt-4o-mini",
            temperature="1.5",
            systemMessage="Answer briefly, in 2-3 sentences max. Behave like Eddie Murphy and give an advice for the day based on the following weather data:"
        ]
        Type chat : morningMessage "Morning Message" [
            model="gpt-4o-mini",
            temperature="0.5",
            systemMessage="You are Marvin, a very depressed robot. You wish a good morning and tell the current time."
        ]
}
```
### Item Configuration

@@ -69,8 +93,14 @@ String Morning_Message { channel="chatgpt:account:1:morningMessage" }
Number Temperature_Forecast_Low
Number Temperature_Forecast_High
Dimmer Kitchen_Dimmer "Kitchen main light" [ "ChatGPT" ]
```
### UI Configuration of the HLI Service
To enable the HLI service, go to Settings -> Voice and choose "ChatGPT Human Language Interpreter".
A text-to-speech service must be configured.
### Example Rules

```java
@@ -106,3 +136,4 @@ and
```

The state updates can be used for a text-to-speech output, and they will give your announcements at home a personal touch.
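As a sketch of such a rule pair (the String item `Weather_Advice` is assumed here to be linked to the `chat` channel from the example above; it does not appear in the original item configuration):

```java
rule "Ask for weather advice"
when
    Time cron "0 0 7 * * ?"
then
    // Sending a command to the linked item triggers a ChatGPT request;
    // the answer later arrives as a state update on the same item.
    Weather_Advice.sendCommand("Low: " + Temperature_Forecast_Low.state + " °C, high: " + Temperature_Forecast_High.state + " °C")
end

rule "Announce weather advice"
when
    Item Weather_Advice changed
then
    // Pass the generated text to the configured text-to-speech service.
    say(Weather_Advice.state.toString)
end
```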


@@ -22,11 +22,13 @@ import org.eclipse.jdt.annotation.NonNullByDefault;
@NonNullByDefault
public class ChatGPTChannelConfiguration {

    public String model = "gpt-4o-mini";
    public Double temperature = 0.5;
    public Double topP = 1.0;
    public String systemMessage = "";
    public int maxTokens = 500;
}


@@ -25,4 +25,12 @@ public class ChatGPTConfiguration {

    public String apiKey = "";
    public String apiUrl = "https://api.openai.com/v1/chat/completions";
    public String modelUrl = "https://api.openai.com/v1/models";
    public boolean useSemanticModel = true;
    public String model = "gpt-4o-mini";
    public Double temperature = 1.0;
    public Integer maxTokens = 1000;
    public Double topP = 1.0;
    public String systemMessage = "";
    public Integer keepContext = 2;
    public Integer contextThreshold = 10000;
}


@@ -12,8 +12,6 @@
 */
package org.openhab.binding.chatgpt.internal;

import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
@@ -29,7 +27,10 @@ import org.eclipse.jetty.client.api.Request;
import org.eclipse.jetty.client.util.StringContentProvider;
import org.eclipse.jetty.http.HttpMethod;
import org.eclipse.jetty.http.HttpStatus;
import org.openhab.binding.chatgpt.internal.dto.ChatMessage;
import org.openhab.binding.chatgpt.internal.dto.ChatRequestBody;
import org.openhab.binding.chatgpt.internal.dto.ChatResponse;
import org.openhab.binding.chatgpt.internal.hli.ChatGPTHLIService;
import org.openhab.core.io.net.http.HttpClientFactory;
import org.openhab.core.library.types.StringType;
import org.openhab.core.thing.Channel;
@@ -40,20 +41,20 @@ import org.openhab.core.thing.ThingStatusDetail;
import org.openhab.core.thing.binding.BaseThingHandler;
import org.openhab.core.thing.binding.ThingHandlerService;
import org.openhab.core.types.Command;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
/**
 * The {@link ChatGPTHandler} is responsible for handling commands, which are
 * sent to one of the channels.
 *
 * @author Kai Kreuzer - Initial contribution
 * @author Artur Fedjukevits - Replaced gson with jackson
 */
@NonNullByDefault
public class ChatGPTHandler extends BaseThingHandler {
@@ -62,14 +63,11 @@ public class ChatGPTHandler extends BaseThingHandler {

    private final Logger logger = LoggerFactory.getLogger(ChatGPTHandler.class);

    private HttpClient httpClient;
    private @Nullable ChatGPTConfiguration config;
    private String apiKey = "";
    private String apiUrl = "";
    private String modelUrl = "";

    private String lastPrompt = "";
    private List<String> models = List.of();

    public ChatGPTHandler(Thing thing, HttpClientFactory httpClientFactory) {
@@ -79,55 +77,101 @@ public class ChatGPTHandler extends BaseThingHandler {

    @Override
    public void handleCommand(ChannelUID channelUID, Command command) {
        if (command instanceof StringType stringCommand) {
            lastPrompt = stringCommand.toFullString();
            String queryJson = prepareRequestBody(channelUID);
            if (queryJson != null) {
                String response = sendPrompt(queryJson);
                processChatResponse(channelUID, response);
            }
        }
    }
    private void processChatResponse(ChannelUID channelUID, @Nullable String response) {
        if (response != null) {
            ObjectMapper objectMapper = new ObjectMapper();
            ChatResponse chatResponse;
            try {
                chatResponse = objectMapper.readValue(response, ChatResponse.class);
            } catch (JsonProcessingException e) {
                logger.error("Failed to parse ChatGPT response: {}", e.getMessage(), e);
                return;
            }
            if (chatResponse != null) {
                String finishReason = chatResponse.getChoices().get(0).getFinishReason();
                if ("length".equals(finishReason)) {
                    logger.warn("Token length exceeded. Increase maximum token limit to avoid the issue.");
                    return;
                }
                @Nullable
                ChatMessage chatResponseMessage = chatResponse.getChoices().get(0).getChatMessage();
                if (chatResponseMessage == null) {
                    logger.error("ChatGPT response does not contain a message.");
                    return;
                }
                @Nullable
                String msg = chatResponseMessage.getContent();
                if (msg != null) {
                    updateState(channelUID, new StringType(msg));
                }
            } else {
                logger.warn("Didn't receive any response from ChatGPT - this is unexpected.");
            }
        }
    }
    private @Nullable String prepareRequestBody(ChannelUID channelUID) {
        Channel channel = getThing().getChannel(channelUID);
        if (channel == null) {
            logger.error("Channel with UID '{}' cannot be found on Thing '{}'.", channelUID, getThing().getUID());
            return null;
        }
        ChatGPTChannelConfiguration channelConfig = channel.getConfiguration().as(ChatGPTChannelConfiguration.class);

        List<ChatMessage> messages = new ArrayList<>();

        ChatMessage systemMessage = new ChatMessage();
        systemMessage.setRole(ChatMessage.Role.SYSTEM.value());
        systemMessage.setContent(channelConfig.systemMessage);
        messages.add(systemMessage);

        ChatMessage userMessage = new ChatMessage();
        userMessage.setRole(ChatMessage.Role.USER.value());
        userMessage.setContent(lastPrompt);
        messages.add(userMessage);

        ChatRequestBody chatRequestBody = new ChatRequestBody();
        chatRequestBody.setModel(channelConfig.model);
        chatRequestBody.setTemperature(channelConfig.temperature);
        chatRequestBody.setMaxTokens(channelConfig.maxTokens);
        chatRequestBody.setTopP(channelConfig.topP);
        chatRequestBody.setMessages(messages);

        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.setSerializationInclusion(Include.NON_NULL);

        try {
            return objectMapper.writeValueAsString(chatRequestBody);
        } catch (JsonProcessingException e) {
            logger.error("Failed to serialize ChatGPT request: {}", e.getMessage(), e);
            return null;
        }
    }

    public @Nullable String sendPrompt(String queryJson) {
        Request request = httpClient.newRequest(apiUrl).method(HttpMethod.POST)
                .timeout(REQUEST_TIMEOUT_MS, TimeUnit.MILLISECONDS).header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + apiKey).content(new StringContentProvider(queryJson));
@@ -150,9 +194,13 @@ public class ChatGPTHandler extends BaseThingHandler {
        }
    }

    public @Nullable ChatGPTConfiguration getConfigAs() {
        return this.config;
    }

    @Override
    public void initialize() {
        this.config = getConfigAs(ChatGPTConfiguration.class);

        String apiKey = config.apiKey;
@@ -175,19 +223,28 @@ public class ChatGPTHandler extends BaseThingHandler {
            ContentResponse response = request.send();
            if (response.getStatus() == 200) {
                updateStatus(ThingStatus.ONLINE);
                ObjectMapper objectMapper = new ObjectMapper();
                try {
                    JsonNode models = objectMapper.readTree(response.getContentAsString());
                    JsonNode data = models.get("data");
                    if (data != null) {
                        logger.debug("Models: {}", data.toString());
                        List<String> modelList = new ArrayList<>();
                        data.forEach(model -> {
                            JsonNode id = model.get("id");
                            if (id != null) {
                                modelList.add(id.asText());
                            }
                        });
                        this.models = List.copyOf(modelList);
                    } else {
                        updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.COMMUNICATION_ERROR,
                                "@text/offline.communication-error");
                    }
                } catch (JsonProcessingException e) {
                    logger.warn("Failed to parse models: {}", e.getMessage(), e);
                }
            } else {
                updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.COMMUNICATION_ERROR,
@@ -205,6 +262,6 @@ public class ChatGPTHandler extends BaseThingHandler {

    @Override
    public Collection<Class<? extends ThingHandlerService>> getServices() {
        return List.of(ChatGPTModelOptionProvider.class, ChatGPTHLIService.class);
    }
}


@@ -0,0 +1,78 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.dto;
import java.util.function.Function;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
/**
* @author Artur Fedjukevits - Initial contribution
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class ChatFunction {
private String name;
private String description;
private Parameters parameters;
@JsonIgnore
private Function<Object, Object> executor;
@JsonIgnore
private Class<?> parametersClass;
public ChatFunction() {
}
public String getName() {
return name;
}
public String getDescription() {
return description;
}
public Parameters getParameters() {
return parameters;
}
public Function<Object, Object> getExecutor() {
return executor;
}
public Class<?> getParametersClass() {
return parametersClass;
}
public void setName(String name) {
this.name = name;
}
public void setDescription(String description) {
this.description = description;
}
public void setExecutor(Function<Object, Object> executor) {
this.executor = executor;
}
public void setParameters(Parameters requestClass) {
this.parameters = requestClass;
}
public void setParametersClass(Class<?> parametersClass) {
this.parametersClass = parametersClass;
}
}


@@ -0,0 +1,43 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.dto;
import org.eclipse.jdt.annotation.Nullable;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
/**
* @author Artur Fedjukevits - Initial contribution
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class ChatFunctionCall {
private @Nullable String name;
private @Nullable String arguments;
public String getName() {
return name;
}
public String getArguments() {
return arguments;
}
public void setName(String name) {
this.name = name;
}
public void setArguments(String arguments) {
this.arguments = arguments;
}
}


@@ -0,0 +1,104 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.dto;
import java.util.List;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
/**
* @author Artur Fedjukevits - Initial contribution
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class ChatMessage {
public enum Role {
USER("user"),
ASSISTANT("assistant"),
SYSTEM("system"),
TOOL("tool");
private final String value;
Role(String value) {
this.value = value;
}
public String value() {
return value;
}
}
private String role;
private String content;
@JsonProperty("tool_call_id")
private String toolCallId;
private String name;
@JsonProperty("function_call")
ChatFunctionCall functionCall;
@JsonProperty("tool_calls")
List<ChatToolCalls> toolCalls;
public String getRole() {
return role;
}
public String getContent() {
return content;
}
public ChatFunctionCall getFunctionCall() {
return functionCall;
}
public List<ChatToolCalls> getToolCalls() {
return toolCalls;
}
public String getToolCallId() {
return toolCallId;
}
public String getName() {
return name;
}
public void setRole(String role) {
this.role = role;
}
public void setContent(String content) {
this.content = content;
}
public void setFunctionCall(ChatFunctionCall functionCall) {
this.functionCall = functionCall;
}
public void setToolCalls(List<ChatToolCalls> toolCalls) {
this.toolCalls = toolCalls;
}
public void setToolCallId(String toolCallId) {
this.toolCallId = toolCallId;
}
public void setName(String name) {
this.name = name;
}
}


@@ -0,0 +1,101 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.dto;
import java.util.List;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
/**
* @author Artur Fedjukevits - Initial contribution
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class ChatRequestBody {
private String model;
private List<ChatMessage> messages;
private Double temperature;
@JsonProperty("top_p")
private Double topP;
@JsonProperty("max_tokens")
private Integer maxTokens;
private String user;
private List<ChatTools> tools;
@JsonProperty("tool_choice")
private String toolChoice;
public String getModel() {
return model;
}
public List<ChatMessage> getMessages() {
return messages;
}
public Double getTemperature() {
return temperature;
}
public Double getTopP() {
return topP;
}
public Integer getMaxTokens() {
return maxTokens;
}
public String getUser() {
return user;
}
public List<ChatTools> getTools() {
return tools;
}
public String getToolChoice() {
return toolChoice;
}
public void setModel(String model) {
this.model = model;
}
public void setMessages(List<ChatMessage> messages) {
this.messages = messages;
}
public void setTemperature(Double temperature) {
this.temperature = temperature;
}
public void setTopP(Double topP) {
this.topP = topP;
}
public void setMaxTokens(Integer maxTokens) {
this.maxTokens = maxTokens;
}
public void setUser(String user) {
this.user = user;
}
public void setTools(List<ChatTools> tools) {
this.tools = tools;
}
public void setToolChoice(String toolChoice) {
this.toolChoice = toolChoice;
}
}


@ -14,14 +14,17 @@ package org.openhab.binding.chatgpt.internal.dto;
import java.util.List; import java.util.List;
import com.google.gson.annotations.SerializedName; import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
/** /**
* This is a dto used for parsing the JSON response from ChatGPT. * This is a dto used for parsing the JSON response from ChatGPT.
* *
* @author Kai Kreuzer - Initial contribution * @author Kai Kreuzer - Initial contribution
* @author Artur Fedjukevits - Added fields and edited the class
* *
*/ */
@JsonIgnoreProperties(ignoreUnknown = true)
public class ChatResponse { public class ChatResponse {
private List<Choice> choices; private List<Choice> choices;
@ -29,6 +32,7 @@ public class ChatResponse {
private String object; private String object;
private int created; private int created;
private String model; private String model;
private Usage usage;
public List<Choice> getChoices() { public List<Choice> getChoices() {
return choices; return choices;
@ -50,15 +54,46 @@ public class ChatResponse {
return model; return model;
} }
public static class Choice { public Usage getUsage() {
private Message message; return usage;
}
@SerializedName("finish_reason") public void setChoices(List<Choice> choices) {
this.choices = choices;
}
public void setId(String id) {
this.id = id;
}
public void setObject(String object) {
this.object = object;
}
public void setCreated(int created) {
this.created = created;
}
public void setModel(String model) {
this.model = model;
}
public void setUsage(Usage usage) {
this.usage = usage;
}
@JsonIgnoreProperties(ignoreUnknown = true)
public static class Choice {
@JsonProperty("message")
private ChatMessage chatMessage;
@JsonProperty("finish_reason")
private String finishReason; private String finishReason;
private int index; private int index;
public Message getMessage() { public ChatMessage getChatMessage() {
return message; return chatMessage;
} }
public String getFinishReason() { public String getFinishReason() {
@ -68,18 +103,54 @@ public class ChatResponse {
public int getIndex() { public int getIndex() {
return index; return index;
} }
}
public static class Message { public void setChatMessage(ChatMessage chatMessage) {
private String role; this.chatMessage = chatMessage;
private String content;
public String getRole() {
return role;
} }
public String getContent() { public void setFinishReason(String finishReason) {
return content; this.finishReason = finishReason;
}
public void setIndex(int index) {
this.index = index;
}
}
@JsonIgnoreProperties(ignoreUnknown = true)
public static class Usage {
@JsonProperty("prompt_tokens")
private int promptTokens;
@JsonProperty("completion_tokens")
private int completionTokens;
@JsonProperty("total_tokens")
private int totalTokens;
public int getPromptTokens() {
return promptTokens;
}
public int getCompletionTokens() {
return completionTokens;
}
public int getTotalTokens() {
return totalTokens;
}
public void setPromptTokens(int promptTokens) {
this.promptTokens = promptTokens;
}
public void setCompletionTokens(int completionTokens) {
this.completionTokens = completionTokens;
}
public void setTotalTokens(int totalTokens) {
this.totalTokens = totalTokens;
}
}
}


@ -0,0 +1,50 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.dto;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
/**
* @author Artur Fedjukevits - Initial contribution
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class ChatToolCalls {
String id;
ChatFunctionCall function;
String type;
public String getId() {
return id;
}
public ChatFunctionCall getFunction() {
return function;
}
public String getType() {
return type;
}
public void setId(String id) {
this.id = id;
}
public void setFunction(ChatFunctionCall function) {
this.function = function;
}
public void setType(String type) {
this.type = type;
}
}


@ -0,0 +1,44 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.dto;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
/**
* @author Artur Fedjukevits - Initial contribution
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class ChatTools {
private String type;
private ChatFunction function;
public ChatTools() {
}
public String getType() {
return type;
}
public ChatFunction getFunction() {
return function;
}
public void setType(String type) {
this.type = type;
}
public void setFunction(ChatFunction function) {
this.function = function;
}
}


@ -0,0 +1,75 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.dto;
import java.util.List;
import java.util.Map;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
/**
* @author Artur Fedjukevits - Initial contribution
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class Parameters {
private String type;
private Map<String, Property> properties;
private List<String> required;
public String getType() {
return type;
}
public void setType(String type) {
this.type = type;
}
public Map<String, Property> getProperties() {
return properties;
}
public void setProperties(Map<String, Property> properties) {
this.properties = properties;
}
public List<String> getRequired() {
return required;
}
public void setRequired(List<String> required) {
this.required = required;
}
public static class Property {
private String type;
private String description;
public String getType() {
return type;
}
public String getDescription() {
return description;
}
public void setType(String type) {
this.type = type;
}
public void setDescription(String description) {
this.description = description;
}
}
}


@ -0,0 +1,32 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.dto;
/**
* @author Artur Fedjukevits - Initial contribution
*/
public enum ToolChoice {
NONE("none"),
AUTO("auto");
private final String value;
ToolChoice(String value) {
this.value = value;
}
public String value() {
return value;
}
}


@ -0,0 +1,52 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.dto.functions;
import com.fasterxml.jackson.annotation.JsonProperty;
/**
* @author Artur Fedjukevits - Initial contribution
*/
public class ItemsControl {
@JsonProperty("name")
private String name;
@JsonProperty("type")
private String type;
@JsonProperty("state")
private String state;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getType() {
return type;
}
public void setType(String type) {
this.type = type;
}
public String getState() {
return state;
}
public void setState(String state) {
this.state = state;
}
}


@ -0,0 +1,24 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.hli;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
* @author Artur Fedjukevits - Initial contribution
*/
@NonNullByDefault
public class ChatGPTHLIConstants {
public static final String SERVICE_ID = "chatgpthli";
}


@ -0,0 +1,427 @@
/**
* Copyright (c) 2010-2024 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.chatgpt.internal.hli;
import static org.openhab.binding.chatgpt.internal.hli.ChatGPTHLIConstants.SERVICE_ID;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.time.LocalTime;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.HashMap;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.openhab.binding.chatgpt.internal.ChatGPTConfiguration;
import org.openhab.binding.chatgpt.internal.ChatGPTHandler;
import org.openhab.binding.chatgpt.internal.dto.ChatFunction;
import org.openhab.binding.chatgpt.internal.dto.ChatFunctionCall;
import org.openhab.binding.chatgpt.internal.dto.ChatMessage;
import org.openhab.binding.chatgpt.internal.dto.ChatRequestBody;
import org.openhab.binding.chatgpt.internal.dto.ChatResponse;
import org.openhab.binding.chatgpt.internal.dto.ChatToolCalls;
import org.openhab.binding.chatgpt.internal.dto.ChatTools;
import org.openhab.binding.chatgpt.internal.dto.ToolChoice;
import org.openhab.binding.chatgpt.internal.dto.functions.ItemsControl;
import org.openhab.core.events.EventPublisher;
import org.openhab.core.items.Item;
import org.openhab.core.items.ItemNotFoundException;
import org.openhab.core.items.ItemRegistry;
import org.openhab.core.items.events.ItemEventFactory;
import org.openhab.core.library.items.RollershutterItem;
import org.openhab.core.library.items.SwitchItem;
import org.openhab.core.library.types.OnOffType;
import org.openhab.core.library.types.UpDownType;
import org.openhab.core.model.script.actions.Semantics;
import org.openhab.core.thing.binding.ThingHandler;
import org.openhab.core.thing.binding.ThingHandlerService;
import org.openhab.core.types.Command;
import org.openhab.core.types.CommandDescription;
import org.openhab.core.types.CommandOption;
import org.openhab.core.types.TypeParser;
import org.openhab.core.voice.text.HumanLanguageInterpreter;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
/**
* The {@link ChatGPTHLIService} is responsible for handling the human language interpretation using ChatGPT.
*
* @author Artur Fedjukevits - Initial contribution
*/
@Component(service = { ChatGPTHLIService.class, HumanLanguageInterpreter.class })
@NonNullByDefault
public class ChatGPTHLIService implements ThingHandlerService, HumanLanguageInterpreter {
private @Nullable ThingHandler thingHandler;
private List<ChatMessage> messages = new ArrayList<>();
private LocalTime lastMessageTime = LocalTime.now();
private List<ChatTools> tools = new ArrayList<>();
private final Logger logger = LoggerFactory.getLogger(ChatGPTHLIService.class);
private final Map<String, ChatFunction> functions = new HashMap<>();
private @Nullable ItemRegistry itemRegistry;
private @Nullable EventPublisher eventPublisher;
private @Nullable ChatGPTConfiguration config;
@Activate
public ChatGPTHLIService(@Reference ItemRegistry itemRegistry, @Reference EventPublisher eventPublisher) {
this.itemRegistry = itemRegistry;
this.eventPublisher = eventPublisher;
try (InputStream is = Thread.currentThread().getContextClassLoader().getResourceAsStream("/json/tools.json");
InputStreamReader reader = new InputStreamReader(is, StandardCharsets.UTF_8)) {
ObjectMapper mapper = new ObjectMapper();
JsonNode node = mapper.readTree(reader);
try {
this.tools = Arrays.asList(mapper.treeToValue(node, ChatTools[].class));
} catch (JsonProcessingException e) {
logger.debug("Error processing tools.json", e);
}
} catch (IOException e) {
logger.error("Error reading tools.json", e);
}
for (ChatTools tool : tools) {
logger.debug("Loaded tool: {}", tool.getFunction().getName());
}
functions.clear();
functions.putAll(tools.stream().collect(HashMap::new, (map, tool) -> {
ChatFunction function = tool.getFunction();
String functionName = function.getName();
map.put(functionName, function);
}, HashMap::putAll));
ChatFunction itemControlFunction = functions.get("items_control");
if (itemControlFunction != null) {
itemControlFunction.setParametersClass(ItemsControl.class);
itemControlFunction.setExecutor(p -> {
ItemsControl parameters = (ItemsControl) p;
return sendCommand(parameters.getName(), parameters.getState());
});
}
logger.debug("ChatGPTHLIService activated");
}
@Override
public String getId() {
return SERVICE_ID;
}
@Override
public Set<String> getSupportedGrammarFormats() {
return Set.of();
}
@Override
public String interpret(Locale locale, String text) {
String requestBody = prepareRequestBody(text);
if (requestBody == null) {
return "Failed to prepare request body";
}
if (thingHandler instanceof ChatGPTHandler chatGPTHandler) {
String response = chatGPTHandler.sendPrompt(requestBody);
return processChatResponse(response);
}
return "Failed to interpret text";
}
@Override
public Set<Locale> getSupportedLocales() {
return Set.of();
}
@Override
public String getLabel(@Nullable Locale locale) {
return "ChatGPT Human Language Interpreter";
}
@Override
@Nullable
public String getGrammar(Locale locale, String format) {
return null;
}
@Override
public void setThingHandler(ThingHandler handler) {
this.thingHandler = handler;
}
@Override
public @Nullable ThingHandler getThingHandler() {
return thingHandler;
}
@Override
public void activate() {
}
private String processChatResponse(@Nullable String response) {
if (response == null || response.isEmpty()) {
return "";
}
logger.trace("Received response: {}", response);
ObjectMapper objectMapper = new ObjectMapper();
ChatResponse chatResponse;
try {
chatResponse = objectMapper.readValue(response, ChatResponse.class);
} catch (JsonProcessingException e) {
logger.debug("Failed to parse ChatGPT response: {}", e.getMessage(), e);
return "";
}
if (chatResponse == null) {
logger.warn("Didn't receive any response from ChatGPT - this is unexpected.");
return "";
}
this.lastMessageTime = LocalTime.now();
if (chatResponse.getUsage().getTotalTokens() > this.config.contextThreshold) {
Integer lastUserMessageIndex = null;
for (int i = messages.size() - 1; i >= 0; i--) {
if (messages.get(i).getRole().equals(ChatMessage.Role.USER.value())) {
lastUserMessageIndex = i;
break;
}
}
if (lastUserMessageIndex != null) {
messages.subList(1, lastUserMessageIndex).clear();
messages.set(0, generateSystemMessage());
}
}
String finishReason = chatResponse.getChoices().get(0).getFinishReason();
if ("length".equals(finishReason)) {
logger.warn("Token length exceeded. Increase the maximum token limit to avoid the issue.");
return "";
}
@Nullable
ChatMessage chatResponseMessage = chatResponse.getChoices().get(0).getChatMessage();
if (chatResponseMessage == null) {
logger.debug("ChatGPT response does not contain a message.");
return "";
}
this.messages.add(chatResponseMessage);
if ("tool_calls".equals(finishReason)) {
executeToolCalls(chatResponseMessage.getToolCalls());
return "";
} else {
return (chatResponseMessage.getContent() == null) ? "" : chatResponseMessage.getContent();
}
}
private void executeToolCalls(@Nullable List<ChatToolCalls> toolCalls) {
if (toolCalls == null) {
return;
}
toolCalls.forEach(tool -> {
if (tool.getType().equals("function")) {
ChatFunctionCall functionCall = tool.getFunction();
if (functionCall != null) {
String functionName = functionCall.getName();
ChatFunction function = functions.get(functionName);
if (function != null) {
ObjectMapper objectMapper = new ObjectMapper();
String arguments = functionCall.getArguments();
Object argumentsObject;
logger.debug("Function '{}' with arguments: {}", functionName, arguments);
JsonNode argumentsNode;
try {
argumentsNode = objectMapper.readTree(arguments);
Class<?> parametersClass = function.getParametersClass();
argumentsObject = objectMapper.treeToValue(argumentsNode, parametersClass);
} catch (JsonProcessingException e) {
logger.debug("Failed to parse arguments: {}", e.getMessage(), e);
return;
}
Object result = function.getExecutor().apply(argumentsObject);
String resultString = String.valueOf(result);
ChatMessage message = new ChatMessage();
message.setRole(ChatMessage.Role.TOOL.value());
message.setName(functionName);
message.setToolCallId(tool.getId());
message.setContent(resultString);
messages.add(message);
} else {
logger.debug("Function '{}' not found", functionName);
}
}
}
});
}
private @Nullable String prepareRequestBody(String message) {
if (this.config == null) {
if (thingHandler instanceof ChatGPTHandler chatGPTHandler) {
this.config = chatGPTHandler.getConfigAs();
}
}
if (this.config == null) {
logger.debug("Could not get configuration");
return null;
}
LocalTime currentTime = LocalTime.now();
if (currentTime.isAfter(this.lastMessageTime.plusMinutes(config.keepContext))) {
this.messages.clear();
}
if (this.messages.isEmpty()) {
ChatMessage systemMessage = generateSystemMessage();
this.messages.add(systemMessage);
}
this.lastMessageTime = currentTime;
ChatMessage userMessage = new ChatMessage();
userMessage.setRole(ChatMessage.Role.USER.value());
userMessage.setContent(message);
this.messages.add(userMessage);
ChatRequestBody chatRequestBody = new ChatRequestBody();
if (this.config.model == null || this.config.model.isEmpty()) {
logger.debug("Model is not set");
return null;
}
chatRequestBody.setModel(this.config.model);
chatRequestBody.setTemperature(this.config.temperature);
chatRequestBody.setMaxTokens(this.config.maxTokens);
chatRequestBody.setTopP(this.config.topP);
chatRequestBody.setToolChoice(ToolChoice.AUTO.value());
chatRequestBody.setTools(this.tools);
chatRequestBody.setMessages(this.messages);
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.setSerializationInclusion(Include.NON_NULL);
try {
return objectMapper.writeValueAsString(chatRequestBody);
} catch (JsonProcessingException e) {
logger.debug("Failed to serialize ChatGPT request: {}", e.getMessage(), e);
return null;
}
}
private ChatMessage generateSystemMessage() {
ChatMessage systemMessage = new ChatMessage();
systemMessage.setRole(ChatMessage.Role.SYSTEM.value());
StringBuilder content = new StringBuilder();
content.append(this.config.systemMessage);
Collection<Item> openaiItems = itemRegistry.getItemsByTag("ChatGPT");
if (!openaiItems.isEmpty()) {
openaiItems.forEach(item -> {
String location = "";
String itemType = item.getType();
CommandDescription description = item.getCommandDescription();
List<CommandOption> options = new ArrayList<>();
if (description != null) {
options = description.getCommandOptions();
}
content.append("name: \"").append(item.getName()).append("\", type: \"").append(itemType);
if (config.useSemanticModel) {
Item locationItem = Semantics.getLocation(item);
if (locationItem != null) {
location = locationItem.getName();
}
} else {
String[] nameParts = item.getName().split("_");
location = nameParts[0];
}
if (!location.isEmpty()) {
content.append("\", location: \"").append(location);
}
if (!options.isEmpty()) {
content.append("\", accepted commands: \"");
options.forEach(option -> {
content.append(option.getCommand()).append(", ");
});
content.delete(content.length() - 2, content.length());
content.append("\"").append(System.lineSeparator());
}
content.append("\", description: \"").append(item.getLabel()).append("\", state: \"")
.append(item.getState().toString()).append("\"").append(System.lineSeparator());
});
}
systemMessage.setContent(content.toString());
return systemMessage;
}
public String sendCommand(String itemName, String commandString) {
try {
Item item = itemRegistry.getItem(itemName);
Command command = null;
if ("toggle".equalsIgnoreCase(commandString)
&& (item instanceof SwitchItem || item instanceof RollershutterItem)) {
if (OnOffType.ON.equals(item.getStateAs(OnOffType.class))) {
command = OnOffType.OFF;
} else if (OnOffType.OFF.equals(item.getStateAs(OnOffType.class))) {
command = OnOffType.ON;
} else if (UpDownType.UP.equals(item.getStateAs(UpDownType.class))) {
command = UpDownType.DOWN;
} else if (UpDownType.DOWN.equals(item.getStateAs(UpDownType.class))) {
command = UpDownType.UP;
}
} else {
command = TypeParser.parseCommand(item.getAcceptedCommandTypes(), commandString);
}
if (command != null) {
logger.debug("Received command '{}' for item '{}'", commandString, itemName);
eventPublisher.post(ItemEventFactory.createCommandEvent(itemName, command));
return "Done";
} else {
return "Invalid command";
}
} catch (ItemNotFoundException e) {
logger.warn("Received command '{}' for a non-existent item '{}'", commandString, itemName);
return "Item not found";
}
}
}
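The service above only exposes items tagged "ChatGPT" to the model (see `getItemsByTag("ChatGPT")` in `generateSystemMessage`), and when `useSemanticModel` is disabled it derives the location from the part of the item name before the first underscore. A minimal sketch of matching item definitions — the item names and labels here are illustrative, not part of the commit:

```
// demo.items (illustrative)
Switch        Kitchen_Light      "Kitchen Light"       ["ChatGPT"]
Rollershutter Livingroom_Blinds  "Living Room Blinds"  ["ChatGPT"]
```

With the naming convention above, the binding would report "Kitchen" and "Livingroom" as the locations in the system message it builds for the model.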


@ -14,12 +14,20 @@
</channels>
<config-description>
<parameter-group name="authentication">
<label>Authentication</label>
<description>Authentication for connecting to OpenAI API.</description>
</parameter-group>
<parameter-group name="hli">
<label>HLI Configuration</label>
<description>Configure HLI service.</description>
</parameter-group>
<parameter name="apiKey" type="text" required="true" groupName="authentication">
<context>password</context>
<label>API Key</label>
<description>API key to access the account</description>
</parameter>
<parameter name="apiUrl" type="text" required="false" groupName="authentication">
<label>API URL</label>
<description>The server API where to reach the AI service.</description>
<default>https://api.openai.com/v1/chat/completions</default>
@ -29,7 +37,7 @@
</options>
<limitToOptions>false</limitToOptions>
</parameter>
<parameter name="modelUrl" type="text" required="false" groupName="authentication">
<label>Model URL</label>
<description>The model url where to retrieve the available models from.</description>
<default>https://api.openai.com/v1/models</default>
@ -39,6 +47,79 @@
</options>
<limitToOptions>false</limitToOptions>
</parameter>
<parameter name="model" type="text" required="true" groupName="hli">
<label>Model</label>
<description>ID of the model to use.</description>
<default>gpt-4o-mini</default>
<advanced>true</advanced>
</parameter>
<parameter name="temperature" type="decimal" min="0" max="2" groupName="hli">
<label>Temperature</label>
<description>What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more
random, while lower values like 0.2 will make it more focused and deterministic.</description>
<default>0.5</default>
<advanced>true</advanced>
</parameter>
<parameter name="topP" type="decimal" min="0" max="1" groupName="hli">
<label>TopP</label>
<description>
An alternative to sampling with temperature, called nucleus sampling, where the model considers the
results of the
tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability
mass are considered.
We generally recommend altering this or temperature but not both.
</description>
<default>1</default>
<advanced>true</advanced>
</parameter>
<parameter name="systemMessage" type="text" groupName="hli">
<label>System Message</label>
<description>Override the default system message of the assistant.</description>
<default>You are the manager of the openHAB smart home. You know how to manage devices in a smart home or provide
their current status. You can also answer a question not related to devices in the house. Or, for example, you can
compose a story upon request.
I will provide information about the smart home; if necessary, you can perform the function; if there is not enough
information to perform it, then clarify briefly, without listing all the available devices and parameters for the
function. If the question is not related to devices in a smart home, then answer the question briefly,
maximum 3 sentences in everyday language.
The name, current status and location of devices is displayed in 'Available devices'.
Use the items_control function only for the requested action, not for providing current states.
Available devices:
</default>
<advanced>true</advanced>
</parameter>
<parameter name="maxTokens" type="integer" groupName="hli">
<label>Max Tokens</label>
<description>The maximum number of tokens that can be generated in the chat completion.</description>
<default>1000</default>
<advanced>true</advanced>
</parameter>
<parameter name="keepContext" type="integer" groupName="hli">
<label>Keep Context</label>
<description>How long to store the context in minutes.</description>
<default>2</default>
<advanced>true</advanced>
</parameter>
<parameter name="contextThreshold" type="integer" groupName="hli">
<label>Context Threshold</label>
<description>Limit total tokens included in context.</description>
<default>10000</default>
<advanced>true</advanced>
</parameter>
<parameter name="useSemanticModel" type="boolean" groupName="hli">
<label>Use Semantic Model</label>
<description>Use a semantic model to determine the location of an item. Otherwise, item names must begin with a
"location_" e.g. "Kitchen_Light"</description>
<default>true</default>
<advanced>true</advanced>
</parameter>
</config-description>
</thing-type>
@ -52,7 +133,7 @@
<label>Model</label>
<description>The model to be used for the responses</description>
<limitToOptions>false</limitToOptions>
<default>gpt-4o-mini</default>
</parameter>
<parameter name="temperature" type="decimal" min="0" max="2">
<label>Temperature</label>
@ -67,7 +148,15 @@
<parameter name="maxTokens" type="decimal">
<label>Maximum Number of Tokens</label>
<description>The maximum number of tokens to generate in the completion.</description>
<default>1000</default>
<advanced>true</advanced>
</parameter>
<parameter name="topP" type="decimal" min="0" max="1">
<label>TopP</label>
<description>An alternative to sampling with temperature, called nucleus sampling, where the model considers the
results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability
mass are considered. We generally recommend altering this or temperature but not both.</description>
<default>1</default>
<advanced>true</advanced>
</parameter>
</config-description> </config-description>


@ -0,0 +1,27 @@
[
{
"type": "function",
"function": {
"name": "items_control",
"description": "Use this function to control an item in openHAB",
"parameters": {
"type": "object",
"properties": {
"name": {
"type": "string",
"description": "Item name in openHAB"
},
"type": {
"type": "string",
"description": "Item type"
},
"state": {
"type": "string",
"description": "New state for item"
}
},
"required": ["type", "name", "state"]
}
}
}
]
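For reference, when the model decides to invoke this tool, the chat completion response carries a `tool_calls` entry that the `ChatToolCalls` and `ChatFunctionCall` DTOs deserialize and `executeToolCalls` dispatches. A simplified sketch of such a fragment (the call ID and argument values are illustrative; the real response wraps this in a `choices` array):

```json
{
  "finish_reason": "tool_calls",
  "message": {
    "role": "assistant",
    "tool_calls": [
      {
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "items_control",
          "arguments": "{\"name\": \"Kitchen_Light\", \"type\": \"Switch\", \"state\": \"ON\"}"
        }
      }
    ]
  }
}
```

Note that `arguments` arrives as a JSON-encoded string, which is why the service first reads it with `objectMapper.readTree` before binding it to the `ItemsControl` parameters class.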