Compare commits

68 commits:

d5561d393a, a8f1a838c1, b530353e54, 271b4f876e, 6816a3e027, bee25a5f13, 3db643cb87, c791395e0e,
0043e4c147, f38047c931, 19cbd5a041, a48394d057, 1871f6c8d2, 066459f14e, 3f14f5cb9e, 4c51a159af,
450012aac5, 00f800c17a, 421f7a533a, 6d9be75ce3, 0886b30032, d308c3a9fa, 38dacf2b97, 700b946acf,
dfe8bcb01e, a8449e8417, f097b3350b, 056e182f64, 00f1fe01bf, 108da0a97e, e5d19ce07d, 464e542a47,
414eb19ffb, 283bc2257b, 198146b5f4, 242653da72, 417b57c99a, ff9360d2a7, c570fbabfa, 7b69de8181,
5377dd81c8, 64f573a369, c31c0280e7, 382d887f56, 92d44eaa6b, c773d5a084, 997195ea29, b25a4619f3,
030b9794bb, bf597c10a5, 0f4d41b466, a44c03fc98, 6a6a89d6d3, 7d56f47c10, aa1376208a, 4f1c3a53be,
d97d4ece43, 476cdf029e, dfcd5de166, 60fc38b1f0, 5b155c7b4c, c0a2a705ec, 125f681bec, 8b4ff6173c,
9273c843d4, d48ddcb151, 5bc3ba8727, 76cb9a19c7
.gitignore (vendored, 2 changes)

@@ -3,4 +3,6 @@ __pycache__
mosquitto/**
homeassistant/**
tsun_proxy/**
system_tests/**
Doku/**
.DS_Store
.vscode/settings.json (vendored, 1 change)

@@ -1,5 +1,6 @@
{
"python.testing.pytestArgs": [
"-vv",
"app","system_tests"
],
"python.testing.unittestEnabled": false,
CHANGELOG.md (41 changes)

@@ -7,14 +7,49 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## [Unreleased]

## [0.1.0] - 2023-10-06

### Removed

- refactoring of the connection classes
- change user id on startup
- register MQTT topics to home assistant, even if we have multiple inverters

-

## [0.0.6] - 2023-10-03

- Bump aiomqtt to version 1.2.1
- Force MQTT registration when the home assistant has set the status to online again
- fix control byte output in tx trace
- dealloc async_stream instances in connection termination

## [0.0.5] - 2023-10-01

- Entity icons updated
- Prints version on start
- Prepare for MQTT component != sensor
- Add MQTT origin

## [0.0.4] - 2023-09-30

- With this patch we ignore the setting 'suggested_area' in config.toml, because it makes no sense with multiple devices. We are looking for a better solution without combining all values into one area again in a later version.

❗Due to the change from one device to multiple devices in the Home Assistant, the previous MQTT device should be deleted in the Home Assistant after the update to pre-release '0.0.4'. Afterwards, the proxy must be restarted again to ensure that the sub-devices are created completely.

### Added

-
- Register multiple devices at home-assistant instead of one for all measurements.
Now we register: a Controller, the inverter and up to 4 input devices to home-assistant.

## [0.0.3] - 2023-09-28

### Added

- Fixes Running Proxy with host UID and GUID #2

## [0.0.2] - 2023-09-27

### Added

- Dockerfile opencontainer labels
- Send voltage and current of inputs to mqtt

## [0.0.1] - 2023-09-25
README.md (64 changes)

@@ -16,9 +16,9 @@

###
# Overview

The "TSUN Gen3 Micro-Inverter" proxy enables a reliable connection between TSUN third generation inverters and an MQTT broker to integrate the inverter into typical home automations.
The "TSUN Gen3 Micro-Inverter" proxy enables a reliable connection between TSUN third generation inverters and an MQTT broker. With the proxy, you can easily retrieve real-time values such as power, current and daily energy and integrate the inverter into typical home automations. This works even without an internet connection. The optional connection to the TSUN Cloud can be disabled!

The inverter establishes a TCP connection to the TSUN Cloud to transmit current measured values every 300 seconds. To be able to forward the measurement data to an MQTT broker, the proxy must be looped into this TCP connection.
In detail, the inverter establishes a TCP connection to the TSUN cloud to transmit current measured values every 300 seconds. To be able to forward the measurement data to an MQTT broker, the proxy must be looped into this TCP connection.

Through this, the inverter then establishes a connection to the proxy and the proxy establishes another connection to the TSUN Cloud. The transmitted data is interpreted by the proxy and then passed on to both the TSUN Cloud and the MQTT broker. The connection to the TSUN Cloud is optional and can be switched off in the configuration (default is on). Then no more data is sent to the Internet, but no more remote updates of firmware and operating parameters (e.g. rated power, grid parameters) are possible.

@@ -47,27 +47,27 @@ If you use a Pi-hole, you can also store the host entry in the Pi-hole.

- A running Docker engine to host the container
- Ability to loop the proxy into the connection between the inverter and the TSUN cloud

## License

This project is licensed under the [BSD 3-clause License](https://opensource.org/licenses/BSD-3-Clause).
###
# Getting Started

Note the aiomqtt library used is based on the paho-mqtt library, which has a dual license. One of the licenses is the so-called [Eclipse Distribution License v1.0](https://www.eclipse.org/org/documents/edl-v10.php). It is almost word-for-word identical to the BSD 3-clause License. The only differences are:

- One use of "COPYRIGHT OWNER" (EDL) instead of "COPYRIGHT HOLDER" (BSD)
- One use of "Eclipse Foundation, Inc." (EDL) instead of "copyright holder" (BSD)

## Versioning

This project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). Breaking changes will only occur in major `X.0.0` releases.

## Contributing

We're very happy to receive contributions to this project! You can get started by reading [CONTRIBUTING.md](https://github.com/s-allius/tsun-gen3-proxy/blob/main/CONTRIBUTING.md).

## Changelog

The changelog lives in [CHANGELOG.md](https://github.com/s-allius/tsun-gen3-proxy/blob/main/CHANGELOG.md). It follows the principles of [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
To run the proxy, you first need to create the image. You can do this quite simply as follows:

```sh
docker build https://github.com/s-allius/tsun-gen3-proxy.git#main:app -t tsun-proxy
```

after that you can run the image:

```sh
docker run --dns '8.8.8.8' --env 'UID=1000' -p '5005:5005' -v ./config:/home/tsun-proxy/config -v ./log:/home/tsun-proxy/log tsun-proxy
```

You will surely see a message that the configuration file was not found. So that we can create this without admin rights, the `uid` must still be adapted. To do this, simply stop the proxy with ctrl-c and use the `id` command to determine your own UserId:

```sh
% id
uid=1050(sallius) gid=20(staff) ...
```

With this information we can customize the `docker run`` statement:

```sh
docker run --dns '8.8.8.8' --env 'UID=1050' -p '5005:5005' -v ./config:/home/tsun-proxy/config -v ./log:/home/tsun-proxy/log tsun-proxy
```

###
# Configuration

@@ -120,3 +120,25 @@ suggested_area = 'balcony' # Optional, suggested installation area for home-a

```

## License

This project is licensed under the [BSD 3-clause License](https://opensource.org/licenses/BSD-3-Clause).

Note the aiomqtt library used is based on the paho-mqtt library, which has a dual license. One of the licenses is the so-called [Eclipse Distribution License v1.0](https://www.eclipse.org/org/documents/edl-v10.php). It is almost word-for-word identical to the BSD 3-clause License. The only differences are:

- One use of "COPYRIGHT OWNER" (EDL) instead of "COPYRIGHT HOLDER" (BSD)
- One use of "Eclipse Foundation, Inc." (EDL) instead of "copyright holder" (BSD)

## Versioning

This project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). Breaking changes will only occur in major `X.0.0` releases.

## Contributing

We're very happy to receive contributions to this project! You can get started by reading [CONTRIBUTING.md](https://github.com/s-allius/tsun-gen3-proxy/blob/main/CONTRIBUTING.md).

## Changelog

The changelog lives in [CHANGELOG.md](https://github.com/s-allius/tsun-gen3-proxy/blob/main/CHANGELOG.md). It follows the principles of [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
@@ -1,3 +1,4 @@
tests/
**/__pycache__
*.pyc
*.pyc
.DS_Store
@@ -1,9 +1,20 @@
ARG SERVICE_NAME="tsun-proxy"
ARG UID=1026
ARG UID=1000
ARG GID=1000

# set base image (host OS)
FROM python:3.11-slim-bookworm AS builder

USER root

# install gosu for a better su+exec command
RUN set -eux; \
apt-get update; \
apt-get install -y gosu; \
rm -rf /var/lib/apt/lists/*; \
# verify that the binary works
gosu nobody true

RUN pip install --upgrade pip

@@ -18,35 +29,44 @@ RUN pip install --user -r requirements.txt
# second unnamed stage
FROM python:3.11-slim-bookworm
ARG SERVICE_NAME
ARG VERSION
ARG UID
ARG GID
ENV VERSION=$VERSION
ENV SERVICE_NAME=$SERVICE_NAME
ENV UID=$UID
ENV GID=$GID

RUN addgroup --gid 1000 $SERVICE_NAME && \
adduser --ingroup $SERVICE_NAME --shell /bin/false --disabled-password --uid $UID $SERVICE_NAME && \
mkdir -p /home/$SERVICE_NAME/log && \
chown $SERVICE_NAME:$SERVICE_NAME /home/$SERVICE_NAME/log && \
mkdir -p /home/$SERVICE_NAME/config && \
chown $SERVICE_NAME:$SERVICE_NAME /home/$SERVICE_NAME/config

# set the working directory in the container
WORKDIR /home/$SERVICE_NAME
USER $SERVICE_NAME

# copy only the dependencies installation from the 1st stage image
COPY --from=builder --chown=$SERVICE_NAME:$SERVICE_NAME /root/.local /home/$SERVICE_NAME/.local

# copy the content of the local src and config directory to the working directory
COPY --chown=$SERVICE_NAME:$SERVICE_NAME config .
COPY --chown=$SERVICE_NAME:$SERVICE_NAME src .

# update PATH environment variable
ENV HOME=/home/$SERVICE_NAME
ENV PATH=/home/$SERVICE_NAME/.local:$PATH

EXPOSE 5005 5005
VOLUME ["/home/$SERVICE_NAME/log", "/home/$SERVICE_NAME/config"]

LABEL de.allius.image.authors="Stefan Allius <stefan.allius@t-online.de>"
# copy only the dependencies installation from the 1st stage image
COPY --from=builder --chown=$SERVICE_NAME:$SERVICE_NAME /root/.local /home/$SERVICE_NAME/.local
COPY --from=builder /usr/sbin/gosu /usr/sbin/gosu

COPY entrypoint.sh /root/entrypoint.sh
RUN chmod +x /root/entrypoint.sh

# copy the content of the local src and config directory to the working directory
COPY config .
COPY src .

EXPOSE 5005

# command to run on container start
CMD [ "python3", "./server.py" ]
ENTRYPOINT ["/root/entrypoint.sh"]
CMD [ "python3", "./server.py" ]

LABEL org.opencontainers.image.authors="Stefan Allius"
LABEL org.opencontainers.image.source https://github.com/s-allius/tsun-gen3-proxy
LABEL org.opencontainers.image.description 'The "TSUN Gen3 Micro-Inverter" proxy enables a reliable connection between TSUN third generation inverters and an MQTT broker to integrate the inverter into typical home automations'
LABEL org.opencontainers.image.licenses="BSD-3-Clause"
LABEL org.opencontainers.image.vendor="Stefan Allius"
app/build.sh (new executable file, 31 lines)

@@ -0,0 +1,31 @@
#!/bin/bash

set -e

BUILD_DATE=$(date -Iminutes)
VERSION=$(git describe --tags --abbrev=0)
VERSION="${VERSION:1}"
arr=(${VERSION//./ })
MAJOR=${arr[0]}
IMAGE=tsun-gen3-proxy

if [[ $1 == dev ]];then
IMAGE=docker.io/sallius/${IMAGE}
VERSION=${VERSION}-dev
elif [[ $1 == rel ]];then
IMAGE=ghcr.io/s-allius/${IMAGE}
else
echo argument missing!
echo try: $0 '[dev|rel]'
exit 1
fi

echo version: $VERSION build-date: $BUILD_DATE image: $IMAGE
if [[ $1 == dev ]];then
docker build --build-arg "VERSION=${VERSION}" --label "org.label-schema.build-date=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" -t ${IMAGE}:latest app
elif [[ $1 == rel ]];then
docker build --no-cache --build-arg "VERSION=${VERSION}" --label "org.label-schema.build-date=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" -t ${IMAGE}:latest -t ${IMAGE}:${MAJOR} -t ${IMAGE}:${VERSION} app
docker push ghcr.io/s-allius/tsun-gen3-proxy:latest
docker push ghcr.io/s-allius/tsun-gen3-proxy:${MAJOR}
docker push ghcr.io/s-allius/tsun-gen3-proxy:${VERSION}
fi
@@ -22,7 +22,7 @@ inverters.allow_all = true # allow inverters, even if we have no inverter mapp
# inverter mapping, maps a `serial_no* to a `mqtt_id` and defines an optional `suggested_place` for `home-assistant`
#
# for each inverter add a block starting with [inverters."<16-digit serial numbeer>"]
#[inverters."R17xxxxxxxxxxxx1"]
[inverters."R170000000000001"]
#node_id = '' # Optional, MQTT replacement for inverters serial number
#suggested_area = '' # Optional, suggested installation area for home-assistant
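The mapping above is what the proxy's serial-number lookup works from: `set_serial_no()` in the AsyncStream hunk further down resolves a 16-digit serial number to a `node_id` and `suggested_area`, falling back to `allow_all`. A minimal sketch of that lookup, using Python 3.11's built-in `tomllib` purely for illustration (the project itself reads its configuration through its own `Config` class, see the `Config.read()` hunk below):

```python
# Illustrative only: resolve an inverter's node_id/suggested_area from a TOML mapping.
# Uses the stdlib tomllib (Python 3.11); the proxy goes through its Config class instead.
import tomllib

SAMPLE = """
[inverters]
allow_all = true

[inverters."R170000000000001"]
node_id = 'inv_1'
suggested_area = 'balcony'
"""

def lookup(serial_no: str, cfg: dict) -> tuple[str, str] | None:
    """Return (node_id, suggested_area), or None if the inverter must be ignored."""
    inverters = cfg["inverters"]
    if serial_no in inverters:
        inv = inverters[serial_no]
        return inv.get("node_id", ""), inv.get("suggested_area", "")
    # unknown serial number: only accepted when allow_all is set
    return ("", "") if inverters.get("allow_all") else None

cfg = tomllib.loads(SAMPLE)
print(lookup("R170000000000001", cfg))   # ('inv_1', 'balcony')
print(lookup("R179999999999999", cfg))   # ('', '') because allow_all = true
```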
app/entrypoint.sh (new file, 26 lines)

@@ -0,0 +1,26 @@
#!/bin/sh
set -e

user="$(id -u)"
echo "######################################################"
echo "# prepare: '$SERVICE_NAME' Version:$VERSION"
echo "# for running with UserID:$UID, GroupID:$GID"
echo "#"

if [ "$user" = '0' ]; then
mkdir -p /home/$SERVICE_NAME/log /home/$SERVICE_NAME/config

if id $SERVICE_NAME ; then
echo "user still exists"
else
addgroup --gid $GID $SERVICE_NAME 2> /dev/null
adduser --ingroup $SERVICE_NAME --shell /bin/false --disabled-password --no-create-home --comment "" --uid $UID $SERVICE_NAME
fi
chown -R $SERVICE_NAME:$SERVICE_NAME /home/$SERVICE_NAME || true
echo "######################################################"
echo "#"

exec gosu $SERVICE_NAME "$@"
else
exec "$@"
fi
@@ -1,2 +1,2 @@
aiomqtt==1.2.0
schema
aiomqtt==1.2.1
schema==0.7.5
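The pin bump above (aiomqtt 1.2.1, schema 0.7.5) matches the 0.0.6 changelog entry. For orientation, a minimal aiomqtt 1.x publish looks roughly like this; broker host and topic are placeholders, and the proxy's real client wrapper is the `Mqtt` class further down:

```python
# Minimal aiomqtt 1.x sketch (not the proxy's own code): connect and publish one value.
import asyncio
import aiomqtt

async def main() -> None:
    # hostname, port and topic are placeholders for a local test broker
    async with aiomqtt.Client("localhost", port=1883) as client:
        await client.publish("tsun/inv_1/grid", payload='{"Output_Power": 123.4}')

asyncio.run(main())
```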
@@ -1,67 +1,52 @@
import logging, traceback, aiomqtt, json
import logging, traceback
from config import Config
#import gc
from messages import Message, hex_dump_memory
from mqtt import Mqtt

logger = logging.getLogger('conn')
logger_mqtt = logging.getLogger('mqtt')

class AsyncStream(Message):

def __init__(self, proxy, reader, writer, addr, stream=None, server_side=True):
def __init__(self, reader, writer, addr, remote_stream, server_side: bool) -> None:
super().__init__()
self.proxy = proxy
self.reader = reader
self.writer = writer
self.remoteStream = stream
self.addr = addr
self.remoteStream = remote_stream
self.server_side = server_side
self.mqtt = Mqtt()
self.addr = addr
self.unique_id = 0
self.node_id = ''

'''
Our puplic methods
'''
async def set_serial_no(self, serial_no : str):
logger_mqtt.info(f'SerialNo: {serial_no}')
def set_serial_no(self, serial_no : str):
logger.info(f'SerialNo: {serial_no}')

if self.unique_id != serial_no:

inverters = Config.get('inverters')
#logger_mqtt.debug(f'Inverters: {inverters}')
#logger.debug(f'Inverters: {inverters}')

if serial_no in inverters:
logger_mqtt.debug(f'SerialNo {serial_no} allowed!')
logger.debug(f'SerialNo {serial_no} allowed!')
inv = inverters[serial_no]
self.node_id = inv['node_id']
sug_area = inv['suggested_area']
self.sug_area = inv['suggested_area']
else:
logger_mqtt.debug(f'SerialNo {serial_no} not known!')
logger.debug(f'SerialNo {serial_no} not known!')
self.node_id = ''
sug_area = ''
self.sug_area = ''
if not inverters['allow_all']:
self.unique_id = None

logger_mqtt.error('ignore message from unknow inverter!')
logger.error('ignore message from unknow inverter!')
return

self.unique_id = serial_no

ha = Config.get('ha')
self.entitiy_prfx = ha['entity_prefix'] + '/'
discovery_prfx = ha['discovery_prefix'] + '/'

if self.server_side:
try:
for data_json, id in self.db.ha_confs(self.entitiy_prfx + self.node_id, self.unique_id, sug_area):
logger_mqtt.debug(f'Register: {data_json}')
await self.mqtt.publish(f"{discovery_prfx}sensor/{self.node_id}{id}/config", data_json)

except Exception:
logging.error(
f"Proxy: Exception:\n"
f"{traceback.format_exc()}")

async def loop(self) -> None:

@@ -71,12 +56,12 @@ class AsyncStream(Message):
await self.__async_read()

if self.id_str:
await self.set_serial_no(self.id_str.decode("utf-8"))
self.set_serial_no(self.id_str.decode("utf-8"))

if self.unique_id:
await self.__async_write()
await self.__async_forward()
await self.__async_publ_mqtt()
await self.async_publ_mqtt()

except (ConnectionResetError,

@@ -91,12 +76,18 @@ class AsyncStream(Message):
f"{traceback.format_exc()}")
self.close()
return

def disc(self) -> None:
logger.debug(f'in AsyncStream.disc() {self.addr}')
self.writer.close()

def close(self):
logger.info(f'in async_stream.close() {self.addr}')
logger.debug(f'in AsyncStream.close() {self.addr}')
self.writer.close()
self.proxy = None
self.remoteStream = None
super().close() # call close handler in the parent class

# logger.info (f'AsyncStream refs: {gc.get_referrers(self)}')

'''

@@ -120,8 +111,7 @@ class AsyncStream(Message):
async def __async_forward(self) -> None:
if self._forward_buffer:
if not self.remoteStream:
tsun = Config.get('tsun')
self.remoteStream = await self.proxy.CreateClientStream (self, tsun['host'], tsun['port'])
await self.async_create_remote() # only implmeneted for server side => syncServerStream

if self.remoteStream:
hex_dump_memory(logging.DEBUG, f'Forward to {self.remoteStream.addr}:', self._forward_buffer, len(self._forward_buffer))

@@ -129,17 +119,14 @@ class AsyncStream(Message):
await self.remoteStream.writer.drain()
self._forward_buffer = bytearray(0)

async def __async_publ_mqtt(self) -> None:
if self.server_side:
db = self.db.db
for key in self.new_data:
if self.new_data[key] and key in db:
data_json = json.dumps(db[key])
logger_mqtt.info(f'{key}: {data_json}')
await self.mqtt.publish(f"{self.entitiy_prfx}{self.node_id}{key}", data_json)
self.new_data[key] = False
async def async_create_remote(self) -> None:
pass

async def async_publ_mqtt(self) -> None:
pass

def __del__ (self):
logger.debug ("AsyncStream __del__")

logging.debug (f"AsyncStream.__del__ {self.addr}")
@@ -60,7 +60,7 @@ class Config():
config['inverters'] = def_config['inverters'] | usr_config['inverters']

cls.config = cls.conf_schema.validate(config)
logging.debug(f'Readed config: "{cls.config}" ')
#logging.debug(f'Readed config: "{cls.config}" ')

except Exception as error:
logger.error(f'Config.read: {error}')
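The `Config.read()` hunk above merges the default and user dictionaries with the `|` operator and then validates the result against `conf_schema`. Below is a rough stand-in for that pattern using the pinned `schema` package; the schema definition is invented for illustration, the project's real `conf_schema` is defined elsewhere in the module:

```python
# Stand-in for the merge-and-validate pattern in Config.read(); the schema below is
# a made-up example, not the project's actual conf_schema.
from schema import Schema, Optional

conf_schema = Schema({
    'tsun': {'host': str, 'port': int},
    Optional('inverters', default={}): dict,
})

def read_config(def_config: dict, usr_config: dict) -> dict:
    config = def_config | usr_config                           # user values override defaults
    config['inverters'] = def_config.get('inverters', {}) | usr_config.get('inverters', {})
    return conf_schema.validate(config)

print(read_config(
    {'tsun': {'host': 'logger.talent-monitoring.com', 'port': 5005}, 'inverters': {}},
    {'inverters': {'R170000000000001': {'node_id': 'inv_1'}}},
))
```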
app/src/infos.py (187 changes)
@@ -1,32 +1,39 @@
import struct, json, logging
import struct, json, logging, os


class Infos:
def __init__(self):
self.db = {}
self.app_name = os.getenv('SERVICE_NAME', 'proxy')
self.version = os.getenv('VERSION', 'unknown')
self.tracer = logging.getLogger('data')

__info_devs={
'controller':{ 'name':'Controller', 'mdl':0x00092f90, 'mf':0x000927c0, 'sw':0x00092ba8},
'inverter': {'via':'controller', 'name':'Micro Inverter', 'mdl':0x00000032, 'mf':0x00000014, 'sw':0x0000001e},
'input_pv1': {'via':'inverter', 'name':'Module PV1'},
'input_pv2': {'via':'inverter', 'name':'Module PV2'},
'input_pv3': {'via':'inverter', 'name':'Module PV3'},
'input_pv4': {'via':'inverter', 'name':'Module PV4'},
}

__info_defs={
# collector values:
# collector values used for device registration:
0x00092ba8: {'name':['collector', 'Collector_Fw_Version'], 'level': logging.INFO, 'unit': ''},
0x000927c0: {'name':['collector', 'Chip_Type'], 'level': logging.DEBUG, 'unit': ''},
0x00092f90: {'name':['collector', 'Chip_Model'], 'level': logging.DEBUG, 'unit': ''},
0x00095a88: {'name':['collector', 'Trace_URL'], 'level': logging.DEBUG, 'unit': ''},
0x00095aec: {'name':['collector', 'Logger_URL'], 'level': logging.DEBUG, 'unit': ''},
0x000cf850: {'name':['collector', 'Data_Up_Interval'], 'level': logging.DEBUG, 'unit': 's'},
0x000005dc: {'name':['collector', 'Rated_Power'], 'level': logging.DEBUG, 'unit': 'W'},
# inverter values:

# inverter values used for device registration:
0x0000000a: {'name':['inverter', 'Product_Name'], 'level': logging.DEBUG, 'unit': ''},
0x00000014: {'name':['inverter', 'Manufacturer'], 'level': logging.DEBUG, 'unit': ''},
0x0000001e: {'name':['inverter', 'Version'], 'level': logging.INFO, 'unit': ''},
0x00000028: {'name':['inverter', 'Serial_Number'], 'level': logging.DEBUG, 'unit': ''},
0x00000032: {'name':['inverter', 'Equipment_Model'], 'level': logging.DEBUG, 'unit': ''},
# env:
0x00000514: {'name':['env', 'Inverter_Temp'], 'level': logging.DEBUG, 'unit': '°C', 'ha':{'dev_cla': 'temperature', 'stat_cla': 'measurement', 'id':'temp_', 'fmt':'| float','name': 'Inverter Temperature'}},
0x000c3500: {'name':['env', 'Signal_Strength'], 'level': logging.DEBUG, 'unit': '%' , 'ha':{'dev_cla': None, 'stat_cla': 'measurement', 'id':'signal_', 'fmt':'| float','name': 'Signal Strength', 'icon':'mdi:wifi'}},

# events:

# events
0x00000191: {'name':['events', '401_'], 'level': logging.DEBUG, 'unit': ''},
0x00000192: {'name':['events', '402_'], 'level': logging.DEBUG, 'unit': ''},
0x00000193: {'name':['events', '403_'], 'level': logging.DEBUG, 'unit': ''},
@@ -43,83 +50,142 @@ class Infos:
|
||||
0x0000019e: {'name':['events', '414_'], 'level': logging.DEBUG, 'unit': ''},
|
||||
0x0000019f: {'name':['events', '415_GridFreqOverRating'], 'level': logging.DEBUG, 'unit': ''},
|
||||
0x000001a0: {'name':['events', '416_'], 'level': logging.DEBUG, 'unit': ''},
|
||||
|
||||
# grid measures:
|
||||
0x000003e8: {'name':['grid', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V', 'ha':{'dev_cla': 'voltage', 'stat_cla': 'measurement', 'id':'out_volt_', 'fmt':'| float','name': 'Grid Voltage'}},
|
||||
0x0000044c: {'name':['grid', 'Current'], 'level': logging.DEBUG, 'unit': 'A', 'ha':{'dev_cla': 'current', 'stat_cla': 'measurement', 'id':'out_cur_', 'fmt':'| float','name': 'Grid Current'}},
|
||||
0x000004b0: {'name':['grid', 'Frequency'], 'level': logging.DEBUG, 'unit': 'Hz', 'ha':{'dev_cla': 'frequency', 'stat_cla': 'measurement', 'id':'out_freq_', 'fmt':'| float','name': 'Grid Frequency'}},
|
||||
0x00000640: {'name':['grid', 'Output_Power'], 'level': logging.INFO, 'unit': 'W', 'ha':{'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'out_power_', 'fmt':'| float','name': 'Actual Power'}},
|
||||
0x000003e8: {'name':['grid', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V', 'ha':{'dev':'inverter', 'dev_cla': 'voltage', 'stat_cla': 'measurement', 'id':'out_volt_', 'fmt':'| float','name': 'Grid Voltage'}},
|
||||
0x0000044c: {'name':['grid', 'Current'], 'level': logging.DEBUG, 'unit': 'A', 'ha':{'dev':'inverter', 'dev_cla': 'current', 'stat_cla': 'measurement', 'id':'out_cur_', 'fmt':'| float','name': 'Grid Current'}},
|
||||
0x000004b0: {'name':['grid', 'Frequency'], 'level': logging.DEBUG, 'unit': 'Hz', 'ha':{'dev':'inverter', 'dev_cla': 'frequency', 'stat_cla': 'measurement', 'id':'out_freq_', 'fmt':'| float','name': 'Grid Frequency'}},
|
||||
0x00000640: {'name':['grid', 'Output_Power'], 'level': logging.INFO, 'unit': 'W', 'ha':{'dev':'inverter', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'out_power_', 'fmt':'| float','name': 'Power'}},
|
||||
0x000005dc: {'name':['env', 'Rated_Power'], 'level': logging.DEBUG, 'unit': 'W', 'ha':{'dev':'inverter', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'rated_power_','fmt':'| int', 'name': 'Rated Power'}},
|
||||
0x00000514: {'name':['env', 'Inverter_Temp'], 'level': logging.DEBUG, 'unit': '°C', 'ha':{'dev':'inverter', 'dev_cla': 'temperature', 'stat_cla': 'measurement', 'id':'temp_', 'fmt':'| int','name': 'Temperature'}},
|
||||
|
||||
# input measures:
|
||||
0x000006a4: {'name':['input', 'pv1', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V'},
|
||||
0x00000708: {'name':['input', 'pv1', 'Current'], 'level': logging.DEBUG, 'unit': 'A'},
|
||||
0x0000076c: {'name':['input', 'pv1', 'Power'], 'level': logging.INFO, 'unit': 'W', 'ha':{'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'power_pv1_','name': 'Power PV1', 'val_tpl' :"{{ (value_json['pv1']['Power'] | float)}}"}},
|
||||
0x000007d0: {'name':['input', 'pv2', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V'},
|
||||
0x00000834: {'name':['input', 'pv2', 'Current'], 'level': logging.DEBUG, 'unit': 'A'},
|
||||
0x00000898: {'name':['input', 'pv2', 'Power'], 'level': logging.INFO, 'unit': 'W', 'ha':{'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'power_pv2_','name': 'Power PV2', 'val_tpl' :"{{ (value_json['pv2']['Power'] | float)}}"}},
|
||||
0x000008fc: {'name':['input', 'pv3', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V'},
|
||||
0x00000960: {'name':['input', 'pv3', 'Curent'], 'level': logging.DEBUG, 'unit': 'A'},
|
||||
0x000009c4: {'name':['input', 'pv3', 'Power'], 'level': logging.DEBUG, 'unit': 'W', 'ha':{'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'power_pv3_','name': 'Power PV3', 'val_tpl' :"{{ (value_json['pv3']['Power'] | float)}}"}},
|
||||
0x00000a28: {'name':['input', 'pv4', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V'},
|
||||
0x00000a8c: {'name':['input', 'pv4', 'Current'], 'level': logging.DEBUG, 'unit': 'A'},
|
||||
0x00000af0: {'name':['input', 'pv4', 'Power'], 'level': logging.DEBUG, 'unit': 'W', 'ha':{'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'power_pv4_','name': 'Power PV4', 'val_tpl' :"{{ (value_json['pv4']['Power'] | float)}}"}},
|
||||
0x00000c1c: {'name':['input', 'pv1', 'Daily_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_pv1_','name': 'Daily Generation PV1', 'val_tpl' :"{{ (value_json['pv1']['Daily_Generation'] | float)}}"}},
|
||||
0x00000c80: {'name':['input', 'pv1', 'Total_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_pv1_','name': 'Total Generation PV1', 'val_tpl' :"{{ (value_json['pv1']['Total_Generation'] | float)}}"}},
|
||||
0x00000ce4: {'name':['input', 'pv2', 'Daily_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_pv2_','name': 'Daily Generation PV2', 'val_tpl' :"{{ (value_json['pv2']['Daily_Generation'] | float)}}"}},
|
||||
0x00000d48: {'name':['input', 'pv2', 'Total_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_pv2_','name': 'Total Generation PV2', 'val_tpl' :"{{ (value_json['pv2']['Total_Generation'] | float)}}"}},
|
||||
0x00000dac: {'name':['input', 'pv3', 'Daily_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_pv3_','name': 'Daily Generation PV3', 'val_tpl' :"{{ (value_json['pv3']['Daily_Generation'] | float)}}"}},
|
||||
0x00000e10: {'name':['input', 'pv3', 'Total_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_pv3_','name': 'Total Generation PV3', 'val_tpl' :"{{ (value_json['pv3']['Total_Generation'] | float)}}"}},
|
||||
0x00000e74: {'name':['input', 'pv4', 'Daily_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_pv4_','name': 'Daily Generation PV4', 'val_tpl' :"{{ (value_json['pv4']['Daily_Generation'] | float)}}"}},
|
||||
0x00000ed8: {'name':['input', 'pv4', 'Total_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_pv4_','name': 'Total Generation PV4', 'val_tpl' :"{{ (value_json['pv4']['Total_Generation'] | float)}}"}},
|
||||
0x000006a4: {'name':['input', 'pv1', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V', 'ha':{'dev':'input_pv1', 'dev_cla': 'voltage', 'stat_cla': 'measurement', 'id':'volt_pv1_', 'name': 'Voltage', 'val_tpl' :"{{ (value_json['pv1']['Voltage'] | float)}}", 'unvisible':1, 'icon':'mdi:gauge'}},
|
||||
0x00000708: {'name':['input', 'pv1', 'Current'], 'level': logging.DEBUG, 'unit': 'A', 'ha':{'dev':'input_pv1', 'dev_cla': 'current', 'stat_cla': 'measurement', 'id':'cur_pv1_', 'name': 'Current', 'val_tpl' :"{{ (value_json['pv1']['Current'] | float)}}", 'unvisible':1, 'icon':'mdi:gauge'}},
|
||||
0x0000076c: {'name':['input', 'pv1', 'Power'], 'level': logging.INFO, 'unit': 'W', 'ha':{'dev':'input_pv1', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'power_pv1_','name': 'Power', 'val_tpl' :"{{ (value_json['pv1']['Power'] | float)}}", 'icon':'mdi:gauge'}},
|
||||
0x000007d0: {'name':['input', 'pv2', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V', 'ha':{'dev':'input_pv2', 'dev_cla': 'voltage', 'stat_cla': 'measurement', 'id':'volt_pv2_', 'name': 'Voltage', 'val_tpl' :"{{ (value_json['pv2']['Voltage'] | float)}}", 'unvisible':1, 'icon':'mdi:gauge'}},
|
||||
0x00000834: {'name':['input', 'pv2', 'Current'], 'level': logging.DEBUG, 'unit': 'A', 'ha':{'dev':'input_pv2', 'dev_cla': 'current', 'stat_cla': 'measurement', 'id':'cur_pv2_', 'name': 'Current', 'val_tpl' :"{{ (value_json['pv2']['Current'] | float)}}", 'unvisible':1, 'icon':'mdi:gauge'}},
|
||||
0x00000898: {'name':['input', 'pv2', 'Power'], 'level': logging.INFO, 'unit': 'W', 'ha':{'dev':'input_pv2', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'power_pv2_','name': 'Power', 'val_tpl' :"{{ (value_json['pv2']['Power'] | float)}}", 'icon':'mdi:gauge'}},
|
||||
0x000008fc: {'name':['input', 'pv3', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V', 'ha':{'dev':'input_pv3', 'dev_cla': 'voltage', 'stat_cla': 'measurement', 'id':'volt_pv3_', 'name': 'Voltage', 'val_tpl' :"{{ (value_json['pv3']['Voltage'] | float)}}", 'unvisible':1, 'icon':'mdi:gauge'}},
|
||||
0x00000960: {'name':['input', 'pv3', 'Current'], 'level': logging.DEBUG, 'unit': 'A', 'ha':{'dev':'input_pv3', 'dev_cla': 'current', 'stat_cla': 'measurement', 'id':'cur_pv3_', 'name': 'Current', 'val_tpl' :"{{ (value_json['pv3']['Current'] | float)}}", 'unvisible':1, 'icon':'mdi:gauge'}},
|
||||
0x000009c4: {'name':['input', 'pv3', 'Power'], 'level': logging.DEBUG, 'unit': 'W', 'ha':{'dev':'input_pv3', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'power_pv3_','name': 'Power', 'val_tpl' :"{{ (value_json['pv3']['Power'] | float)}}", 'icon':'mdi:gauge'}},
|
||||
0x00000a28: {'name':['input', 'pv4', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V', 'ha':{'dev':'input_pv4', 'dev_cla': 'voltage', 'stat_cla': 'measurement', 'id':'volt_pv4_', 'name': 'Voltage', 'val_tpl' :"{{ (value_json['pv4']['Voltage'] | float)}}", 'unvisible':1, 'icon':'mdi:gauge'}},
|
||||
0x00000a8c: {'name':['input', 'pv4', 'Current'], 'level': logging.DEBUG, 'unit': 'A', 'ha':{'dev':'input_pv4', 'dev_cla': 'current', 'stat_cla': 'measurement', 'id':'cur_pv4_', 'name': 'Current', 'val_tpl' :"{{ (value_json['pv4']['Current'] | float)}}", 'unvisible':1, 'icon':'mdi:gauge'}},
|
||||
0x00000af0: {'name':['input', 'pv4', 'Power'], 'level': logging.DEBUG, 'unit': 'W', 'ha':{'dev':'input_pv4', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'power_pv4_','name': 'Power', 'val_tpl' :"{{ (value_json['pv4']['Power'] | float)}}", 'icon':'mdi:gauge'}},
|
||||
0x00000c1c: {'name':['input', 'pv1', 'Daily_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev':'input_pv1', 'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_pv1_','name': 'Daily Generation', 'val_tpl' :"{{ (value_json['pv1']['Daily_Generation'] | float)}}", 'icon':'mdi:solar-power-variant'}},
|
||||
0x00000c80: {'name':['input', 'pv1', 'Total_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev':'input_pv1', 'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_pv1_','name': 'Total Generation', 'val_tpl' :"{{ (value_json['pv1']['Total_Generation'] | float)}}", 'icon':'mdi:solar-power'}},
|
||||
0x00000ce4: {'name':['input', 'pv2', 'Daily_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev':'input_pv2', 'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_pv2_','name': 'Daily Generation', 'val_tpl' :"{{ (value_json['pv2']['Daily_Generation'] | float)}}", 'icon':'mdi:solar-power-variant'}},
|
||||
0x00000d48: {'name':['input', 'pv2', 'Total_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev':'input_pv2', 'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_pv2_','name': 'Total Generation', 'val_tpl' :"{{ (value_json['pv2']['Total_Generation'] | float)}}", 'icon':'mdi:solar-power'}},
|
||||
0x00000dac: {'name':['input', 'pv3', 'Daily_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev':'input_pv3', 'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_pv3_','name': 'Daily Generation', 'val_tpl' :"{{ (value_json['pv3']['Daily_Generation'] | float)}}", 'icon':'mdi:solar-power-variant'}},
|
||||
0x00000e10: {'name':['input', 'pv3', 'Total_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev':'input_pv3', 'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_pv3_','name': 'Total Generation', 'val_tpl' :"{{ (value_json['pv3']['Total_Generation'] | float)}}", 'icon':'mdi:solar-power'}},
|
||||
0x00000e74: {'name':['input', 'pv4', 'Daily_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev':'input_pv4', 'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_pv4_','name': 'Daily Generation', 'val_tpl' :"{{ (value_json['pv4']['Daily_Generation'] | float)}}", 'icon':'mdi:solar-power-variant'}},
|
||||
0x00000ed8: {'name':['input', 'pv4', 'Total_Generation'], 'level': logging.DEBUG, 'unit': 'kWh', 'ha':{'dev':'input_pv4', 'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_pv4_','name': 'Total Generation', 'val_tpl' :"{{ (value_json['pv4']['Total_Generation'] | float)}}", 'icon':'mdi:solar-power'}},
|
||||
# total:
|
||||
0x00000b54: {'name':['total', 'Daily_Generation'], 'level': logging.INFO, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_', 'fmt':'| float','name': 'Daily Generation'}},
|
||||
0x00000bb8: {'name':['total', 'Total_Generation'], 'level': logging.INFO, 'unit': 'kWh', 'ha':{'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_', 'fmt':'| float','name': 'Total Generation', 'icon':'mdi:solar-power'}},
|
||||
0x000c96a8: {'name':['total', 'Power_On_Time'], 'level': logging.DEBUG, 'unit': 's', 'ha':{'dev_cla': 'duration', 'stat_cla': 'measurement', 'id':'power_on_time_', 'name': 'Power on Time', 'val_tpl':"{{ (value_json['Power_On_Time'] | float)}}", 'nat_prc':'3'}},
|
||||
0x00000b54: {'name':['total', 'Daily_Generation'], 'level': logging.INFO, 'unit': 'kWh', 'ha':{'dev':'inverter', 'dev_cla': 'energy', 'stat_cla': 'total_increasing', 'id':'daily_gen_', 'fmt':'| float','name': 'Daily Generation', 'icon':'mdi:solar-power-variant'}},
|
||||
0x00000bb8: {'name':['total', 'Total_Generation'], 'level': logging.INFO, 'unit': 'kWh', 'ha':{'dev':'inverter', 'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_', 'fmt':'| float','name': 'Total Generation', 'icon':'mdi:solar-power'}},
|
||||
|
||||
# controller:
|
||||
0x000c3500: {'name':['controller', 'Signal_Strength'], 'level': logging.DEBUG, 'unit': '%' , 'ha':{'dev':'controller', 'dev_cla': None, 'stat_cla': 'measurement', 'id':'signal_', 'fmt':'| int', 'name': 'Signal Strength', 'icon':'mdi:wifi'}},
|
||||
0x000c96a8: {'name':['controller', 'Power_On_Time'], 'level': logging.DEBUG, 'unit': 's', 'ha':{'dev':'controller', 'dev_cla': 'duration', 'stat_cla': 'measurement', 'id':'power_on_time_', 'name': 'Power on Time', 'val_tpl':"{{ (value_json['Power_On_Time'] | float)}}", 'nat_prc':'3'}},
|
||||
0x000cf850: {'name':['controller', 'Data_Up_Interval'], 'level': logging.DEBUG, 'unit': 's', 'ha':{'dev':'controller', 'dev_cla': None, 'stat_cla': 'measurement', 'id':'data_up_intval_', 'fmt':'| int', 'name': 'Data Up Interval', 'icon':'mdi:update'}},
|
||||
|
||||
}
|
||||
|
||||
|
||||
def dev_value(self, idx:str|int) -> str|int|float|None:
|
||||
'''returns the stored device value from our database
|
||||
|
||||
idx:int ==> lookup the value in the database and return it as str, int or flout. If the value is not available return 'None'
|
||||
idx:str ==> returns the string as a fixed value without a database loopup
|
||||
'''
|
||||
if type (idx) is str:
|
||||
return idx # return idx as a fixed value
|
||||
elif idx in self.__info_defs:
|
||||
dict = self.db
|
||||
row = self.__info_defs[idx]
|
||||
keys = row['name']
|
||||
|
||||
for key in keys:
|
||||
if key not in dict:
|
||||
return None # value not found in the database
|
||||
dict = dict[key]
|
||||
return dict # value of the reqeusted entry
|
||||
|
||||
return None # unknwon idx, not in __info_defs
|
||||
|
||||
|
||||
def ha_confs(self, prfx="tsun/garagendach/", snr='123', sug_area =''):
|
||||
'''Generator function yields a json register struct for home-assistant auto configuration and a unique entity string
|
||||
|
||||
arguments:
|
||||
prfx:str ==> MQTT prefix for the home assistant 'stat_t string
|
||||
snr:str ==> serial number of the inverter, used to build unique entity strings
|
||||
sug_area:str ==> suggested area string from the config file'''
|
||||
tab = self.__info_defs
|
||||
for key in tab:
|
||||
row = tab[key]
|
||||
|
||||
#check if we have details for home assistant
|
||||
if 'ha' in row:
|
||||
ha = row['ha']
|
||||
attr = {}
|
||||
if 'comp' in ha:
|
||||
component = ha['comp']
|
||||
else:
|
||||
component = 'sensor'
|
||||
attr = {} # dict to collect all the sensor entity details
|
||||
if 'name' in ha:
|
||||
attr['name'] = ha['name'] # eg. 'name': "Actual Power"
|
||||
attr['name'] = ha['name'] # take the entity name from the ha dict
|
||||
else:
|
||||
attr['name'] = row['name'][-1] # eg. 'name': "Actual Power"
|
||||
attr['name'] = row['name'][-1] # otherwise take a name from the name array
|
||||
|
||||
attr['stat_t'] = prfx +row['name'][0] # eg. 'stat_t': "tsun/garagendach/grid"
|
||||
attr['dev_cla'] = ha['dev_cla'] # eg. 'dev_cla': 'power'
|
||||
attr['stat_cla'] = ha['stat_cla'] # eg. 'stat_cla': "measurement"
|
||||
attr['uniq_id'] = ha['id']+snr # eg. 'uniq_id':'out_power_123'
|
||||
attr['uniq_id'] = ha['id']+snr # build the 'uniq_id' from the id str + the serial no of the inverter
|
||||
if 'val_tpl' in ha:
|
||||
attr['val_tpl'] = ha['val_tpl'] # eg. 'val_tpl': "{{ value_json['Output_Power']|float }}"
|
||||
attr['val_tpl'] = ha['val_tpl'] # get value template for complexe data structures
|
||||
elif 'fmt' in ha:
|
||||
attr['val_tpl'] = '{{value_json' + f"['{row['name'][-1]}'] {ha['fmt']}" + '}}' # eg. 'val_tpl': "{{ value_json['Output_Power']|float }}"
|
||||
|
||||
if 'unit' in row:
|
||||
attr['unit_of_meas'] = row['unit'] # eg. 'unit_of_meas': 'W'
|
||||
attr['unit_of_meas'] = row['unit'] # optional add a 'unit_of_meas' e.g. 'W'
|
||||
if 'icon' in ha:
|
||||
attr['icon'] = ha['icon'] # eg. 'icon':'mdi:solar-power'
|
||||
attr['icon'] = ha['icon'] # optional add an icon for the entity
|
||||
if 'nat_prc' in ha:
|
||||
attr['suggested_display_precision'] = ha['nat_prc']
|
||||
attr['sug_dsp_prc'] = ha['nat_prc'] # optional add the precison of floats
|
||||
|
||||
# eg. 'dev':{'name':'Microinverter','mdl':'MS-600','ids':["inverter_123"],'mf':'TSUN','sa': 'auf Garagendach'}
|
||||
# attr['dev'] = {'name':'Microinverter','mdl':'MS-600','ids':[f'inverter_{snr}'],'mf':'TSUN','sa': 'auf Garagendach'}
|
||||
dev = {}
|
||||
dev['name'] = 'Microinverter' #fixme
|
||||
dev['mdl'] = 'MS-600' #fixme
|
||||
dev['ids'] = [f'inverter_{snr}']
|
||||
dev['mf'] = 'TSUN' #fixme
|
||||
dev['sa'] = sug_area
|
||||
dev['sw'] = '0.01' #fixme
|
||||
dev['hw'] = 'Hw0.01' #fixme
|
||||
#dev['via_device'] = #fixme
|
||||
attr['dev'] = dev
|
||||
|
||||
if 'dev' in ha:
|
||||
device = self.__info_devs[ha['dev']]
|
||||
dev = {}
|
||||
|
||||
yield json.dumps (attr), attr['uniq_id']
|
||||
# the same name fpr 'name' and 'suggested area', so we get dedicated devices in home assistant with short value name and headline
|
||||
if 'name' in device:
|
||||
dev['name'] = device['name']
|
||||
dev['sa'] = device['name']
|
||||
# fixme: we ignore the suggested area, since one area make no sense for multiple devices
|
||||
#else:
|
||||
# dev['name'] = sug_area
|
||||
# dev['sa'] = sug_area
|
||||
|
||||
if 'via' in device: # add the link to the parent device
|
||||
dev['via_device'] = f"{device['via']}_{snr}"
|
||||
|
||||
for key in ('mdl','mf', 'sw', 'hw'): # add optional values fpr 'modell', 'manufaturer', 'sw version' and 'hw version'
|
||||
if key in device:
|
||||
data = self.dev_value(device[key])
|
||||
if data is not None: dev[key] = data
|
||||
|
||||
dev['ids'] = [f"{ha['dev']}_{snr}"]
|
||||
attr['dev'] = dev
|
||||
|
||||
origin = {}
|
||||
origin['name'] = self.app_name
|
||||
origin['sw'] = self.version
|
||||
attr['o'] = origin
|
||||
|
||||
|
||||
yield json.dumps (attr), component, attr['uniq_id']
|
||||
|
||||
|
||||
|
||||
@@ -131,7 +197,10 @@ class Infos:
return d['name'], d['level'], d['unit']


def parse(self, buf):
def parse(self, buf) -> None:
'''parse a data sequence received from the inverter and stores the values in Infos.db

buf: buffer of the sequence to parse'''
result = struct.unpack_from('!l', buf, 0)
elms = result[0]
i = 0
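The reworked `ha_confs()` above now yields a third value, the MQTT discovery component, and attaches device (`dev`) and origin (`o`) blocks. Its consumer is `Inverter.__register_home_assistant()` in the new app/src/inverter.py below; as a standalone sketch (the topic prefixes and the `publish` stub are placeholders, and `infos.py` must be importable, e.g. from app/src):

```python
# Sketch of consuming the ha_confs() generator for Home Assistant MQTT discovery.
# publish() is a stub; the proxy uses its Mqtt singleton instead.
import asyncio
from infos import Infos

async def publish(topic: str, payload: str) -> None:
    print(topic, payload[:80])

async def register(snr: str = '123') -> None:
    db = Infos()
    for data_json, component, uniq_id in db.ha_confs('tsun/inv_1/', snr, sug_area=''):
        # e.g. homeassistant/sensor/inv_1/out_power_123/config
        await publish(f'homeassistant/{component}/inv_1/{uniq_id}/config', data_json)

asyncio.run(register())
```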
app/src/inverter.py (new file, 103 lines)
@@ -0,0 +1,103 @@
import asyncio, logging, traceback, json
from config import Config
from async_stream import AsyncStream
from mqtt import Mqtt
#import gc

logger = logging.getLogger('conn')


class Inverter(AsyncStream):

def __init__ (self, reader, writer, addr):
super().__init__(reader, writer, addr, None, True)
self.mqtt = Mqtt()
self.ha_restarts = 0
ha = Config.get('ha')
self.entitiy_prfx = ha['entity_prefix'] + '/'
self.discovery_prfx = ha['discovery_prefix'] + '/'

async def server_loop(self, addr):
'''Loop for receiving messages from the inverter (server-side)'''
logger.info(f'Accept connection from {addr}')
await self.loop()
logging.info(f'Server loop stopped for {addr}')

# if the server connection closes, we also have to disconnect the connection to te TSUN cloud
if self.remoteStream:
logging.debug ("disconnect client connection")
self.remoteStream.disc()

async def client_loop(self, addr):
'''Loop for receiving messages from the TSUN cloud (client-side)'''
await self.remoteStream.loop()
logging.info(f'Client loop stopped for {addr}')

# if the client connection closes, we don't touch the server connection. Instead we erase the client
# connection stream, thus on the next received packet from the inverter, we can establish a new connection
# to the TSUN cloud
self.remoteStream.remoteStream = None # erase backlink to inverter instance
self.remoteStream = None # than erase client connection

async def async_create_remote(self) -> None:
'''Establish a client connection to the TSUN cloud'''
tsun = Config.get('tsun')
host = tsun['host']
port = tsun['port']
addr = (host, port)

try:
logging.info(f'Connected to {addr}')
connect = asyncio.open_connection(host, port)
reader, writer = await connect
self.remoteStream = AsyncStream(reader, writer, addr, self, False)
asyncio.create_task(self.client_loop(addr))

except ConnectionRefusedError as error:
logging.info(f'{error}')
except Exception:
logging.error(
f"Inverter: Exception for {addr}:\n"
f"{traceback.format_exc()}")

async def async_publ_mqtt(self) -> None:
'''puplish data to MQTT broker'''
db = self.db.db
# check if new inverter or collector infos are available or when the home assistant has changed the status back to online
if (('inverter' in self.new_data and self.new_data['inverter']) or
('collector' in self.new_data and self.new_data['collector']) or
self.mqtt.ha_restarts != self.ha_restarts):
await self.__register_home_assistant()
self.ha_restarts = self.mqtt.ha_restarts

for key in self.new_data:
if self.new_data[key] and key in db:
data_json = json.dumps(db[key])
logger.info(f'{key}: {data_json}')
await self.mqtt.publish(f"{self.entitiy_prfx}{self.node_id}{key}", data_json)
self.new_data[key] = False

async def __register_home_assistant(self) -> None:
'''register all our topics at home assistant'''
try:
for data_json, component, id in self.db.ha_confs(self.entitiy_prfx + self.node_id, self.unique_id, self.sug_area):
#logger.debug(f'MQTT Register: {data_json}')
await self.mqtt.publish(f"{self.discovery_prfx}{component}/{self.node_id}{id}/config", data_json)
except Exception:
logging.error(
f"Inverter: Exception:\n"
f"{traceback.format_exc()}")

def close(self) -> None:
logging.debug(f'Inverter.close() {self.addr}')
super().close() # call close handler in the parent class
# logger.debug (f'Inverter refs: {gc.get_referrers(self)}')

def __del__ (self):
logging.debug ("Inverter.__del__")
super().__del__()
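Stripped of the protocol handling, MQTT publishing and reconnection details, the connection topology that `server_loop()` / `client_loop()` implement is a plain asyncio relay: accept the inverter's TCP connection, open the upstream connection lazily and forward the buffered bytes. A deliberately simplified sketch follows (host, port and buffer size are placeholders, and none of the Message parsing is shown):

```python
# Heavily simplified relay sketch; the real proxy parses and republishes the data
# via the Message/Infos/Mqtt classes before forwarding.
import asyncio

async def handle_inverter(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    remote_writer = None
    try:
        while data := await reader.read(4096):
            if remote_writer is None:   # connect lazily, like async_create_remote()
                _, remote_writer = await asyncio.open_connection('logger.talent-monitoring.com', 5005)
            remote_writer.write(data)
            await remote_writer.drain()
    finally:
        writer.close()
        if remote_writer is not None:
            remote_writer.close()

async def main() -> None:
    server = await asyncio.start_server(handle_inverter, '0.0.0.0', 5005)
    async with server:
        await server.serve_forever()

if __name__ == '__main__':
    asyncio.run(main())
```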
@@ -25,13 +25,13 @@ qualname=conn

[logger_data]
level=DEBUG
handlers=console_handler,file_handler_name1,file_handler_name2
handlers=file_handler_name1,file_handler_name2
propagate=0
qualname=data

[logger_mqtt]
level=DEBUG
handlers=console_handler,file_handler_name1,file_handler_name2
level=INFO
handlers=console_handler,file_handler_name1
propagate=0
qualname=mqtt

@@ -43,12 +43,12 @@ qualname=tracer

[handler_console_handler]
class=StreamHandler
level=INFO
level=DEBUG
formatter=console_formatter

[handler_file_handler_name1]
class=handlers.TimedRotatingFileHandler
level=NOTSET
level=INFO
formatter=file_formatter
args=('log/proxy.log', when:='midnight')
@@ -100,6 +100,13 @@ class Message(metaclass=IterRegistry):
'''
Our puplic methods
'''
def close(self) -> None:
# we have refernces to methods of this class in self.switch
# so we have to erase self.switch, otherwise this instance can't be
# deallocated by the garbage collector ==> we get a memory leak
del self.switch

def read(self) -> None:
self._read()

@@ -186,7 +193,7 @@ class Message(metaclass=IterRegistry):
self.send_msg_ofs = len (self._send_buffer)
self._send_buffer += struct.pack(f'!l{len(self.id_str)+1}pBB', 0, self.id_str, ctrl, self.msg_id)
fnc = self.switch.get(self.msg_id, self.msg_unknown)
logger.info(self.__flow_str(self.server_side, 'tx') + f' Ctl: {int(self.ctrl):#02x} Msg: {fnc.__name__!r}' )
logger.info(self.__flow_str(self.server_side, 'tx') + f' Ctl: {int(ctrl):#02x} Msg: {fnc.__name__!r}' )

def __finish_send_msg(self) -> None:
_len = len(self._send_buffer) - self.send_msg_ofs

@@ -287,11 +294,9 @@ class Message(metaclass=IterRegistry):

def msg_unknown(self):
logger.error (f"Unknow Msg: ID:{self.msg_id}")
self.forward(self._recv_buffer, self.header_len+self.data_len)

def __del__ (self):
logger.debug ("Messages __del__")
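The new `close()` above exists purely for memory management: `self.switch` maps message ids to bound methods, and a bound method holds a reference back to the instance, so the object can only be reclaimed by the cyclic garbage collector unless that mapping is dropped. A small self-contained demonstration of the effect (class and method names are invented):

```python
# Why deleting the handler table matters: bound methods stored on the instance
# create a reference cycle, so plain reference counting cannot free the object.
import weakref

class Handler:
    def __init__(self) -> None:
        self.switch = {0x22: self.msg_response}   # bound method -> cycle back to self

    def msg_response(self) -> None:
        pass

    def close(self) -> None:
        del self.switch                           # break the cycle explicitly

h = Handler()
ref = weakref.ref(h)
h.close()
del h
print(ref() is None)   # True: freed immediately, without waiting for gc.collect()
```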
@@ -16,13 +16,22 @@ class Singleton(type):

class Mqtt(metaclass=Singleton):
client = None

def __init__(self):
logger_mqtt.debug(f'MQTT: __init__')
loop = asyncio.get_event_loop()
self.task = loop.create_task(self.__loop())

self.ha_restarts = 0

@property
def ha_restarts(self):
return self._ha_restarts

@ha_restarts.setter
def ha_restarts(self, value):
self._ha_restarts = value

def __del__(self):
logger_mqtt.debug(f'MQTT: __del__')

@@ -55,7 +64,11 @@ class Mqtt(metaclass=Singleton):
async with self.client.messages() as messages:
await self.client.subscribe(f"{ha['auto_conf_prefix']}/status")
async for message in messages:
logger_mqtt.info(f'Home-Assistant Status: {message.payload.decode("UTF-8")}')
status = message.payload.decode("UTF-8")
logger_mqtt.info(f'Home-Assistant Status: {status}')
if status == 'online':
self.ha_restarts += 1

except aiomqtt.MqttError:
logger_mqtt.info(f"Connection lost; Reconnecting in {interval} seconds ...")
await asyncio.sleep(interval)
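The `ha_restarts` counter added above is how a Home Assistant restart propagates: the shared `Mqtt` singleton bumps the counter when the `online` birth message arrives, and each `Inverter` compares its own copy against it to decide whether the discovery topics must be re-published (see `async_publ_mqtt()` above). Below is a reduced stand-in for that interplay; the Singleton metaclass shown here is a generic implementation, not the project's own (which is only referenced in the hunk header):

```python
# Reduced model of the Mqtt singleton + ha_restarts handshake; not the project's code.
class Singleton(type):
    _instances: dict = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]

class Mqtt(metaclass=Singleton):
    def __init__(self) -> None:
        self._ha_restarts = 0

    @property
    def ha_restarts(self) -> int:
        return self._ha_restarts

    def on_ha_status(self, status: str) -> None:
        if status == 'online':          # Home Assistant birth message
            self._ha_restarts += 1

mqtt = Mqtt()
seen = mqtt.ha_restarts                 # what Inverter keeps in self.ha_restarts
mqtt.on_ha_status('online')             # Home Assistant came back
print(Mqtt() is mqtt, mqtt.ha_restarts != seen)   # True True -> force re-registration
```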
@@ -1,43 +0,0 @@
import asyncio, logging, traceback
from async_stream import AsyncStream

class Proxy:
def __init__ (proxy, reader, writer, addr):
proxy.ServerStream = AsyncStream(proxy, reader, writer, addr)
proxy.ClientStream = None

async def server_loop(proxy, addr):
logging.info(f'Accept connection from {addr}')
await proxy.ServerStream.loop()
logging.info(f'Close server connection {addr}')

if proxy.ClientStream:
logging.debug ("close client connection")
proxy.ClientStream.close()

async def client_loop(proxy, addr):
await proxy.ClientStream.loop()
logging.info(f'Close client connection {addr}')
proxy.ServerStream.remoteStream = None
proxy.ClientStream = None

async def CreateClientStream (proxy, stream, host, port):
addr = (host, port)

try:
logging.info(f'Connected to {addr}')
connect = asyncio.open_connection(host, port)
reader, writer = await connect
proxy.ClientStream = AsyncStream(proxy, reader, writer, addr, stream, server_side=False)
asyncio.create_task(proxy.client_loop(addr))

except ConnectionRefusedError as error:
logging.info(f'{error}')
except Exception:
logging.error(
f"Proxy: Exception for {addr}:\n"
f"{traceback.format_exc()}")
return proxy.ClientStream

def __del__ (proxy):
logging.debug ("Proxy __del__")
@@ -1,8 +1,7 @@
import logging, asyncio, signal, functools, os
#from logging.handlers import TimedRotatingFileHandler
from logging import config
from async_stream import AsyncStream
from proxy import Proxy
from inverter import Inverter
from config import Config
from mqtt import Mqtt

@@ -11,7 +10,7 @@ async def handle_client(reader, writer):
'''Handles a new incoming connection and starts an async loop'''

addr = writer.get_extra_info('peername')
await Proxy(reader, writer, addr).server_loop(addr)
await Inverter(reader, writer, addr).server_loop(addr)

def handle_SIGTERM(loop):

@@ -42,9 +41,11 @@ if __name__ == "__main__":
# Setup our daily, rotating logger
#
serv_name = os.getenv('SERVICE_NAME', 'proxy')
version = os.getenv('VERSION', 'unknown')

logging.config.fileConfig('logging.ini')
logging.info(f'Server "{serv_name}" will be started')
logging.info(f'Server "{serv_name} - {version}" will be started')
logging.getLogger().setLevel(logging.DEBUG if __debug__ else logging.INFO)

# read config file
Config.read()
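server.py imports `signal` and `functools` and defines `handle_SIGTERM(loop)`, whose body is not part of this diff. A plausible wiring is sketched below, only to illustrate how such a handler is attached to the asyncio loop; the project's actual shutdown logic may differ:

```python
# Sketch of attaching a SIGTERM handler to the asyncio loop (Unix only);
# handle_SIGTERM's real body is not shown in the diff above.
import asyncio
import functools
import signal

def handle_SIGTERM(loop: asyncio.AbstractEventLoop) -> None:
    for task in asyncio.all_tasks(loop):
        task.cancel()                    # let the loops unwind and close their streams

async def main() -> None:
    loop = asyncio.get_running_loop()
    loop.add_signal_handler(signal.SIGTERM, functools.partial(handle_SIGTERM, loop))
    await asyncio.sleep(3600)            # stand-in for serving inverter connections

if __name__ == '__main__':
    try:
        asyncio.run(main())
    except asyncio.CancelledError:
        pass
```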
@@ -12,6 +12,12 @@ def ContrDataSeq(): # Get Time Request message
    msg += b'\x49\x00\x00\x00\x02\x00\x0d\x04\x08\x49\x00\x00\x00\x00\x00\x07\xa1\x84\x49\x00\x00\x00\x01\x00\x0c\x50\x59\x49\x00\x00\x00\x4c\x00\x0d\x1f\x60\x49\x00\x00\x00\x00'
    return msg

@pytest.fixture
def InvDataSeq(): # Data indication from the controller
    msg = b'\x00\x00\x00\x06\x00\x00\x00\x0a\x54\x08\x4d\x69\x63\x72\x6f\x69\x6e\x76\x00\x00\x00\x14\x54\x04\x54\x53\x55\x4e\x00\x00\x00\x1E\x54\x07\x56\x35\x2e\x30\x2e\x31\x31\x00\x00\x00\x28'
    msg += b'\x54\x10\x54\x31\x37\x45\x37\x33\x30\x37\x30\x32\x31\x44\x30\x30\x36\x41\x00\x00\x00\x32\x54\x0a\x54\x53\x4f\x4c\x2d\x4d\x53\x36\x30\x30\x00\x00\x00\x3c\x54\x05\x41\x2c\x42\x2c\x43'
    return msg


def test_parse_control(ContrDataSeq):
    i = Infos()

@@ -19,37 +25,94 @@ def test_parse_control(ContrDataSeq):
        pass

    assert json.dumps(i.db) == json.dumps(
        {"collector": {"Collector_Fw_Version": "RSW_400_V1.00.06", "Chip_Type": "Raymon", "Chip_Model": "RSW-1-10001", "Trace_URL": "t.raymoniot.com", "Logger_URL": "logger.talent-monitoring.com", "Data_Up_Interval": 300}, "env": {"Signal_Strength": 100}, "total": {"Power_On_Time": 29}})

def test_build_ha_conf():
    i = Infos()
    d_json, id = next (i.ha_confs(prfx="tsun/garagendach/", snr='123'))
    assert id == 'out_power_123'
    assert d_json == json.dumps({"name": "Actual Power", "stat_t": "tsun/garagendach/grid", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "out_power_123", "val_tpl": "{{value_json['Output_Power'] | float}}", "unit_of_meas": "W", "dev": {"name": "Microinverter", "mdl": "MS-600", "ids": ["inverter_123"], "mf": "TSUN", "sa": "", "sw": "0.01", "hw": "Hw0.01"}})
        {"collector": {"Collector_Fw_Version": "RSW_400_V1.00.06", "Chip_Type": "Raymon", "Chip_Model": "RSW-1-10001", "Trace_URL": "t.raymoniot.com", "Logger_URL": "logger.talent-monitoring.com"}, "controller": {"Signal_Strength": 100, "Power_On_Time": 29, "Data_Up_Interval": 300}})

def test_build_ha_conf2():
def test_parse_inverter(InvDataSeq):
    i = Infos()
    for key, result in i.parse (InvDataSeq):
        pass

    assert json.dumps(i.db) == json.dumps(
        {"inverter": {"Product_Name": "Microinv", "Manufacturer": "TSUN", "Version": "V5.0.11", "Serial_Number": "T17E7307021D006A", "Equipment_Model": "TSOL-MS600"}})

def test_parse_cont_and_invert(ContrDataSeq, InvDataSeq):
    i = Infos()
    for key, result in i.parse (ContrDataSeq):
        pass

    for key, result in i.parse (InvDataSeq):
        pass

    assert json.dumps(i.db) == json.dumps(
        {
            "collector": {"Collector_Fw_Version": "RSW_400_V1.00.06", "Chip_Type": "Raymon", "Chip_Model": "RSW-1-10001", "Trace_URL": "t.raymoniot.com", "Logger_URL": "logger.talent-monitoring.com"}, "controller": {"Signal_Strength": 100, "Power_On_Time": 29, "Data_Up_Interval": 300},
            "inverter": {"Product_Name": "Microinv", "Manufacturer": "TSUN", "Version": "V5.0.11", "Serial_Number": "T17E7307021D006A", "Equipment_Model": "TSOL-MS600"}})


def test_build_ha_conf1(ContrDataSeq):
    i = Infos()
    tests = 0
    for d_json, id in i.ha_confs(prfx="tsun/garagendach/", snr='123'):
    for d_json, comp, id in i.ha_confs(prfx="tsun/garagendach/", snr='123'):

        if id == 'out_power_123':
            assert d_json == json.dumps({"name": "Actual Power", "stat_t": "tsun/garagendach/grid", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "out_power_123", "val_tpl": "{{value_json['Output_Power'] | float}}", "unit_of_meas": "W", "dev": {"name": "Microinverter", "mdl": "MS-600", "ids": ["inverter_123"], "mf": "TSUN", "sa": "", "sw": "0.01", "hw": "Hw0.01"}})
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/grid", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "out_power_123", "val_tpl": "{{value_json['Output_Power'] | float}}", "unit_of_meas": "W", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1

        elif id == 'daily_gen_123':
            assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "dev": {"name": "Microinverter", "mdl": "MS-600", "ids": ["inverter_123"], "mf": "TSUN", "sa": "", "sw": "0.01", "hw": "Hw0.01"}})
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "icon": "mdi:solar-power-variant", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1

        elif id == 'power_pv1_123':
            assert d_json == json.dumps({"name": "Power PV1", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv1_123", "val_tpl": "{{ (value_json['pv1']['Power'] | float)}}", "unit_of_meas": "W", "dev": {"name": "Microinverter", "mdl": "MS-600", "ids": ["inverter_123"], "mf": "TSUN", "sa": "", "sw": "0.01", "hw": "Hw0.01"}})
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv1_123", "val_tpl": "{{ (value_json['pv1']['Power'] | float)}}", "unit_of_meas": "W", "icon": "mdi:gauge", "dev": {"name": "Module PV1", "sa": "Module PV1", "via_device": "inverter_123", "ids": ["input_pv1_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1

        elif id == 'total_gen_123':
            assert d_json == json.dumps({"name": "Total Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total", "uniq_id": "total_gen_123", "val_tpl": "{{value_json['Total_Generation'] | float}}", "unit_of_meas": "kWh", "icon": "mdi:solar-power", "dev": {"name": "Microinverter", "mdl": "MS-600", "ids": ["inverter_123"], "mf": "TSUN", "sa": "", "sw": "0.01", "hw": "Hw0.01"}})
        elif id == 'power_pv2_123':
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv2_123", "val_tpl": "{{ (value_json['pv2']['Power'] | float)}}", "unit_of_meas": "W", "icon": "mdi:gauge", "dev": {"name": "Module PV2", "sa": "Module PV2", "via_device": "inverter_123", "ids": ["input_pv2_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1
    assert tests==4

def test_build_ha_conf3():
        elif id == 'signal_123':
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "icon": "mdi:wifi", "dev": {"name": "Controller", "sa": "Controller", "ids": ["controller_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1
    assert tests==5

def test_build_ha_conf2(ContrDataSeq, InvDataSeq):
    i = Infos()
    for d_json, id in i.ha_confs(prfx="tsun/garagendach/", snr='123'):
    for key, result in i.parse (ContrDataSeq):
        pass

    for key, result in i.parse (InvDataSeq):
        pass

    tests = 0
    for d_json, comp, id in i.ha_confs(prfx="tsun/garagendach/", snr='123'):

        if id == 'out_power_123':
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/grid", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "out_power_123", "val_tpl": "{{value_json['Output_Power'] | float}}", "unit_of_meas": "W", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "mdl": "TSOL-MS600", "mf": "TSUN", "sw": "V5.0.11", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1

        elif id == 'daily_gen_123':
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "icon": "mdi:solar-power-variant", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "mdl": "TSOL-MS600", "mf": "TSUN", "sw": "V5.0.11", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1

        elif id == 'power_pv1_123':
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv1_123", "val_tpl": "{{ (value_json['pv1']['Power'] | float)}}", "unit_of_meas": "W", "icon": "mdi:gauge", "dev": {"name": "Module PV1", "sa": "Module PV1", "via_device": "inverter_123", "ids": ["input_pv1_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1

        elif id == 'power_pv2_123':
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv2_123", "val_tpl": "{{ (value_json['pv2']['Power'] | float)}}", "unit_of_meas": "W", "icon": "mdi:gauge", "dev": {"name": "Module PV2", "sa": "Module PV2", "via_device": "inverter_123", "ids": ["input_pv2_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1

        elif id == 'signal_123':
            assert comp == 'sensor'
            assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "icon": "mdi:wifi", "dev": {"name": "Controller", "sa": "Controller", "mdl": "RSW-1-10001", "mf": "Raymon", "sw": "RSW_400_V1.00.06", "ids": ["controller_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
            tests +=1
    assert tests==5
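The abbreviated keys in the expected payloads are Home Assistant's MQTT-discovery shorthand: `stat_t` = `state_topic`, `dev_cla` = `device_class`, `stat_cla` = `state_class`, `uniq_id` = `unique_id`, `val_tpl` = `value_template`, `unit_of_meas` = `unit_of_measurement`, `dev` = `device` (with `ids`, `mdl`, `mf`, `sw`, `sa` for identifiers, model, manufacturer, software version and suggested area) and `o` = `origin`. The tests also pin down the new `ha_confs()` contract: it now yields `(payload, component, object_id)` triples instead of pairs, with the MQTT component (here always `'sensor'`) as the middle element. A hedged sketch of how a caller could turn those triples into discovery topics follows; the discovery prefix, the `node_id` and the `from infos import Infos` path are assumptions, not taken from the diff.

```python
# Sketch only: map ha_confs() triples to Home Assistant discovery topics of the
# form <discovery_prefix>/<component>/<node_id>/<object_id>/config.
# Discovery prefix, node_id and the import path are assumptions.
from infos import Infos


def discovery_messages(infos: Infos, snr: str, node_id: str = 'tsun_proxy'):
    for payload, comp, obj_id in infos.ha_confs(prfx=f'tsun/{snr}/', snr=snr):
        topic = f'homeassistant/{comp}/{node_id}/{obj_id}/config'
        yield topic, payload
```

Publishing each `(topic, payload)` pair as a retained MQTT message is what makes the controller, inverter and PV-input devices appear automatically in Home Assistant.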
@@ -66,20 +66,17 @@ services:
  ####### T S U N - P R O X Y ######
  tsun-proxy:
    container_name: tsun-proxy
    image: docker.io/sallius/tsun-gen3-proxy:latest
    build:
      context: https://github.com/s-allius/tsun-gen3-proxy.git#main:app
      args:
        - UID=1026
    image: ghcr.io/s-allius/tsun-gen3-proxy:latest
    restart: unless-stopped
    depends_on:
      - mqtt
    environment:
      - TZ=Europe/Brussels
      - SERVICE_NAME=tsun-proxy
      - UID=${UID:-1000}
      - GID=${GID:-1000}
    dns:
      - 8.8.8.8
      - 4.4.4.4
      # ${VAR:-default} falls back to the default when VAR is not set in the
      # shell environment or in a .env file next to the compose file
      - ${DNS1:-8.8.8.8}
      - ${DNS2:-4.4.4.4}
    ports:
      - 5005:5005
    volumes: