Compare commits


20 Commits

Author SHA1 Message Date
Stefan Allius
8b4a94bfcb Version 0.3.0 2023-10-10 20:45:12 +02:00
Stefan Allius
98dab7db99 Version 0.3.0 2023-10-10 20:17:04 +02:00
Stefan Allius
42ae95fd1c remove --no-cache for release candidates (rc) 2023-10-10 20:15:10 +02:00
Stefan Allius
9ffd105278 classify more value for diagnostics 2023-10-10 20:03:05 +02:00
Stefan Allius
97f426269f switch to python 3.12 2023-10-09 22:21:00 +02:00
Stefan Allius
c7bf3f2e44 formatting 2023-10-09 20:48:46 +02:00
Stefan Allius
2781bf3a14 Independence from TSUN 2023-10-09 20:47:05 +02:00
Stefan Allius
fcd3fddb19 optimize and reduce logging 2023-10-09 20:02:30 +02:00
Stefan Allius
88cdcabd6f use abbreviation 'ic' for icon 2023-10-09 19:58:37 +02:00
Stefan Allius
1f2f359188 optimize and reduce logging 2023-10-09 19:57:49 +02:00
Stefan Allius
2dd09288d5 bum aiomqtt version to 1.2.1 2023-10-08 16:32:24 +02:00
Stefan Allius
5c5c3bc926 Merge pull request #14 from s-allius/reduze-size
Reduze size
2023-10-07 23:10:40 +02:00
Stefan Allius
2cf7a2db36 Version 0.2.0 2023-10-07 23:08:39 +02:00
Stefan Allius
3225566b9b fix formating of a log message 2023-10-07 21:24:49 +02:00
Stefan Allius
fa567f68c0 - disable DEBUG log for releases
- support building of release candidates
2023-10-07 21:14:57 +02:00
Stefan Allius
e1536cb697 adapt log levels, optimize expensive hex dump logs 2023-10-07 21:03:49 +02:00
Stefan Allius
b06d832504 set log level to DEBUG for dev versions 2023-10-07 20:58:18 +02:00
Stefan Allius
ed14ed484b add build support for release candidates (rc) 2023-10-07 20:55:26 +02:00
Stefan Allius
ddba3f6285 optimize and update some comments 2023-10-07 16:39:39 +02:00
Stefan Allius
8264cc6d00 reduce continer size ans security attack surface 2023-10-07 16:20:40 +02:00
12 changed files with 125 additions and 81 deletions

View File

@@ -7,6 +7,27 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [Unreleased] ## [Unreleased]
## [0.3.0] - 2023-10-10
❗Because some values are now classified as diagnostics, the MQTT devices of the controller and inverter should be deleted in Home Assistant before updating to version '0.3.0'. They are recreated automatically after the update; the measurement data is retained.
### Changes
- optimize and reduce logging
- switch to python 3.12
- classify some values for diagnostics
## [0.2.0] - 2023-10-07
This version halves the size of the Docker image and reduces the security attack surface by omitting unneeded code. The feature set is identical to the previous release, version 0.1.0.
### Changes
- move from slim-bookworm to an alpine base image
- install python requirements with pip wheel
- disable DEBUG log for releases
- support building of release candidates
## [0.1.0] - 2023-10-06 ## [0.1.0] - 2023-10-06
- refactoring of the connection classes - refactoring of the connection classes

View File

@@ -7,8 +7,8 @@
<p align="center"> <p align="center">
<a href="https://opensource.org/licenses/BSD-3-Clause"><img alt="License: BSD-3-Clause" src="https://img.shields.io/badge/License-BSD_3--Clause-green.svg"></a> <a href="https://opensource.org/licenses/BSD-3-Clause"><img alt="License: BSD-3-Clause" src="https://img.shields.io/badge/License-BSD_3--Clause-green.svg"></a>
<a href="https://www.python.org/downloads/release/python-3110/"><img alt="Supported Python versions" src="https://img.shields.io/badge/python-3.11-blue.svg"></a> <a href="https://www.python.org/downloads/release/python-3110/"><img alt="Supported Python versions" src="https://img.shields.io/badge/python-3.11-blue.svg"></a>
<a href="https://sbtinstruments.github.io/aiomqtt/introduction.html"><img alt="Supported Python versions" src="https://img.shields.io/badge/aiomqtt-1.2.0-lightblue.svg"></a> <a href="https://sbtinstruments.github.io/aiomqtt/introduction.html"><img alt="Supported aiomqtt versions" src="https://img.shields.io/badge/aiomqtt-1.2.1-lightblue.svg"></a>
<a href="https://toml.io/en/v1.0.0"><img alt="Supported Python versions" src="https://img.shields.io/badge/toml-1.0.0-lightblue.svg"></a> <a href="https://toml.io/en/v1.0.0"><img alt="Supported toml versions" src="https://img.shields.io/badge/toml-1.0.0-lightblue.svg"></a>
</p> </p>
@@ -16,13 +16,16 @@
### ###
# Overview # Overview
The "TSUN Gen3 Micro-Inverter" proxy enables a reliable connection between TSUN third generation inverters and an MQTT broker. With the proxy, you can easily retrieve real-time values such as power, current and daily energy and integrate the inverter into typical home automations. This works even without an internet connection. The optional connection to the TSUN Cloud can be disabled! This proxy enables a reliable connection between TSUN third generation inverters and an MQTT broker. With the proxy, you can easily retrieve real-time values such as power, current and daily energy and integrate the inverter into typical home automations. This works even without an internet connection. The optional connection to the TSUN Cloud can be disabled!
In detail, the inverter establishes a TCP connection to the TSUN cloud to transmit current measured values every 300 seconds. To be able to forward the measurement data to an MQTT broker, the proxy must be looped into this TCP connection. In detail, the inverter establishes a TCP connection to the TSUN cloud to transmit current measured values every 300 seconds. To be able to forward the measurement data to an MQTT broker, the proxy must be looped into this TCP connection.
Through this, the inverter then establishes a connection to the proxy and the proxy establishes another connection to the TSUN Cloud. The transmitted data is interpreted by the proxy and then passed on to both the TSUN Cloud and the MQTT broker. The connection to the TSUN Cloud is optional and can be switched off in the configuration (default is on). Then no more data is sent to the Internet, but no more remote updates of firmware and operating parameters (e.g. rated power, grid parameters) are possible. Through this, the inverter then establishes a connection to the proxy and the proxy establishes another connection to the TSUN Cloud. The transmitted data is interpreted by the proxy and then passed on to both the TSUN Cloud and the MQTT broker. The connection to the TSUN Cloud is optional and can be switched off in the configuration (default is on). Then no more data is sent to the Internet, but no more remote updates of firmware and operating parameters (e.g. rated power, grid parameters) are possible.
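The data path described above (inverter connects to the proxy, the proxy opens a second connection to the TSUN Cloud, data is relayed in both directions) is essentially a transparent TCP relay. A minimal asyncio sketch of that pattern — illustrative only; the cloud endpoint is a placeholder, and the real proxy additionally parses each frame and publishes to MQTT:

```python
import asyncio

async def pipe(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    """Copy bytes from one side of the connection to the other."""
    try:
        while data := await reader.read(4096):
            writer.write(data)      # the real proxy would also parse the
            await writer.drain()    # frame here and publish it to MQTT
    finally:
        writer.close()

async def handle_inverter(reader: asyncio.StreamReader,
                          writer: asyncio.StreamWriter) -> None:
    """For each inverter connection, open an upstream connection to the
    cloud and relay both directions concurrently."""
    # host/port are placeholders; the actual endpoint comes from the config
    up_r, up_w = await asyncio.open_connection('cloud.example.invalid', 5005)
    await asyncio.gather(pipe(reader, up_w), pipe(up_r, writer))
```

Disabling the cloud connection in the configuration then simply means the upstream leg is never opened and data is only published to the broker.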
By means of `docker` a simple installation and operation is possible. By using `docker-compose`, a complete stack of proxy, `MQTT broker` and `home-assistant` can be started easily. By means of `docker` a simple installation and operation is possible. By using `docker-compose`, a complete stack of proxy, `MQTT broker` and `home-assistant` can be started easily.
###
This project is not related to the company TSUN. It is a private initiative that aims to connect TSUN inverters with an MQTT broker. There is no support and no warranty from TSUN.
###
``` ```
❗An essential requirement is that the proxy can be looped into the connection ❗An essential requirement is that the proxy can be looped into the connection

View File

@@ -2,40 +2,41 @@ ARG SERVICE_NAME="tsun-proxy"
ARG UID=1000 ARG UID=1000
ARG GID=1000 ARG GID=1000
# set base image (host OS) #
FROM python:3.11-slim-bookworm AS builder # first stage for our base image
FROM python:3.12-alpine AS base
USER root USER root
# install gosu for a better su+exec command RUN apk update && \
RUN set -eux; \ apk upgrade
apt-get update; \ RUN apk add --no-cache su-exec
apt-get install -y gosu; \
rm -rf /var/lib/apt/lists/*; \
# verify that the binary works
gosu nobody true
RUN pip install --upgrade pip #
# second stage for building wheels packages
FROM base as builder
RUN apk add --no-cache build-base && \
python -m pip install --no-cache-dir -U pip wheel
# copy the dependencies file to the working directory # copy the dependencies file to the root dir and install requirements
COPY ./requirements.txt . COPY ./requirements.txt /root/
RUN python -OO -m pip wheel --no-cache-dir --wheel-dir=/root/wheels -r /root/requirements.txt
# install dependencies
RUN pip install --user -r requirements.txt
# #
# second unnamed stage # third stage for our runtime image
FROM python:3.11-slim-bookworm FROM base as runtime
ARG SERVICE_NAME ARG SERVICE_NAME
ARG VERSION ARG VERSION
ARG UID ARG UID
ARG GID ARG GID
ARG LOG_LVL
ENV VERSION=$VERSION ENV VERSION=$VERSION
ENV SERVICE_NAME=$SERVICE_NAME ENV SERVICE_NAME=$SERVICE_NAME
ENV UID=$UID ENV UID=$UID
ENV GID=$GID ENV GID=$GID
ENV LOG_LVL=$LOG_LVL
# set the working directory in the container # set the working directory in the container
@@ -43,18 +44,16 @@ WORKDIR /home/$SERVICE_NAME
# update PATH environment variable # update PATH environment variable
ENV HOME=/home/$SERVICE_NAME ENV HOME=/home/$SERVICE_NAME
ENV PATH=/home/$SERVICE_NAME/.local:$PATH
VOLUME ["/home/$SERVICE_NAME/log", "/home/$SERVICE_NAME/config"] VOLUME ["/home/$SERVICE_NAME/log", "/home/$SERVICE_NAME/config"]
# copy only the dependencies installation from the 1st stage image # install the requirements from the wheels packages from the builder stage
COPY --from=builder --chown=$SERVICE_NAME:$SERVICE_NAME /root/.local /home/$SERVICE_NAME/.local COPY --from=builder /root/wheels /root/wheels
COPY --from=builder /usr/sbin/gosu /usr/sbin/gosu RUN python -m pip install --no-cache --no-index /root/wheels/* && \
rm -rf /root/wheels
COPY entrypoint.sh /root/entrypoint.sh
RUN chmod +x /root/entrypoint.sh
# copy the content of the local src and config directory to the working directory # copy the content of the local src and config directory to the working directory
COPY --chmod=0700 entrypoint.sh /root/entrypoint.sh
COPY config . COPY config .
COPY src . COPY src .

View File

@@ -9,19 +9,21 @@ arr=(${VERSION//./ })
MAJOR=${arr[0]} MAJOR=${arr[0]}
IMAGE=tsun-gen3-proxy IMAGE=tsun-gen3-proxy
if [[ $1 == dev ]];then if [[ $1 == dev ]] || [[ $1 == rc ]] ;then
IMAGE=docker.io/sallius/${IMAGE} IMAGE=docker.io/sallius/${IMAGE}
VERSION=${VERSION}-dev VERSION=${VERSION}-$1
elif [[ $1 == rel ]];then elif [[ $1 == rel ]];then
IMAGE=ghcr.io/s-allius/${IMAGE} IMAGE=ghcr.io/s-allius/${IMAGE}
else else
echo argument missing! echo argument missing!
echo try: $0 '[dev|rel]' echo try: $0 '[dev|rc|rel]'
exit 1 exit 1
fi fi
echo version: $VERSION build-date: $BUILD_DATE image: $IMAGE echo version: $VERSION build-date: $BUILD_DATE image: $IMAGE
if [[ $1 == dev ]];then if [[ $1 == dev ]];then
docker build --build-arg "VERSION=${VERSION}" --build-arg "LOG_LVL=DEBUG" --label "org.label-schema.build-date=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" -t ${IMAGE}:latest app
elif [[ $1 == rc ]];then
docker build --build-arg "VERSION=${VERSION}" --label "org.label-schema.build-date=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" -t ${IMAGE}:latest app docker build --build-arg "VERSION=${VERSION}" --label "org.label-schema.build-date=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" -t ${IMAGE}:latest app
elif [[ $1 == rel ]];then elif [[ $1 == rel ]];then
docker build --no-cache --build-arg "VERSION=${VERSION}" --label "org.label-schema.build-date=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" -t ${IMAGE}:latest -t ${IMAGE}:${MAJOR} -t ${IMAGE}:${VERSION} app docker build --no-cache --build-arg "VERSION=${VERSION}" --label "org.label-schema.build-date=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" -t ${IMAGE}:latest -t ${IMAGE}:${MAJOR} -t ${IMAGE}:${VERSION} app
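The tagging scheme in the build script above — `dev` and `rc` builds get a single suffixed `:latest` tag on the Docker Hub image, while `rel` builds get `:latest`, major and full-version tags on the GHCR image — can be mirrored compactly. A Python sketch of the same decision logic (illustrative, not part of the repository):

```python
def image_tags(version: str, kind: str) -> list[str]:
    """Mirror build.sh: dev/rc builds get one tag, releases get three."""
    major = version.split('.')[0]
    if kind in ('dev', 'rc'):
        image = 'docker.io/sallius/tsun-gen3-proxy'
        return [f'{image}:latest']     # VERSION itself becomes e.g. 0.3.0-rc
    if kind == 'rel':
        image = 'ghcr.io/s-allius/tsun-gen3-proxy'
        return [f'{image}:latest', f'{image}:{major}', f'{image}:{version}']
    raise ValueError('try: build.sh [dev|rc|rel]')
```

Note that only `rel` builds pass `--no-cache`, and only `dev` builds set `LOG_LVL=DEBUG`.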

View File

@@ -10,17 +10,15 @@ echo "#"
if [ "$user" = '0' ]; then if [ "$user" = '0' ]; then
mkdir -p /home/$SERVICE_NAME/log /home/$SERVICE_NAME/config mkdir -p /home/$SERVICE_NAME/log /home/$SERVICE_NAME/config
if id $SERVICE_NAME ; then if ! id $SERVICE_NAME &> /dev/null; then
echo "user still exists"
else
addgroup --gid $GID $SERVICE_NAME 2> /dev/null addgroup --gid $GID $SERVICE_NAME 2> /dev/null
adduser --ingroup $SERVICE_NAME --shell /bin/false --disabled-password --no-create-home --comment "" --uid $UID $SERVICE_NAME adduser -G $SERVICE_NAME -s /bin/false -D -H -g "" -u $UID $SERVICE_NAME
fi fi
chown -R $SERVICE_NAME:$SERVICE_NAME /home/$SERVICE_NAME || true chown -R $SERVICE_NAME:$SERVICE_NAME /home/$SERVICE_NAME || true
echo "######################################################" echo "######################################################"
echo "#" echo "#"
exec gosu $SERVICE_NAME "$@" exec su-exec $SERVICE_NAME "$@"
else else
exec "$@" exec "$@"
fi fi

View File

@@ -21,7 +21,7 @@ class AsyncStream(Message):
Our puplic methods Our puplic methods
''' '''
def set_serial_no(self, serial_no : str): def set_serial_no(self, serial_no : str):
logger.info(f'SerialNo: {serial_no}') logger.debug(f'SerialNo: {serial_no}')
if self.unique_id != serial_no: if self.unique_id != serial_no:
@@ -40,7 +40,7 @@ class AsyncStream(Message):
if not inverters['allow_all']: if not inverters['allow_all']:
self.unique_id = None self.unique_id = None
logger.error('ignore message from unknow inverter!') logger.warning(f'ignore message from unknow inverter! (SerialNo: {serial_no})')
return return
self.unique_id = serial_no self.unique_id = serial_no
@@ -67,7 +67,7 @@ class AsyncStream(Message):
except (ConnectionResetError, except (ConnectionResetError,
ConnectionAbortedError, ConnectionAbortedError,
RuntimeError) as error: RuntimeError) as error:
logger.error(f'In loop for {self.addr}: {error}') logger.warning(f'In loop for {self.addr}: {error}')
self.close() self.close()
return return
except Exception: except Exception:

View File

@@ -56,7 +56,7 @@ class Infos:
0x0000044c: {'name':['grid', 'Current'], 'level': logging.DEBUG, 'unit': 'A', 'ha':{'dev':'inverter', 'dev_cla': 'current', 'stat_cla': 'measurement', 'id':'out_cur_', 'fmt':'| float','name': 'Grid Current'}}, 0x0000044c: {'name':['grid', 'Current'], 'level': logging.DEBUG, 'unit': 'A', 'ha':{'dev':'inverter', 'dev_cla': 'current', 'stat_cla': 'measurement', 'id':'out_cur_', 'fmt':'| float','name': 'Grid Current'}},
0x000004b0: {'name':['grid', 'Frequency'], 'level': logging.DEBUG, 'unit': 'Hz', 'ha':{'dev':'inverter', 'dev_cla': 'frequency', 'stat_cla': 'measurement', 'id':'out_freq_', 'fmt':'| float','name': 'Grid Frequency'}}, 0x000004b0: {'name':['grid', 'Frequency'], 'level': logging.DEBUG, 'unit': 'Hz', 'ha':{'dev':'inverter', 'dev_cla': 'frequency', 'stat_cla': 'measurement', 'id':'out_freq_', 'fmt':'| float','name': 'Grid Frequency'}},
0x00000640: {'name':['grid', 'Output_Power'], 'level': logging.INFO, 'unit': 'W', 'ha':{'dev':'inverter', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'out_power_', 'fmt':'| float','name': 'Power'}}, 0x00000640: {'name':['grid', 'Output_Power'], 'level': logging.INFO, 'unit': 'W', 'ha':{'dev':'inverter', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'out_power_', 'fmt':'| float','name': 'Power'}},
0x000005dc: {'name':['env', 'Rated_Power'], 'level': logging.DEBUG, 'unit': 'W', 'ha':{'dev':'inverter', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'rated_power_','fmt':'| int', 'name': 'Rated Power'}}, 0x000005dc: {'name':['env', 'Rated_Power'], 'level': logging.DEBUG, 'unit': 'W', 'ha':{'dev':'inverter', 'dev_cla': 'power', 'stat_cla': 'measurement', 'id':'rated_power_','fmt':'| int', 'name': 'Rated Power','ent_cat':'diagnostic'}},
0x00000514: {'name':['env', 'Inverter_Temp'], 'level': logging.DEBUG, 'unit': '°C', 'ha':{'dev':'inverter', 'dev_cla': 'temperature', 'stat_cla': 'measurement', 'id':'temp_', 'fmt':'| int','name': 'Temperature'}}, 0x00000514: {'name':['env', 'Inverter_Temp'], 'level': logging.DEBUG, 'unit': '°C', 'ha':{'dev':'inverter', 'dev_cla': 'temperature', 'stat_cla': 'measurement', 'id':'temp_', 'fmt':'| int','name': 'Temperature'}},
# input measures: # input measures:
@@ -85,9 +85,9 @@ class Infos:
0x00000bb8: {'name':['total', 'Total_Generation'], 'level': logging.INFO, 'unit': 'kWh', 'ha':{'dev':'inverter', 'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_', 'fmt':'| float','name': 'Total Generation', 'icon':'mdi:solar-power'}}, 0x00000bb8: {'name':['total', 'Total_Generation'], 'level': logging.INFO, 'unit': 'kWh', 'ha':{'dev':'inverter', 'dev_cla': 'energy', 'stat_cla': 'total', 'id':'total_gen_', 'fmt':'| float','name': 'Total Generation', 'icon':'mdi:solar-power'}},
# controller: # controller:
0x000c3500: {'name':['controller', 'Signal_Strength'], 'level': logging.DEBUG, 'unit': '%' , 'ha':{'dev':'controller', 'dev_cla': None, 'stat_cla': 'measurement', 'id':'signal_', 'fmt':'| int', 'name': 'Signal Strength', 'icon':'mdi:wifi'}}, 0x000c3500: {'name':['controller', 'Signal_Strength'], 'level': logging.DEBUG, 'unit': '%' , 'ha':{'dev':'controller', 'dev_cla': None, 'stat_cla': 'measurement', 'id':'signal_', 'fmt':'| int', 'name': 'Signal Strength', 'icon':'mdi:wifi','ent_cat':'diagnostic'}},
0x000c96a8: {'name':['controller', 'Power_On_Time'], 'level': logging.DEBUG, 'unit': 's', 'ha':{'dev':'controller', 'dev_cla': 'duration', 'stat_cla': 'measurement', 'id':'power_on_time_', 'name': 'Power on Time', 'val_tpl':"{{ (value_json['Power_On_Time'] | float)}}", 'nat_prc':'3'}}, 0x000c96a8: {'name':['controller', 'Power_On_Time'], 'level': logging.DEBUG, 'unit': 's', 'ha':{'dev':'controller', 'dev_cla': 'duration', 'stat_cla': 'measurement', 'id':'power_on_time_', 'name': 'Power on Time', 'val_tpl':"{{ (value_json['Power_On_Time'] | float)}}", 'nat_prc':'3','ent_cat':'diagnostic'}},
0x000cf850: {'name':['controller', 'Data_Up_Interval'], 'level': logging.DEBUG, 'unit': 's', 'ha':{'dev':'controller', 'dev_cla': None, 'stat_cla': 'measurement', 'id':'data_up_intval_', 'fmt':'| int', 'name': 'Data Up Interval', 'icon':'mdi:update'}}, 0x000cf850: {'name':['controller', 'Data_Up_Interval'], 'level': logging.DEBUG, 'unit': 's', 'ha':{'dev':'controller', 'dev_cla': None, 'stat_cla': 'measurement', 'id':'data_up_intval_', 'fmt':'| int', 'name': 'Data Up Interval', 'icon':'mdi:update','ent_cat':'diagnostic'}},
} }
@@ -149,10 +149,12 @@ class Infos:
if 'unit' in row: if 'unit' in row:
attr['unit_of_meas'] = row['unit'] # optional add a 'unit_of_meas' e.g. 'W' attr['unit_of_meas'] = row['unit'] # optional add a 'unit_of_meas' e.g. 'W'
if 'icon' in ha: if 'icon' in ha:
attr['icon'] = ha['icon'] # optional add an icon for the entity attr['ic'] = ha['icon'] # optional add an icon for the entity
if 'nat_prc' in ha: if 'nat_prc' in ha:
attr['sug_dsp_prc'] = ha['nat_prc'] # optional add the precison of floats attr['sug_dsp_prc'] = ha['nat_prc'] # optional add the precison of floats
if 'ent_cat' in ha:
attr['ent_cat'] = ha['ent_cat'] # diagnostic, config
# eg. 'dev':{'name':'Microinverter','mdl':'MS-600','ids':["inverter_123"],'mf':'TSUN','sa': 'auf Garagendach'} # eg. 'dev':{'name':'Microinverter','mdl':'MS-600','ids':["inverter_123"],'mf':'TSUN','sa': 'auf Garagendach'}
# attr['dev'] = {'name':'Microinverter','mdl':'MS-600','ids':[f'inverter_{snr}'],'mf':'TSUN','sa': 'auf Garagendach'} # attr['dev'] = {'name':'Microinverter','mdl':'MS-600','ids':[f'inverter_{snr}'],'mf':'TSUN','sa': 'auf Garagendach'}
if 'dev' in ha: if 'dev' in ha:
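The new `ent_cat` branch above extends the translation from the internal info table to Home Assistant's abbreviated discovery keys (`ic` for icon, `ent_cat` for entity category). A reduced sketch of that translation — field names taken from the diff, function name illustrative:

```python
def build_attrs(row: dict, ha: dict) -> dict:
    """Translate one info-table row into abbreviated HA discovery attributes."""
    attr = {}
    if 'unit' in row:
        attr['unit_of_meas'] = row['unit']   # e.g. 'W'
    if 'icon' in ha:
        attr['ic'] = ha['icon']              # abbreviated icon key
    if 'nat_prc' in ha:
        attr['sug_dsp_prc'] = ha['nat_prc']  # suggested display precision
    if 'ent_cat' in ha:
        attr['ent_cat'] = ha['ent_cat']      # 'diagnostic' or 'config'
    return attr

attrs = build_attrs({'unit': '%'}, {'icon': 'mdi:wifi', 'ent_cat': 'diagnostic'})
```

Entities with `ent_cat: diagnostic` are grouped into Home Assistant's diagnostics section, which is why the existing MQTT devices should be recreated after this update.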

View File

@@ -4,7 +4,8 @@ from async_stream import AsyncStream
from mqtt import Mqtt from mqtt import Mqtt
#import gc #import gc
logger = logging.getLogger('conn') #logger = logging.getLogger('conn')
logger_mqtt = logging.getLogger('mqtt')
@@ -21,7 +22,7 @@ class Inverter(AsyncStream):
async def server_loop(self, addr): async def server_loop(self, addr):
'''Loop for receiving messages from the inverter (server-side)''' '''Loop for receiving messages from the inverter (server-side)'''
logger.info(f'Accept connection from {addr}') logging.info(f'Accept connection from {addr}')
await self.loop() await self.loop()
logging.info(f'Server loop stopped for {addr}') logging.info(f'Server loop stopped for {addr}')
@@ -77,7 +78,7 @@ class Inverter(AsyncStream):
for key in self.new_data: for key in self.new_data:
if self.new_data[key] and key in db: if self.new_data[key] and key in db:
data_json = json.dumps(db[key]) data_json = json.dumps(db[key])
logger.info(f'{key}: {data_json}') logger_mqtt.debug(f'{key}: {data_json}')
await self.mqtt.publish(f"{self.entitiy_prfx}{self.node_id}{key}", data_json) await self.mqtt.publish(f"{self.entitiy_prfx}{self.node_id}{key}", data_json)
self.new_data[key] = False self.new_data[key] = False
@@ -85,7 +86,7 @@ class Inverter(AsyncStream):
'''register all our topics at home assistant''' '''register all our topics at home assistant'''
try: try:
for data_json, component, id in self.db.ha_confs(self.entitiy_prfx + self.node_id, self.unique_id, self.sug_area): for data_json, component, id in self.db.ha_confs(self.entitiy_prfx + self.node_id, self.unique_id, self.sug_area):
#logger.debug(f'MQTT Register: {data_json}') logger_mqtt.debug(f'MQTT Register: {data_json}')
await self.mqtt.publish(f"{self.discovery_prfx}{component}/{self.node_id}{id}/config", data_json) await self.mqtt.publish(f"{self.discovery_prfx}{component}/{self.node_id}{id}/config", data_json)
except Exception: except Exception:
logging.error( logging.error(

View File

@@ -11,30 +11,32 @@ keys=console_formatter,file_formatter
level=DEBUG level=DEBUG
handlers=console_handler,file_handler_name1 handlers=console_handler,file_handler_name1
[logger_mesg]
level=DEBUG
handlers=console_handler,file_handler_name1,file_handler_name2
propagate=0
qualname=msg
[logger_conn] [logger_conn]
level=DEBUG level=DEBUG
handlers=console_handler,file_handler_name1,file_handler_name2 handlers=console_handler,file_handler_name1
propagate=0 propagate=0
qualname=conn qualname=conn
[logger_data]
level=DEBUG
handlers=file_handler_name1,file_handler_name2
propagate=0
qualname=data
[logger_mqtt] [logger_mqtt]
level=INFO level=INFO
handlers=console_handler,file_handler_name1 handlers=console_handler,file_handler_name1
propagate=0 propagate=0
qualname=mqtt qualname=mqtt
[logger_data]
level=DEBUG
handlers=file_handler_name1
propagate=0
qualname=data
[logger_mesg]
level=DEBUG
handlers=file_handler_name2
propagate=0
qualname=msg
[logger_tracer] [logger_tracer]
level=INFO level=INFO
handlers=file_handler_name2 handlers=file_handler_name2
@@ -60,9 +62,9 @@ args=('log/trace.log', when:='midnight')
[formatter_console_formatter] [formatter_console_formatter]
format=%(asctime)s %(levelname)5s | %(name)4s | %(message)s' format=%(asctime)s %(levelname)5s | %(name)4s | %(message)s'
datefmt='%d-%m-%Y %H:%M:%S datefmt='%Y-%m-%d %H:%M:%S
[formatter_file_formatter] [formatter_file_formatter]
format=%(asctime)s %(levelname)5s | %(name)4s | %(message)s' format=%(asctime)s %(levelname)5s | %(name)4s | %(message)s'
datefmt='%d-%m-%Y %H:%M:%S datefmt='%Y-%m-%d %H:%M:%S
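Each `[logger_*]` section above is bound via `qualname` to the name the code passes to `getLogger`, and `propagate=0` keeps those records out of the root logger's handlers. A sketch of the equivalent programmatic setup (illustrative; the project itself loads this from `logging.ini`):

```python
import logging

# matches [logger_conn]: qualname=conn, level=DEBUG, propagate=0
conn = logging.getLogger('conn')
conn.setLevel(logging.DEBUG)
conn.propagate = False

# matches [logger_mqtt]: qualname=mqtt, level=INFO, propagate=0
mqtt = logging.getLogger('mqtt')
mqtt.setLevel(logging.INFO)
mqtt.propagate = False
```

The restructuring in this diff routes `conn` and `mqtt` to the console plus the main log file, while `data` and `msg` go only to files.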

View File

@@ -18,7 +18,8 @@ def hex_dump_memory(level, info, data, num):
lines = [] lines = []
lines.append(info) lines.append(info)
tracer = logging.getLogger('tracer') tracer = logging.getLogger('tracer')
if not tracer.isEnabledFor(level): return
#data = list((num * ctypes.c_byte).from_address(ptr)) #data = list((num * ctypes.c_byte).from_address(ptr))
@@ -294,7 +295,7 @@ class Message(metaclass=IterRegistry):
def msg_unknown(self): def msg_unknown(self):
logger.error (f"Unknow Msg: ID:{self.msg_id}") logger.warning (f"Unknow Msg: ID:{self.msg_id}")
self.forward(self._recv_buffer, self.header_len+self.data_len) self.forward(self._recv_buffer, self.header_len+self.data_len)
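The early return added to `hex_dump_memory` is the standard guard for expensive log output: check `isEnabledFor` before building the dump, so disabled tracing costs almost nothing. A minimal sketch of the pattern (the dump formatting here is simplified, not the project's exact layout):

```python
import logging

def format_dump(info: str, data: bytes) -> str:
    """Render an offset + hex-bytes dump, 16 bytes per line."""
    lines = [info]
    for ofs in range(0, len(data), 16):
        chunk = data[ofs:ofs + 16]
        lines.append(f'{ofs:04x}  ' + ' '.join(f'{b:02x}' for b in chunk))
    return '\n'.join(lines)

def hex_dump_memory(level: int, info: str, data: bytes) -> None:
    tracer = logging.getLogger('tracer')
    if not tracer.isEnabledFor(level):
        return                      # skip the costly formatting entirely
    tracer.log(level, format_dump(info, data))
```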

View File

@@ -32,8 +32,16 @@ def handle_SIGTERM(loop):
logging.info('Shutdown complete') logging.info('Shutdown complete')
def get_log_level() -> int:
'''checks if LOG_LVL is set in the environment and returns the corresponding logging.LOG_LEVEL'''
log_level = os.getenv('LOG_LVL', 'INFO')
if log_level== 'DEBUG':
log_level = logging.DEBUG
elif log_level== 'WARN':
log_level = logging.WARNING
else:
log_level = logging.INFO
return log_level
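The `get_log_level` helper above maps the `LOG_LVL` environment variable (set via the new Dockerfile build arg) to a `logging` constant, with anything unrecognised falling back to `INFO`. A compact equivalent with the same semantics, sketched for testing:

```python
import logging
import os

_LEVELS = {'DEBUG': logging.DEBUG, 'WARN': logging.WARNING}

def get_log_level() -> int:
    """Return the level for LOG_LVL; unknown or unset values mean INFO."""
    return _LEVELS.get(os.getenv('LOG_LVL', 'INFO'), logging.INFO)

os.environ['LOG_LVL'] = 'WARN'
level_warn = get_log_level()
os.environ['LOG_LVL'] = 'VERBOSE'   # not in the mapping -> INFO fallback
level_fallback = get_log_level()
```

This is what lets `build.sh dev` produce DEBUG-logging images while release images default to INFO.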
if __name__ == "__main__": if __name__ == "__main__":
@@ -41,16 +49,23 @@ if __name__ == "__main__":
# Setup our daily, rotating logger # Setup our daily, rotating logger
# #
serv_name = os.getenv('SERVICE_NAME', 'proxy') serv_name = os.getenv('SERVICE_NAME', 'proxy')
version = os.getenv('VERSION', 'unknown') version = os.getenv('VERSION', 'unknown')
logging.config.fileConfig('logging.ini') logging.config.fileConfig('logging.ini')
logging.info(f'Server "{serv_name} - {version}" will be started') logging.info(f'Server "{serv_name} - {version}" will be started')
logging.getLogger().setLevel(logging.DEBUG if __debug__ else logging.INFO)
# set lowest-severity for 'root', 'msg', 'conn' and 'data' logger
log_level = get_log_level()
logging.getLogger().setLevel(log_level)
logging.getLogger('msg').setLevel(log_level)
logging.getLogger('conn').setLevel(log_level)
logging.getLogger('data').setLevel(log_level)
# read config file # read config file
Config.read() Config.read()
loop = asyncio.get_event_loop() loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
# call Mqtt singleton to establisch the connection to the mqtt broker # call Mqtt singleton to establisch the connection to the mqtt broker
mqtt = Mqtt() mqtt = Mqtt()

View File

@@ -61,22 +61,22 @@ def test_build_ha_conf1(ContrDataSeq):
elif id == 'daily_gen_123': elif id == 'daily_gen_123':
assert comp == 'sensor' assert comp == 'sensor'
assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "icon": "mdi:solar-power-variant", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}}) assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "ic": "mdi:solar-power-variant", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1 tests +=1
elif id == 'power_pv1_123': elif id == 'power_pv1_123':
assert comp == 'sensor' assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv1_123", "val_tpl": "{{ (value_json['pv1']['Power'] | float)}}", "unit_of_meas": "W", "icon": "mdi:gauge", "dev": {"name": "Module PV1", "sa": "Module PV1", "via_device": "inverter_123", "ids": ["input_pv1_123"]}, "o": {"name": "proxy", "sw": "unknown"}}) assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv1_123", "val_tpl": "{{ (value_json['pv1']['Power'] | float)}}", "unit_of_meas": "W", "ic": "mdi:gauge", "dev": {"name": "Module PV1", "sa": "Module PV1", "via_device": "inverter_123", "ids": ["input_pv1_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1 tests +=1
elif id == 'power_pv2_123': elif id == 'power_pv2_123':
assert comp == 'sensor' assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv2_123", "val_tpl": "{{ (value_json['pv2']['Power'] | float)}}", "unit_of_meas": "W", "icon": "mdi:gauge", "dev": {"name": "Module PV2", "sa": "Module PV2", "via_device": "inverter_123", "ids": ["input_pv2_123"]}, "o": {"name": "proxy", "sw": "unknown"}}) assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv2_123", "val_tpl": "{{ (value_json['pv2']['Power'] | float)}}", "unit_of_meas": "W", "ic": "mdi:gauge", "dev": {"name": "Module PV2", "sa": "Module PV2", "via_device": "inverter_123", "ids": ["input_pv2_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1 tests +=1
elif id == 'signal_123': elif id == 'signal_123':
assert comp == 'sensor' assert comp == 'sensor'
assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "icon": "mdi:wifi", "dev": {"name": "Controller", "sa": "Controller", "ids": ["controller_123"]}, "o": {"name": "proxy", "sw": "unknown"}}) assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "ic": "mdi:wifi", "dev": {"name": "Controller", "sa": "Controller", "ids": ["controller_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1 tests +=1
assert tests==5 assert tests==5
@@ -98,21 +98,21 @@ def test_build_ha_conf2(ContrDataSeq, InvDataSeq):
elif id == 'daily_gen_123': elif id == 'daily_gen_123':
assert comp == 'sensor' assert comp == 'sensor'
assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "icon": "mdi:solar-power-variant", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "mdl": "TSOL-MS600", "mf": "TSUN", "sw": "V5.0.11", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}}) assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "ic": "mdi:solar-power-variant", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "mdl": "TSOL-MS600", "mf": "TSUN", "sw": "V5.0.11", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1 tests +=1
elif id == 'power_pv1_123': elif id == 'power_pv1_123':
assert comp == 'sensor' assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv1_123", "val_tpl": "{{ (value_json['pv1']['Power'] | float)}}", "unit_of_meas": "W", "icon": "mdi:gauge", "dev": {"name": "Module PV1", "sa": "Module PV1", "via_device": "inverter_123", "ids": ["input_pv1_123"]}, "o": {"name": "proxy", "sw": "unknown"}}) assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv1_123", "val_tpl": "{{ (value_json['pv1']['Power'] | float)}}", "unit_of_meas": "W", "ic": "mdi:gauge", "dev": {"name": "Module PV1", "sa": "Module PV1", "via_device": "inverter_123", "ids": ["input_pv1_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1 tests +=1
elif id == 'power_pv2_123': elif id == 'power_pv2_123':
assert comp == 'sensor' assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv2_123", "val_tpl": "{{ (value_json['pv2']['Power'] | float)}}", "unit_of_meas": "W", "icon": "mdi:gauge", "dev": {"name": "Module PV2", "sa": "Module PV2", "via_device": "inverter_123", "ids": ["input_pv2_123"]}, "o": {"name": "proxy", "sw": "unknown"}}) assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv2_123", "val_tpl": "{{ (value_json['pv2']['Power'] | float)}}", "unit_of_meas": "W", "ic": "mdi:gauge", "dev": {"name": "Module PV2", "sa": "Module PV2", "via_device": "inverter_123", "ids": ["input_pv2_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1 tests +=1
elif id == 'signal_123': elif id == 'signal_123':
assert comp == 'sensor' assert comp == 'sensor'
assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "icon": "mdi:wifi", "dev": {"name": "Controller", "sa": "Controller", "mdl": "RSW-1-10001", "mf": "Raymon", "sw": "RSW_400_V1.00.06", "ids": ["controller_123"]}, "o": {"name": "proxy", "sw": "unknown"}}) assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "ic": "mdi:wifi", "dev": {"name": "Controller", "sa": "Controller", "mdl": "RSW-1-10001", "mf": "Raymon", "sw": "RSW_400_V1.00.06", "ids": ["controller_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1 tests +=1
assert tests==5 assert tests==5