Compare commits

...

37 Commits

Author SHA1 Message Date
Stefan Allius
2a5266dfd6 catch socket.gaierror exception 2025-07-15 20:52:24 +02:00
Stefan Allius
dcc3fe67a5 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into s-allius/issue472 2025-07-15 20:32:30 +02:00
renovate[bot]
0b05f6cd9a Update dependency coverage to v7.9.2 (#470)
* Update dependency coverage to v7.9.2

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-07-15 20:23:01 +02:00
Stefan Allius
c12b224414 Update dependency coverage to v7.9.2 2025-07-15 20:21:05 +02:00
Stefan Allius
c5675fea6e Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into renovate/coverage-7.x 2025-07-15 20:16:20 +02:00
renovate[bot]
0e35a506e0 Update ghcr.io/hassio-addons/base Docker tag to v18.0.3 (#469)
* update python and pip to compatible versions

* Update ghcr.io/hassio-addons/base Docker tag to v18.0.3

* add-on: remove armhf and armv7 support

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-07-15 20:13:55 +02:00
renovate[bot]
8625fd06ad Update dependency coverage to v7.9.2 2025-07-03 13:44:15 +00:00
renovate[bot]
eba2c3e452 Update ghcr.io/hassio-addons/base Docker tag to v18 (#468)
* Update ghcr.io/hassio-addons/base Docker tag to v18

* improve docker annotations

* update python and pip to compatible versions

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-06-29 21:47:37 +02:00
renovate[bot]
118fab8b6c Update dependency python-dotenv to v1.1.1 (#467)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-24 18:24:28 +02:00
Stefan Allius
d25f142e10 add links to add-on urls (#466)
* add links to add-on urls

* Add translations

* set app.testing to get exceptions during test

* improve unit-tests for the web-UI

* update changelog

* extend languages tests

* workaround for github runner
2025-06-22 21:39:31 +02:00
Stefan Allius
eb59e19c0a Fix SonarQube errors and warnings (#464)
* replace constructor call with a literal

  https://sonarcloud.io/project/issues?open=AZeMhhlEyR1Wrs09sNyb&id=s-allius_tsun-gen3-proxy

* re-raise cancel error after cleanup

https://sonarcloud.io/project/issues?open=AZeMhhltyR1Wrs09sNyc&id=s-allius_tsun-gen3-proxy

* remove duplicated line

* change send_modbus_cmd into a synchronous function

* make send_start_cmd synchronous

https://sonarcloud.io/project/issues?open=AZeMhhhyyR1Wrs09sNya&id=s-allius_tsun-gen3-proxy

* make more functions synchronous

* update changelog
2025-06-21 12:18:48 +02:00
Stefan Allius
bacebbd649 S allius/issue456 (#462)
* - remove unused 32-bit architectures from the prebuilt multiarch containers

* update po file
2025-06-21 10:41:47 +02:00
renovate[bot]
ebbb675e63 Update dependency flake8 to v7.3.0 (#459)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-21 10:27:45 +02:00
Stefan Allius
04fd9ed7f6 S allius/issue460 (#461)
* - Improve Makefile

* - Babel: don't build a new po file if only the pot creation-date was changed
2025-06-21 10:26:17 +02:00
renovate[bot]
f3c22c9853 Update dependency pytest to v8.4.1 (#458)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-20 10:51:02 +02:00
renovate[bot]
460db31fa6 Update python Docker tag to v3.13.5 (#453)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-20 10:46:35 +02:00
renovate[bot]
144c9080cb Update dependency coverage to v7.9.1 (#454)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-15 22:43:00 +02:00
renovate[bot]
dc1a28260e Update dependency coverage to v7.9.0 (#450)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-12 22:16:56 +02:00
renovate[bot]
e59529adc0 Update dependency pytest-cov to v6.2.1 (#449)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-12 22:13:41 +02:00
renovate[bot]
8d93b2a636 Update python Docker tag to v3.13.4 (#446)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-12 22:12:01 +02:00
renovate[bot]
01e9e70957 Update dependency pytest to v8.4.0 (#444)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-12 22:11:37 +02:00
renovate[bot]
1721bbebe2 Update dependency pytest-asyncio to v1 (#433)
* Update dependency pytest-asyncio to v1

* set version to 0.15.0

* Update dependency pytest-asyncio to v1

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-05-31 23:55:50 +02:00
Stefan Allius
41168fbb4d S allius/issue438 (#442)
* Update change log (#436)

* S allius/issue427 (#434)

* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request

* improve test coverage

* S allius/issue427 (#435)

* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request

* improve test coverage

* improve test case

* version 0.14.0

* handle missing MQTT addon

- we have to check whether the supervisor API and an
MQTT broker add-on are available. If not, we assume
the user has an external MQTT broker

* handle missing MQTT addon

* run also on releases/* branch

* avoid printing the MQTT config incl. password

* revise the log outputs

* update version 0.14.1

* new version 0.14.1
2025-05-31 23:30:16 +02:00
Stefan Allius
25ba6ef8f3 version 0.14.0 (#441) 2025-05-31 23:27:49 +02:00
Stefan Allius
2a40bd7b71 S allius/issue427 (#435)
* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request

* improve test coverage

* improve test case
2025-05-26 23:42:13 +02:00
Stefan Allius
95182d2196 S allius/issue427 (#434)
* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request

* improve test coverage
2025-05-26 23:16:33 +02:00
Stefan Allius
f1da544c88 S allius/update python (#431)
* S allius/update python (#430)

* add-on: bump python to version 3.12.10-r1 (#429)
2025-05-25 03:52:59 +02:00
Stefan Allius
7365980c2f S allius/update python (#430)
* add-on: bump python to version 3.12.10-r1 (#429)
2025-05-25 03:21:21 +02:00
Stefan Allius
11e3226460 add-on: bump python to version 3.12.10-r1 (#429) 2025-05-25 02:27:22 +02:00
Stefan Allius
f69f9c6d63 mock the aiomqtt library and increase coverage (#428)
* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request
2025-05-25 01:34:22 +02:00
Stefan Allius
321c66838d set no of pv modules for MS800 GEN3PLUS inverters (#424)
* set no of pv modules for MS800 GEN3PLUS inverters

* fix unit test

* increase test coverage

* change the PV module handling

- by default we now set the number of modules to
  two, so with the first data from the inverter
  we only register two modules. Once we determine
  the inverter model, the number can increase to
  four and more PV modules will be registered.

  With the previous default of 4, we always
  registered 4 modules and couldn't reduce the
  number of areas when we detected that the
  inverter only supports two PV modules.
2025-05-24 23:12:55 +02:00
renovate[bot]
0a8e708735 Update dependency coverage to v7.8.2 (#426)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-05-23 22:53:24 +02:00
Stefan Allius
bd88647f0b fix the paths to copy the config.example.toml file (#425) 2025-05-22 21:29:41 +02:00
renovate[bot]
bb2250bca1 Update dependency coverage to v7.8.1 (#419)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-05-21 20:51:16 +02:00
Stefan Allius
e25aa5f922 S allius/issue397 (#418)
* change icon for notes
2025-05-20 23:40:26 +02:00
Stefan Allius
46945d55e1 add dcu_power MQTT topic (#416)
* add dcu_power MQTT topic

* add DCU_COMMAND counter

* test invalid dcu_power values

* handle and test DCU Command responses

* test dcu commands from the TSUN cloud

* cleanup MQTT topic handling

* update changelog

* test MQTT error and exception handling

* increase test coverage

* test dispatcher exceptions

* fix full_topic definition in dispatch test
2025-05-20 19:54:24 +02:00
Stefan Allius
c1bdec0844 S allius/issue396 (#413)
* improve translation of delete modal
2025-05-13 22:53:37 +02:00
40 changed files with 1048 additions and 261 deletions

View File

@@ -5,7 +5,7 @@ name: Python application
on:
push:
branches: [ "main", "dev-*", "*/issue*" ]
branches: [ "main", "dev-*", "*/issue*", "releases/*" ]
paths-ignore:
- '**.md' # Do no build on *.md changes
- '**.yml' # Do no build on *.yml changes
@@ -18,7 +18,7 @@ on:
- '**.dockerfile' # Do no build on *.dockerfile changes
- '**.sh' # Do no build on *.sh changes
pull_request:
branches: [ "main", "dev-*" ]
branches: [ "main", "dev-*", "releases/*" ]
permissions:
contents: read

View File

@@ -1 +1 @@
3.13.2
3.13.5

View File

@@ -7,6 +7,27 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [unreleased]
- catch socket.gaierror exception and log this with info level
- Update dependency coverage to v7.9.2
- add-on: bump base-image to version 18.0.3
- add-on: remove armhf and armv7 support
- add-on: add links to config and log-file to the web-UI
- fix some SonarQube warnings
- remove unused 32-bit architectures
- Babel don't build new po file if only the pot creation-date was changed
- Improve Makefile
- Update dependency pytest-asyncio to v1
## [0.14.1] - 2025-05-31
- handle missing MQTT addon [#438](https://github.com/s-allius/tsun-gen3-proxy/issues/438)
## [0.14.0] - 2025-05-29
- add-on: bump python to version 3.12.10-r1
- set no of pv modules for MS800 GEN3PLUS inverters
- fix the paths to copy the config.example.toml file during proxy start
- add MQTT topic `dcu_power` for setting output power on DCUs
- Update ghcr.io/hassio-addons/base Docker tag to v17.2.5
- fix a lot of pytest-asyncio problems in the unit tests
- Cleanup startup code for Quart and the Proxy

View File

@@ -1,27 +1,37 @@
.PHONY: build babel clean addon-dev addon-debug addon-rc addon-rel debug dev preview rc rel check-docker-compose install
.PHONY: help build babel clean addon-dev addon-debug addon-rc addon-rel debug dev preview rc rel check-docker-compose install
babel:
help: ## show help message
@awk 'BEGIN {FS = ":.*##"; printf "\nUsage:\n make \033[36m\033[0m\n"} /^[$$()% a-zA-Z0-9_-]+:.*?##/ { printf " \033[36m%-15s\033[0m %s\n", $$1, $$2 } /^##@/ { printf "\n\033[1m%s\033[0m\n", substr($$0, 5) } ' $(MAKEFILE_LIST)
babel: ## build language files
$(MAKE) -C app $@
build:
$(MAKE) -C ha_addons $@
clean:
clean: ## delete all built files
$(MAKE) -C app $@
$(MAKE) -C ha_addons $@
debug dev preview rc rel:
debug dev preview rc rel: ## build docker container in <dev|debg|rc|rel> version
$(MAKE) -C app babel
$(MAKE) -C app $@
addon-dev addon-debug addon-rc addon-rel:
addon-dev addon-debug addon-rc addon-rel: ## build HA add-on in <dev|debg|rc|rel> version
$(MAKE) -C app babel
$(MAKE) -C ha_addons $(patsubst addon-%,%,$@)
check-docker-compose:
check-docker-compose: ## check the docker-compose file
docker-compose config -q
install:
python3 -m pip install --upgrade pip
python3 -m pip install -r requirements.txt
python3 -m pip install -r requirements-test.txt
PY_VER := $(shell cat .python-version)
install: ## install requirements into the pyenv and switch to proper venv
@pyenv local $(PY_VER) || { pyenv install $(PY_VER) && pyenv local $(PY_VER) || exit 1; }
@pyenv exec pip install --upgrade pip
@pyenv exec pip install -r requirements.txt
@pyenv exec pip install -r requirements-test.txt
pyenv exec python --version
run: ## run proxy locally out of the actual venv
pyenv exec python app/src/server.py -c /app/src/cnf

View File

@@ -1 +1 @@
0.14.0
0.15.0

View File

@@ -55,7 +55,7 @@ $(BABEL_TRANSLATIONS)/%.pot : $(SRC)/.babel.cfg $(BABEL_INPUT)
$(BABEL_TRANSLATIONS)/%/LC_MESSAGES/messages.po : $(BABEL_TRANSLATIONS)/messages.pot
@mkdir -p $(@D)
@pybabel update --init-missing -i $< -d $(BABEL_TRANSLATIONS) -l $*
@pybabel update --init-missing --ignore-pot-creation-date -i $< -d $(BABEL_TRANSLATIONS) -l $*
$(BABEL_TRANSLATIONS)/%/LC_MESSAGES/messages.mo : $(BABEL_TRANSLATIONS)/%/LC_MESSAGES/messages.po
@pybabel compile -d $(BABEL_TRANSLATIONS) -l $*

View File

@@ -29,17 +29,17 @@ target "_common" {
"type =sbom,generator=docker/scout-sbom-indexer:latest"
]
annotations = [
"index:org.opencontainers.image.title=TSUN Gen3 Proxy",
"index:org.opencontainers.image.authors=Stefan Allius",
"index:org.opencontainers.image.created=${BUILD_DATE}",
"index:org.opencontainers.image.version=${VERSION}",
"index:org.opencontainers.image.revision=${BRANCH}",
"index:org.opencontainers.image.description=${DESCRIPTION}",
"index,manifest-descriptor:org.opencontainers.image.title=TSUN-Proxy",
"index,manifest-descriptor:org.opencontainers.image.authors=Stefan Allius",
"index,manifest-descriptor:org.opencontainers.image.created=${BUILD_DATE}",
"index,manifest-descriptor:org.opencontainers.image.version=${VERSION}",
"index,manifest-descriptor:org.opencontainers.image.revision=${BRANCH}",
"index,manifest-descriptor:org.opencontainers.image.description=${DESCRIPTION}",
"index:org.opencontainers.image.licenses=BSD-3-Clause",
"index:org.opencontainers.image.source=https://github.com/s-allius/tsun-gen3-proxy"
]
labels = {
"org.opencontainers.image.title" = "TSUN Gen3 Proxy"
"org.opencontainers.image.title" = "TSUN-Proxy"
"org.opencontainers.image.authors" = "Stefan Allius"
"org.opencontainers.image.created" = "${BUILD_DATE}"
"org.opencontainers.image.version" = "${VERSION}"
@@ -53,7 +53,7 @@ target "_common" {
]
no-cache = false
platforms = ["linux/amd64", "linux/arm64", "linux/arm/v7"]
platforms = ["linux/amd64", "linux/arm64"]
}
target "_debug" {

View File

@@ -1,8 +1,8 @@
flake8==7.2.0
pytest==8.3.5
pytest-asyncio==0.26.0
pytest-cov==6.1.1
python-dotenv==1.1.0
flake8==7.3.0
pytest==8.4.1
pytest-asyncio==1.0.0
pytest-cov==6.2.1
python-dotenv==1.1.1
mock==5.2.0
coverage==7.8.0
coverage==7.9.2
jinja2-cli==0.8.2

View File

@@ -162,7 +162,8 @@ class Config():
)
@classmethod
def init(cls, def_reader: ConfigIfc, log_path: str = '') -> None | str:
def init(cls, def_reader: ConfigIfc, log_path: str = '',
cnf_path: str = 'config') -> None | str:
'''Initialise the Proxy-Config
Copy the internal default config file into the config directory
@@ -173,12 +174,13 @@ and initialise the Config with the default configuration '''
try:
# make the default config transparaent by copying it
# in the config.example file
logging.debug('Copy Default Config to config.example.toml')
logging.info(
f'Copy Default Config to {cnf_path}config.example.toml')
shutil.copy2("default_config.toml",
"config/config.example.toml")
except Exception:
pass
shutil.copy2("cnf/default_config.toml",
cnf_path + "config.example.toml")
except Exception as e:
logging.error(e)
# read example config file as default configuration
try:
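
The hunk above lets Config.init() take the target directory as cnf_path and logs copy failures instead of silently ignoring them. A minimal sketch of the resulting copy step, with a made-up helper name; note that cnf_path is joined by plain string concatenation, so callers such as build_config() are expected to pass a trailing slash (the sample log further down shows config_path: ./config/).

# Sketch only, not the project's code: the copy step after this change.
import logging
import shutil

def copy_example_config(cnf_path: str = './config/') -> None:
    try:
        target = cnf_path + "config.example.toml"   # plain concatenation, no os.path.join
        logging.info(f'Copy Default Config to {target}')
        shutil.copy2("cnf/default_config.toml", target)
    except Exception as e:                          # now logged instead of swallowed
        logging.error(e)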

View File

@@ -216,7 +216,7 @@ class InfosG3P(Infos):
self.set_db_def_value(Register.MANUFACTURER, 'TSUN')
self.set_db_def_value(Register.EQUIPMENT_MODEL, 'TSOL-MSxx00')
self.set_db_def_value(Register.CHIP_TYPE, 'IGEN TECH')
self.set_db_def_value(Register.NO_INPUTS, 4)
self.set_db_def_value(Register.NO_INPUTS, 2)
def __hide_topic(self, row: dict) -> bool:
if 'dep' in row:

50
app/src/gen3plus/solarman_v5.py Normal file → Executable file
View File

@@ -247,6 +247,7 @@ class SolarmanBase(Message):
class SolarmanV5(SolarmanBase):
AT_CMD = 1
MB_RTU_CMD = 2
DCU_CMD = 5
AT_CMD_RSP = 8
MB_CLIENT_DATA_UP = 30
'''Data up time in client mode'''
@@ -340,9 +341,9 @@ class SolarmanV5(SolarmanBase):
self.log_lvl.clear()
super().close()
async def send_start_cmd(self, snr: int, host: str,
forward: bool,
start_timeout=MB_CLIENT_DATA_UP):
def send_start_cmd(self, snr: int, host: str,
forward: bool,
start_timeout=MB_CLIENT_DATA_UP):
self.no_forwarding = True
self.establish_inv_emu = forward
self.snr = snr
@@ -532,6 +533,26 @@ class SolarmanV5(SolarmanBase):
except Exception:
self.ifc.tx_clear()
def send_dcu_cmd(self, pdu: bytearray):
if self.sensor_list != 0x3026:
logger.debug(f'[{self.node_id}] DCU CMD not allowed,'
f' for sensor: {self.sensor_list:#04x}')
return
if self.state != State.up:
logger.warning(f'[{self.node_id}] ignore DCU CMD,'
' cause the state is not UP anymore')
return
self.inverter.forward_dcu_cmd_resp = False
self._build_header(0x4510)
self.ifc.tx_add(struct.pack('<BHLLL', self.DCU_CMD,
self.sensor_list, 0, 0, 0))
self.ifc.tx_add(pdu)
self._finish_send_msg()
self.ifc.tx_log(logging.INFO, f'Send DCU CMD :{self.addr}:')
self.ifc.tx_flush()
def __forward_msg(self):
self.forward(self.ifc.rx_peek(), self.header_len+self.data_len+2)
@@ -541,12 +562,17 @@ class SolarmanV5(SolarmanBase):
rated = db.get_db_value(Register.RATED_POWER, 0)
model = None
if max_pow == 2000:
db.set_db_def_value(Register.NO_INPUTS, 4)
if rated == 800 or rated == 600:
model = f'TSOL-MS{max_pow}({rated})'
else:
model = f'TSOL-MS{max_pow}'
elif max_pow == 1800 or max_pow == 1600:
db.set_db_def_value(Register.NO_INPUTS, 4)
model = f'TSOL-MS{max_pow}'
elif max_pow <= 800:
model = f'TSOL-MS{max_pow}'
if model:
logger.info(f'Model: {model}')
self.db.set_db_def_value(Register.EQUIPMENT_MODEL, model)
@@ -647,6 +673,10 @@ class SolarmanV5(SolarmanBase):
self.inc_counter('AT_Command')
self.inverter.forward_at_cmd_resp = True
if ftype == self.DCU_CMD:
self.inc_counter('DCU_Command')
self.inverter.forward_dcu_cmd_resp = True
elif ftype == self.MB_RTU_CMD:
rstream = self.ifc.remote.stream
if rstream.mb.recv_req(data[15:],
@@ -670,6 +700,10 @@ class SolarmanV5(SolarmanBase):
if self.inverter.forward_at_cmd_resp:
return logging.INFO
return logging.DEBUG
elif ftype == self.DCU_CMD:
if self.inverter.forward_dcu_cmd_resp:
return logging.INFO
return logging.DEBUG
elif ftype == self.MB_RTU_CMD \
and self.server_side:
return self.mb.last_log_lvl
@@ -689,6 +723,16 @@ class SolarmanV5(SolarmanBase):
logger.info(f'{key}: {data_json}')
self.publish_mqtt(f'{Proxy.entity_prfx}{node_id}{key}', data_json) # noqa: E501
return
elif ftype == self.DCU_CMD:
if not self.inverter.forward_dcu_cmd_resp:
data_json = '+ok'
node_id = self.node_id
key = 'dcu_resp'
logger.info(f'{key}: {data_json}')
self.publish_mqtt(f'{Proxy.entity_prfx}{node_id}{key}', data_json) # noqa: E501
return
elif ftype == self.MB_RTU_CMD:
self.__modbus_command_rsp(data)
return
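
Together with the InfosG3P change above (NO_INPUTS default lowered from 4 to 2), the model detection in this file now raises the input count to four only for the 1600/1800/2000 W models; the same hunk also introduces the DCU_CMD frame type (5) and send_dcu_cmd(), whose MQTT side is sketched after the mqtt.py hunk below. A condensed sketch of the model logic, with names taken from the diff but pulled out of the class:

# Sketch of the revised model/input detection (simplified, not the class method):
def detect_model(max_pow: int, rated: int) -> tuple[str | None, int]:
    no_inputs = 2                        # new default, see InfosG3P above
    model = None
    if max_pow == 2000:
        no_inputs = 4
        if rated in (600, 800):
            model = f'TSOL-MS{max_pow}({rated})'
        else:
            model = f'TSOL-MS{max_pow}'
    elif max_pow in (1600, 1800):
        no_inputs = 4
        model = f'TSOL-MS{max_pow}'
    elif max_pow <= 800:
        model = f'TSOL-MS{max_pow}'
    return model, no_inputs              # e.g. (None, 2) for the 900 W test fixture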

View File

@@ -44,6 +44,7 @@ class Register(Enum):
MODBUS_COMMAND = 60
AT_COMMAND_BLOCKED = 61
CLOUD_CONN_CNT = 62
DCU_COMMAND = 63
OUTPUT_POWER = 83
RATED_POWER = 84
INVERTER_TEMP = 85
@@ -625,6 +626,7 @@ class Infos:
Register.INVALID_MSG_FMT: {'name': ['proxy', 'Invalid_Msg_Format'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'inv_msg_fmt_', 'fmt': FMT_INT, 'name': 'Invalid Message Format', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
Register.AT_COMMAND: {'name': ['proxy', 'AT_Command'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'at_cmd_', 'fmt': FMT_INT, 'name': 'AT Command', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
Register.AT_COMMAND_BLOCKED: {'name': ['proxy', 'AT_Command_Blocked'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'at_cmd_blocked_', 'fmt': FMT_INT, 'name': 'AT Command Blocked', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
Register.DCU_COMMAND: {'name': ['proxy', 'DCU_Command'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'dcu_cmd_', 'fmt': FMT_INT, 'name': 'DCU Command', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
Register.MODBUS_COMMAND: {'name': ['proxy', 'Modbus_Command'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'modbus_cmd_', 'fmt': FMT_INT, 'name': 'Modbus Command', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
# 0xffffff03: {'name':['proxy', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V', 'ha':{'dev':'proxy', 'dev_cla': 'voltage', 'stat_cla': 'measurement', 'id':'proxy_volt_', 'fmt':FMT_FLOAT,'name': 'Grid Voltage'}}, # noqa: E501

View File

@@ -4,6 +4,7 @@ import logging
import traceback
import json
import gc
import socket
from aiomqtt import MqttCodeError
from asyncio import StreamReader, StreamWriter
from ipaddress import ip_address
@@ -138,7 +139,9 @@ class InverterBase(InverterIfc, Proxy):
f'Connected to {addr}')
asyncio.create_task(self.remote.ifc.client_loop(addr))
except (ConnectionRefusedError, TimeoutError) as error:
except (ConnectionRefusedError,
TimeoutError,
socket.gaierror) as error:
logging.info(f'{error}')
except Exception:
Infos.inc_counter('SW_Exception')
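
This widens the exception handler so a failed DNS lookup of the cloud host is logged at info level instead of being counted as SW_Exception. socket.gaierror is what the asyncio connect call raises when name resolution fails; a minimal illustration (the host matches the config_def_conn test fixture below):

# Illustration only: a failing name lookup surfaces as socket.gaierror.
import asyncio
import logging
import socket

async def try_connect(host: str, port: int) -> None:
    try:
        _reader, writer = await asyncio.open_connection(host, port)
        writer.close()
        await writer.wait_closed()
    except (ConnectionRefusedError, TimeoutError, socket.gaierror) as error:
        logging.info(f'{error}')   # e.g. "[Errno -2] Name or service not known"

# asyncio.run(try_connect('unknown_url', 10000))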

View File

@@ -193,7 +193,7 @@ class Message(ProtocolIfc):
return
self.mb.build_msg(dev_id, func, addr, val, log_lvl)
async def send_modbus_cmd(self, func, addr, val, log_lvl) -> None:
def send_modbus_cmd(self, func, addr, val, log_lvl) -> None:
self._send_modbus_cmd(Modbus.INV_ADDR, func, addr, val, log_lvl)
def _send_modbus_scan(self):

View File

@@ -66,7 +66,7 @@ class ModbusTcp():
try:
async with ModbusConn(host, port) as inverter:
stream = inverter.local.stream
await stream.send_start_cmd(snr, host, forward)
stream.send_start_cmd(snr, host, forward)
await stream.ifc.loop()
logger.info(f'[{stream.node_id}:{stream.conn_no}] '
f'Connection closed - Shutdown: '

140
app/src/mqtt.py Normal file → Executable file
View File

@@ -2,6 +2,8 @@ import asyncio
import logging
import aiomqtt
import traceback
import struct
import inspect
from modbus import Modbus
from messages import Message
@@ -27,14 +29,27 @@ class Mqtt(metaclass=Singleton):
loop = asyncio.get_event_loop()
self.task = loop.create_task(self.__loop())
self.ha_restarts = 0
self.topic_defs = [
{'prefix': 'auto_conf_prefix', 'topic': '/status',
'fnc': self._ha_status, 'args': []},
{'prefix': 'entity_prefix', 'topic': '/+/rated_load',
'fnc': self._modbus_cmd,
'args': [Modbus.WRITE_SINGLE_REG, 1, 0x2008]},
{'prefix': 'entity_prefix', 'topic': '/+/out_coeff',
'fnc': self._out_coeff, 'args': []},
{'prefix': 'entity_prefix', 'topic': '/+/dcu_power',
'fnc': self._dcu_cmd, 'args': []},
{'prefix': 'entity_prefix', 'topic': '/+/modbus_read_regs',
'fnc': self._modbus_cmd, 'args': [Modbus.READ_REGS, 2]},
{'prefix': 'entity_prefix', 'topic': '/+/modbus_read_inputs',
'fnc': self._modbus_cmd, 'args': [Modbus.READ_INPUTS, 2]},
{'prefix': 'entity_prefix', 'topic': '/+/at_cmd',
'fnc': self._at_cmd, 'args': []},
]
ha = Config.get('ha')
self.ha_status_topic = f"{ha['auto_conf_prefix']}/status"
self.mb_rated_topic = f"{ha['entity_prefix']}/+/rated_load"
self.mb_out_coeff_topic = f"{ha['entity_prefix']}/+/out_coeff"
self.mb_reads_topic = f"{ha['entity_prefix']}/+/modbus_read_regs"
self.mb_inputs_topic = f"{ha['entity_prefix']}/+/modbus_read_inputs"
self.mb_at_cmd_topic = f"{ha['entity_prefix']}/+/at_cmd"
for entry in self.topic_defs:
entry['full_topic'] = f"{ha[entry['prefix']]}{entry['topic']}"
@property
def ha_restarts(self):
@@ -75,19 +90,7 @@ class Mqtt(metaclass=Singleton):
try:
async with self.__client:
logger_mqtt.info('MQTT broker connection established')
self.ctime = datetime.now()
self.published = 0
self.received = 0
if self.__cb_mqtt_is_up:
await self.__cb_mqtt_is_up()
await self.__client.subscribe(self.ha_status_topic)
await self.__client.subscribe(self.mb_rated_topic)
await self.__client.subscribe(self.mb_out_coeff_topic)
await self.__client.subscribe(self.mb_reads_topic)
await self.__client.subscribe(self.mb_inputs_topic)
await self.__client.subscribe(self.mb_at_cmd_topic)
await self._init_new_conn()
async for message in self.__client.messages:
await self.dispatch_msg(message)
@@ -109,7 +112,7 @@ class Mqtt(metaclass=Singleton):
except asyncio.CancelledError:
logger_mqtt.debug("MQTT task cancelled")
self.__client = None
return
raise
except Exception:
# self.inc_counter('SW_Exception') # fixme
self.ctime = None
@@ -117,47 +120,51 @@ class Mqtt(metaclass=Singleton):
f"Exception:\n"
f"{traceback.format_exc()}")
async def _init_new_conn(self):
self.ctime = datetime.now()
self.published = 0
self.received = 0
if self.__cb_mqtt_is_up:
await self.__cb_mqtt_is_up()
for entry in self.topic_defs:
await self.__client.subscribe(entry['full_topic'])
async def dispatch_msg(self, message):
self.received += 1
if message.topic.matches(self.ha_status_topic):
status = message.payload.decode("UTF-8")
logger_mqtt.info('Home-Assistant Status:'
f' {status}')
if status == 'online':
self.ha_restarts += 1
for entry in self.topic_defs:
if message.topic.matches(entry['full_topic']) \
and 'fnc' in entry:
fnc = entry['fnc']
if inspect.iscoroutinefunction(fnc):
await entry['fnc'](message, *entry['args'])
elif callable(fnc):
entry['fnc'](message, *entry['args'])
async def _ha_status(self, message):
status = message.payload.decode("UTF-8")
logger_mqtt.info('Home-Assistant Status:'
f' {status}')
if status == 'online':
self.ha_restarts += 1
if self.__cb_mqtt_is_up:
await self.__cb_mqtt_is_up()
if message.topic.matches(self.mb_rated_topic):
await self.modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
1, 0x2008)
if message.topic.matches(self.mb_out_coeff_topic):
payload = message.payload.decode("UTF-8")
try:
val = round(float(payload) * 1024/100)
if val < 0 or val > 1024:
logger_mqtt.error('out_coeff: value must be in'
'the range 0..100,'
f' got: {payload}')
else:
await self.modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
0, 0x202c, val)
except Exception:
pass
if message.topic.matches(self.mb_reads_topic):
await self.modbus_cmd(message,
Modbus.READ_REGS, 2)
if message.topic.matches(self.mb_inputs_topic):
await self.modbus_cmd(message,
Modbus.READ_INPUTS, 2)
if message.topic.matches(self.mb_at_cmd_topic):
await self.at_cmd(message)
def _out_coeff(self, message):
payload = message.payload.decode("UTF-8")
try:
val = round(float(payload) * 1024/100)
if val < 0 or val > 1024:
logger_mqtt.error('out_coeff: value must be in'
'the range 0..100,'
f' got: {payload}')
else:
self._modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
0, 0x202c, val)
except Exception:
pass
def each_inverter(self, message, func_name: str):
topic = str(message.topic)
@@ -175,7 +182,7 @@ class Mqtt(metaclass=Singleton):
else:
logger_mqtt.warning(f'Node_id: {node_id} not found')
async def modbus_cmd(self, message, func, params=0, addr=0, val=0):
def _modbus_cmd(self, message, func, params=0, addr=0, val=0):
payload = message.payload.decode("UTF-8")
for fnc in self.each_inverter(message, "send_modbus_cmd"):
res = payload.split(',')
@@ -188,9 +195,24 @@ class Mqtt(metaclass=Singleton):
elif params == 2:
addr = int(res[0], base=16)
val = int(res[1]) # lenght
await fnc(func, addr, val, logging.INFO)
fnc(func, addr, val, logging.INFO)
async def at_cmd(self, message):
async def _at_cmd(self, message):
payload = message.payload.decode("UTF-8")
for fnc in self.each_inverter(message, "send_at_cmd"):
await fnc(payload)
def _dcu_cmd(self, message):
payload = message.payload.decode("UTF-8")
try:
val = round(float(payload) * 10)
if val < 1000 or val > 8000:
logger_mqtt.error('dcu_power: value must be in'
'the range 100..800,'
f' got: {payload}')
else:
pdu = struct.pack('>BBBBBBH', 1, 1, 6, 1, 0, 1, val)
for fnc in self.each_inverter(message, "send_dcu_cmd"):
fnc(pdu)
except Exception:
pass
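
The reworked mqtt.py replaces the per-topic subscriptions with the topic_defs dispatch table and adds the dcu_power handler shown above. A worked example of the new scaling and frame packing, using the values from test_dcu_dispatch and test_mqtt_dispatch further down: dcu_power payloads are watts, scaled by 10 and accepted between 100.0 and 800.0 W (1000..8000 after scaling); out_coeff maps 0..100 % to 0..1024.

# Worked example for the dcu_power and out_coeff handlers (values from the tests below):
import struct

val = round(float(b'100.0'.decode()) * 10)            # -> 1000, must be within 1000..8000
pdu = struct.pack('>BBBBBBH', 1, 1, 6, 1, 0, 1, val)  # -> b'\x01\x01\x06\x01\x00\x01\x03\xe8'

coeff = round(float(b'50'.decode()) * 1024 / 100)     # -> 512, as asserted in test_mqtt_dispatch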

View File

@@ -12,7 +12,7 @@ class Schedule:
count = 0
@classmethod
def start(cls) -> None:
def start(cls) -> None: # pragma: no cover
'''Start the scheduler and schedule the tasks (cron jobs)'''
logging.debug("Scheduler init")
cls.mqtt = Mqtt(None)
@@ -20,7 +20,7 @@ class Schedule:
crontab('0 0 * * *', func=cls.atmidnight, start=True)
@classmethod
async def atmidnight(cls) -> None:
async def atmidnight(cls) -> None: # pragma: no cover
'''Clear daily counters at midnight'''
logging.info("Clear daily counters at midnight")

View File

@@ -60,7 +60,16 @@ class Server():
@app.context_processor
def utility_processor():
return dict(version=self.version)
var = {'version': self.version,
'slug': os.getenv("SLUG"),
'hostname': os.getenv("HOSTNAME"),
}
if var['slug']:
var['hassio'] = True
slug_len = len(var['slug'])
var['addonname'] = var['slug'] + '_' + \
var['hostname'][slug_len+1:]
return var
def parse_args(self, arg_list: list[str] | None):
parser = argparse.ArgumentParser()
@@ -127,7 +136,8 @@ class Server():
def build_config(self):
# read config file
Config.init(ConfigReadToml(self.src_dir + "cnf/default_config.toml"),
log_path=self.log_path)
log_path=self.log_path,
cnf_path=self.config_path)
ConfigReadEnv()
ConfigReadJson(self.config_path + "config.json")
ConfigReadToml(self.config_path + "config.toml")
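
The utility_processor() change exposes the Home Assistant add-on name to the templates, so the new menu entries can link to /hassio/addon/{{addonname}}/config and /logs. A sketch of the derivation; the example SLUG and HOSTNAME values are made up, only the slicing comes from the diff:

# Sketch of the add-on name derivation (example env values are hypothetical):
import os

slug = os.getenv("SLUG")            # e.g. 'a0d7b954'             (hypothetical)
hostname = os.getenv("HOSTNAME")    # e.g. 'a0d7b954-tsun-proxy'  (hypothetical)
if slug:
    hassio = True
    addonname = slug + '_' + hostname[len(slug) + 1:]   # -> 'a0d7b954_tsun-proxy'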

View File

@@ -29,9 +29,9 @@ def get_tz():
@web.context_processor
def utility_processor():
return dict(lang=babel_get_locale(),
lang_str=LANGUAGES.get(str(babel_get_locale()), "English"),
languages=LANGUAGES)
return {'lang': babel_get_locale(),
'lang_str': LANGUAGES.get(str(babel_get_locale()), "English"),
'languages': LANGUAGES}
@web.route('/language/<language>')

View File

@@ -22,3 +22,6 @@ class LogHandler(Handler, metaclass=Singleton):
def get_buffer(self, elms=0) -> list:
return list(self.buffer)[-elms:]
def clear(self):
self.buffer.clear()

View File

@@ -7,3 +7,4 @@
.fa-rotate-right:before{content:"\f01e"}
.fa-cloud-arrow-down-alt:before{content:"\f381"}
.fa-cloud-arrow-up-alt:before{content:"\f382"}
.fa-gear:before{content:"\f013"}

View File

@@ -57,8 +57,13 @@
<button href="#" class="w3-bar-item w3-button w3-padding-16 w3-hide-large w3-dark-grey w3-hover-black" onclick="w3_close()" title="close menu"><i class="fa fa-remove fa-fw"></i>  Close Menu</button>
<a href="{{ url_for('.index')}}" class="w3-bar-item w3-button w3-padding {% block menu1_class %}{% endblock %}"><i class="fa fa-network-wired fa-fw"></i>  {{_('Connections')}}</a>
<a href="{{ url_for('.mqtt')}}" class="w3-bar-item w3-button w3-padding {% block menu2_class %}{% endblock %}"><i class="fa fa-database fa-fw"></i>  MQTT</a>
<a href="{{ url_for('.notes')}}" class="w3-bar-item w3-button w3-padding {% block menu3_class %}{% endblock %}"><i class="fa fa-exclamation-triangle fa-fw"></i>  {{_('Important Messages')}}</a>
<a href="{{ url_for('.notes')}}" class="w3-bar-item w3-button w3-padding {% block menu3_class %}{% endblock %}"><i class="fa fa-info fa-fw"></i>  {{_('Important Messages')}}</a>
<a href="{{ url_for('.logging')}}" class="w3-bar-item w3-button w3-padding {% block menu4_class %}{% endblock %}"><i class="fa fa-file-export fa-fw"></i>  {{_('Log Files')}}</a>
{% if hassio is defined %}
<br>
<a href="/hassio/addon/{{addonname}}/config" target="_top" class="w3-bar-item w3-button w3-padding"><i class="fa fa-gear fa-fw"></i>  {{_('Add-on Config')}}</a>
<a href="/hassio/addon/{{addonname}}/logs" target="_top" class="w3-bar-item w3-button w3-padding"><i class="fa fa-file fa-fw"></i>  {{_('Add-on Log')}}</a>
{% endif %}
</div>
</nav>

View File

@@ -7,9 +7,9 @@
<div id="id01" class="w3-modal">
<div class="w3-modal-content" style="width:600px">
<div class="w3-container w3-padding-24">
<h2>{{_("Do you really want to delete the log file")}}:<br><b><span id="id03"></span></b> ?</h2>
<h2>{{_('Do you really want to delete the log file: <br>%(file)s ?', file='<b><span id="id03"></span></b>')}}</h2>
<div class="w3-bar">
<button id="id02" class="w3-button w3-red" onclick="deleteFile(); document.getElementById('id01').style.display='none'">{{_('Delete File</button')}}>
<button id="id02" class="w3-button w3-red" onclick="deleteFile(); document.getElementById('id01').style.display='none'">{{_('Delete File')}}</button>
<button class="w3-button w3-grey w3-right" onclick="document.getElementById('id01').style.display='none'">{{_('Abort')}}</button>
</div>
</div>

View File

@@ -2,7 +2,7 @@
{% block title %}{{_("TSUN Proxy - Important Messages")}}{% endblock title %}
{% block menu3_class %}w3-blue{% endblock %}
{% block headline %}<i class="fa fa-exclamation-triangle fa-fw"></i>  {{_('Important Messages')}}{% endblock headline %}
{% block headline %}<i class="fa fa-info fa-fw"></i>  {{_('Important Messages')}}{% endblock headline %}
{% block content %}
<div id="notes-list"></div>
{% endblock content%}

View File

@@ -1,19 +1,19 @@
2025-04-30 00:01:23 INFO | root | Server "proxy - unknown" will be started
2025-04-30 00:01:23 INFO | root | current dir: /Users/sallius/tsun/tsun-gen3-proxy
2025-04-30 00:01:23 INFO | root | config_path: ./config/
2025-04-30 00:01:23 INFO | root | json_config: None
2025-04-30 00:01:23 INFO | root | toml_config: None
2025-04-30 00:01:23 INFO | root | trans_path: ../translations/
2025-04-30 00:01:23 INFO | root | rel_urls: False
2025-04-30 00:01:23 INFO | root | log_path: ./log/
2025-04-30 00:01:23 INFO | root | log_backups: unlimited
2025-04-30 00:01:23 INFO | root | LOG_LVL : None
2025-04-30 00:01:23 INFO | root | ******
2025-04-30 00:01:23 INFO | root | Read from /Users/sallius/tsun/tsun-gen3-proxy/app/src/cnf/default_config.toml => ok
2025-04-30 00:01:23 INFO | root | Read from environment => ok
2025-04-30 00:01:23 INFO | root | Read from ./config/config.json => n/a
2025-04-30 00:01:23 INFO | root | Read from ./config/config.toml => n/a
2025-04-30 00:01:23 INFO | root | ******
2025-04-30 00:01:23 INFO | root | listen on port: 5005 for inverters
2025-04-30 00:01:23 INFO | root | listen on port: 10000 for inverters
2025-04-30 00:01:23 INFO | root | Start Quart
2025-04-30 00:01:24 INFO | root | current dir: /Users/sallius/tsun/tsun-gen3-proxy
2025-04-30 00:01:25 INFO | root | config_path: ./config/
2025-04-30 00:01:26 INFO | root | json_config: None
2025-04-30 00:01:27 INFO | root | toml_config: None
2025-04-30 00:01:28 INFO | root | trans_path: ../translations/
2025-04-30 00:01:29 INFO | root | rel_urls: False
2025-04-30 00:01:30 INFO | root | log_path: ./log/
2025-04-30 00:01:31 INFO | root | log_backups: unlimited
2025-04-30 00:01:32 INFO | root | LOG_LVL : None
2025-04-30 00:01:33 INFO | root | ******
2025-04-30 00:01:34 INFO | root | Read from /Users/sallius/tsun/tsun-gen3-proxy/app/src/cnf/default_config.toml => ok
2025-04-30 00:01:35 INFO | root | Read from environment => ok
2025-04-30 00:01:36 INFO | root | Read from ./config/config.json => n/a
2025-04-30 00:01:37 INFO | root | Read from ./config/config.toml => n/a
2025-04-30 00:01:38 INFO | root | ******
2025-04-30 00:01:39 INFO | root | listen on port: 5005 for inverters
2025-04-30 00:01:40 INFO | root | listen on port: 10000 for inverters
2025-04-30 00:01:41 INFO | root | Start Quart

View File

@@ -17,13 +17,13 @@ def test_statistic_counter():
assert val == None or val == 0
i.static_init() # initialize counter
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 0, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "Modbus_Command": 0}})
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 0, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "DCU_Command": 0, "Modbus_Command": 0}})
val = i.dev_value(Register.INVERTER_CNT) # valid and initiliazed addr
assert val == 0
i.inc_counter('Inverter_Cnt')
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 1, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "Modbus_Command": 0}})
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 1, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "DCU_Command": 0, "Modbus_Command": 0}})
val = i.dev_value(Register.INVERTER_CNT)
assert val == 1

View File

@@ -109,7 +109,7 @@ def test_default_db():
i = InfosG3P(client_mode=False)
assert json.dumps(i.db) == json.dumps({
"inverter": {"Manufacturer": "TSUN", "Equipment_Model": "TSOL-MSxx00", "No_Inputs": 4},
"inverter": {"Manufacturer": "TSUN", "Equipment_Model": "TSOL-MSxx00", "No_Inputs": 2},
"collector": {"Chip_Type": "IGEN TECH"},
})
@@ -271,7 +271,7 @@ def test_build_ha_conf1():
elif id == 'inv_count_456':
assert False
assert tests==7
assert tests==5
def test_build_ha_conf2():
i = InfosG3P(client_mode=False)
@@ -346,7 +346,7 @@ def test_build_ha_conf3():
elif id == 'inv_count_456':
assert False
assert tests==7
assert tests==5
def test_build_ha_conf4():
i = InfosG3P(client_mode=True)

259
app/tests/test_mqtt.py Normal file → Executable file
View File

@@ -3,8 +3,10 @@ import pytest
import asyncio
import aiomqtt
import logging
from aiomqtt import MqttError, MessagesIterator
from aiomqtt import Message as AiomqttMessage
from mock import patch, Mock
from async_stream import AsyncIfcImpl
from singleton import Singleton
from mqtt import Mqtt
@@ -17,7 +19,7 @@ NO_MOSQUITTO_TEST = False
pytest_plugins = ('pytest_asyncio',)
@pytest.fixture(scope="module", autouse=True)
@pytest.fixture(scope="function", autouse=True)
def module_init():
Singleton._instances.clear()
yield
@@ -33,6 +35,26 @@ def test_hostname():
# else:
return 'test.mosquitto.org'
@pytest.fixture(scope="function")
def aiomqtt_mock(monkeypatch):
recv_que = asyncio.Queue()
async def my_aenter(self):
return self
async def my_subscribe(self, *arg):
return
async def my_anext(self):
return await recv_que.get()
async def my_receive(self, topic: str, payload: bytes):
msg = AiomqttMessage(topic, payload,qos=0, retain=False, mid=0, properties=None)
await recv_que.put(msg)
await asyncio.sleep(0) # dispath the msg
monkeypatch.setattr(aiomqtt.Client, "__aenter__", my_aenter)
monkeypatch.setattr(aiomqtt.Client, "subscribe", my_subscribe)
monkeypatch.setattr(MessagesIterator, "__anext__", my_anext)
monkeypatch.setattr(Mqtt, "receive", my_receive, False)
@pytest.fixture
def config_mqtt_conn(test_hostname, test_port):
Config.act_config = {'mqtt':{'host': test_hostname, 'port': test_port, 'user': '', 'passwd': ''},
@@ -44,6 +66,14 @@ def config_no_conn(test_port):
Config.act_config = {'mqtt':{'host': "", 'port': test_port, 'user': '', 'passwd': ''},
'ha':{'auto_conf_prefix': 'homeassistant','discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun'}
}
Config.def_config = {}
@pytest.fixture
def config_def_conn(test_port):
Config.act_config = {'mqtt':{'host': "unknown_url", 'port': test_port, 'user': '', 'passwd': ''},
'ha':{'auto_conf_prefix': 'homeassistant','discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun'}
}
Config.def_config = Config.act_config
@pytest.fixture
def spy_at_cmd():
@@ -69,6 +99,14 @@ def spy_modbus_cmd_client():
yield wrapped_conn
conn.close()
@pytest.fixture
def spy_dcu_cmd():
conn = SolarmanV5(None, ('test.local', 1234), server_side=True, client_mode= False, ifc=AsyncIfcImpl())
conn.node_id = 'inv_3/'
with patch.object(conn, 'send_dcu_cmd', wraps=conn.send_dcu_cmd) as wrapped_conn:
yield wrapped_conn
conn.close()
def test_native_client(test_hostname, test_port):
"""Sanity check: Make sure the paho-mqtt client can connect to the test
MQTT server. Otherwise the test set NO_MOSQUITTO_TEST to True and disable
@@ -144,13 +182,17 @@ async def test_ha_reconnect(config_mqtt_conn):
await m.close()
@pytest.mark.asyncio
async def test_mqtt_no_config(config_no_conn):
async def test_mqtt_no_config(config_no_conn, monkeypatch):
_ = config_no_conn
assert asyncio.get_running_loop()
on_connect = asyncio.Event()
async def cb():
on_connect.set()
async def my_publish(*args):
return
monkeypatch.setattr(aiomqtt.Client, "publish", my_publish)
try:
m = Mqtt(cb)
@@ -159,74 +201,193 @@ async def test_mqtt_no_config(config_no_conn):
assert not on_connect.is_set()
try:
await m.publish('homeassistant/status', 'online')
assert False
assert m.published == 1
except Exception:
pass
assert False
except TimeoutError:
assert False
finally:
await m.close()
@pytest.mark.asyncio
async def test_msg_dispatch(config_mqtt_conn, spy_modbus_cmd):
async def test_mqtt_except_no_config(config_no_conn, monkeypatch, caplog):
_ = config_no_conn
assert asyncio.get_running_loop()
async def my_aenter(self):
raise MqttError('TestException') from None
monkeypatch.setattr(aiomqtt.Client, "__aenter__", my_aenter)
LOGGER = logging.getLogger("mqtt")
LOGGER.propagate = True
LOGGER.setLevel(logging.INFO)
with caplog.at_level(logging.INFO):
m = Mqtt(None)
assert m.task
await asyncio.sleep(0)
try:
await m.publish('homeassistant/status', 'online')
assert False
except MqttError:
pass
except Exception:
assert False
finally:
await m.close()
assert 'Connection lost; Reconnecting in 5 seconds' in caplog.text
@pytest.mark.asyncio
async def test_mqtt_except_def_config(config_def_conn, monkeypatch, caplog):
_ = config_def_conn
assert asyncio.get_running_loop()
on_connect = asyncio.Event()
async def cb():
on_connect.set()
async def my_aenter(self):
raise MqttError('TestException') from None
monkeypatch.setattr(aiomqtt.Client, "__aenter__", my_aenter)
LOGGER = logging.getLogger("mqtt")
LOGGER.propagate = True
LOGGER.setLevel(logging.INFO)
with caplog.at_level(logging.INFO):
m = Mqtt(cb)
assert m.task
await asyncio.sleep(0)
assert not on_connect.is_set()
try:
await m.publish('homeassistant/status', 'online')
assert False
except MqttError:
pass
except Exception:
assert False
finally:
await m.close()
assert 'MQTT is unconfigured; Check your config.toml!' in caplog.text
@pytest.mark.asyncio
async def test_mqtt_dispatch(config_mqtt_conn, aiomqtt_mock, spy_modbus_cmd):
_ = config_mqtt_conn
_ = aiomqtt_mock
spy = spy_modbus_cmd
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_1/rated_load', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x2008, 2, logging.INFO)
assert m.ha_restarts == 0
await m.receive('homeassistant/status', b'online') # send the message
assert m.ha_restarts == 1
await m.receive(topic= 'tsun/inv_1/rated_load', payload= b'2')
spy.assert_called_once_with(Modbus.WRITE_SINGLE_REG, 0x2008, 2, logging.INFO)
spy.reset_mock()
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'100')
spy.assert_called_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 1024, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'100', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 1024, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'50', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 512, logging.INFO)
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'50')
spy.assert_called_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 512, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.READ_REGS, 0x3000, 10, logging.INFO)
await m.receive(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10')
spy.assert_called_once_with(Modbus.READ_REGS, 0x3000, 10, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.READ_INPUTS, 0x3000, 10, logging.INFO)
await m.receive(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10')
spy.assert_called_once_with(Modbus.READ_INPUTS, 0x3000, 10, logging.INFO)
# test dispatching with empty mapping table
m.topic_defs.clear()
spy.reset_mock()
await m.receive(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10')
spy.assert_not_called()
# test dispatching with incomplete mapping table - invalid fnc defined
m.topic_defs.append(
{'prefix': 'entity_prefix', 'topic': '/+/modbus_read_inputs',
'full_topic': 'tsun/+/modbus_read_inputs', 'fnc': 'addr'}
)
spy.reset_mock()
await m.receive(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10')
spy.assert_not_called()
except MqttError:
assert False
except Exception:
assert False
finally:
await m.close()
@pytest.mark.asyncio
async def test_msg_dispatch_err(config_mqtt_conn, spy_modbus_cmd):
async def test_mqtt_dispatch_cb(config_mqtt_conn, aiomqtt_mock):
_ = config_mqtt_conn
_ = aiomqtt_mock
on_connect = asyncio.Event()
async def cb():
on_connect.set()
try:
m = Mqtt(cb)
assert m.ha_restarts == 0
await m.receive('homeassistant/status', b'online') # send the message
assert on_connect.is_set()
assert m.ha_restarts == 1
except MqttError:
assert False
except Exception:
assert False
finally:
await m.close()
@pytest.mark.asyncio
async def test_mqtt_dispatch_err(config_mqtt_conn, aiomqtt_mock, spy_modbus_cmd, caplog):
_ = config_mqtt_conn
_ = aiomqtt_mock
spy = spy_modbus_cmd
LOGGER = logging.getLogger("mqtt")
LOGGER.propagate = True
LOGGER.setLevel(logging.INFO)
try:
m = Mqtt(None)
# test out of range param
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'-1', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'-1')
spy.assert_not_called()
# test unknown node_id
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_2/out_coeff', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_2/out_coeff', payload= b'2')
spy.assert_not_called()
# test invalid fload param
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'2, 3', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'2, 3')
spy.assert_not_called()
await m.receive(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10, 7')
spy.assert_not_called()
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10, 7', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/dcu_power', payload= b'100W')
spy.assert_not_called()
with caplog.at_level(logging.INFO):
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
for _ in m.each_inverter(msg, "addr"):
pass # do nothing here
assert 'Cmd not supported by: inv_1/' in caplog.text
except MqttError:
assert False
except Exception:
assert False
finally:
await m.close()
@@ -267,3 +428,31 @@ async def test_at_cmd_dispatch(config_mqtt_conn, spy_at_cmd):
finally:
await m.close()
@pytest.mark.asyncio
async def test_dcu_dispatch(config_mqtt_conn, spy_dcu_cmd):
_ = config_mqtt_conn
spy = spy_dcu_cmd
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_3/dcu_power', payload= b'100.0', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_called_once_with(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
finally:
await m.close()
@pytest.mark.asyncio
async def test_dcu_inv_value(config_mqtt_conn, spy_dcu_cmd):
_ = config_mqtt_conn
spy = spy_dcu_cmd
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_3/dcu_power', payload= b'99.9', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_not_called()
msg = aiomqtt.Message(topic= 'tsun/inv_3/dcu_power', payload= b'800.1', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_not_called()
finally:
await m.close()

View File

@@ -4,6 +4,10 @@ import logging
import os
from mock import patch
from server import app, Server, ProxyState, HypercornLogHndl
from inverter_base import InverterBase
from gen3.talent import Talent
from test_inverter_base import FakeReader, FakeWriter
pytest_plugins = ('pytest_asyncio',)
@@ -108,20 +112,20 @@ class TestServerClass:
assert logging.getLogger('hypercorn.access').level == logging.INFO
assert logging.getLogger('hypercorn.error').level == logging.INFO
os.environ["LOG_LVL"] = "WARN"
s.parse_args(['--log_backups', '3'])
s.init_logging_system()
assert s.log_backups == 3
assert s.log_level == logging.WARNING
assert logging.handlers.log_backups == 3
assert logging.getLogger().level == s.log_level
assert logging.getLogger('msg').level == s.log_level
assert logging.getLogger('conn').level == s.log_level
assert logging.getLogger('data').level == s.log_level
assert logging.getLogger('tracer').level == s.log_level
assert logging.getLogger('asyncio').level == s.log_level
assert logging.getLogger('hypercorn.access').level == logging.INFO
assert logging.getLogger('hypercorn.error').level == logging.INFO
with patch.dict(os.environ, {'LOG_LVL': 'WARN'}):
s.parse_args(['--log_backups', '3'])
s.init_logging_system()
assert s.log_backups == 3
assert s.log_level == logging.WARNING
assert logging.handlers.log_backups == 3
assert logging.getLogger().level == s.log_level
assert logging.getLogger('msg').level == s.log_level
assert logging.getLogger('conn').level == s.log_level
assert logging.getLogger('data').level == s.log_level
assert logging.getLogger('tracer').level == s.log_level
assert logging.getLogger('asyncio').level == s.log_level
assert logging.getLogger('hypercorn.access').level == logging.INFO
assert logging.getLogger('hypercorn.error').level == logging.INFO
def test_build_config_error(self, caplog):
s = self.FakeServer()
@@ -187,6 +191,7 @@ class TestApp:
"""Test the ready route."""
ProxyState.set_up(False)
app.testing = True
client = app.test_client()
response = await client.get('/-/ready')
assert response.status_code == 503
@@ -202,17 +207,84 @@ class TestApp:
@pytest.mark.asyncio
async def test_healthy(self):
"""Test the healthy route."""
reader = FakeReader()
writer = FakeWriter()
ProxyState.set_up(False)
client = app.test_client()
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
with InverterBase(reader, writer, 'tsun', Talent):
ProxyState.set_up(False)
app.testing = True
client = app.test_client()
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
ProxyState.set_up(True)
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
ProxyState.set_up(True)
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
@pytest.mark.asyncio
async def test_unhealthy(self, monkeypatch, caplog):
"""Test the healthy route."""
def result_false(self):
return False
LOGGER = logging.getLogger("mqtt")
LOGGER.propagate = True
LOGGER.setLevel(logging.INFO)
monkeypatch.setattr(InverterBase, "healthy", result_false)
InverterBase._registry.clear()
reader = FakeReader()
writer = FakeWriter()
with caplog.at_level(logging.INFO) and InverterBase(reader, writer, 'tsun', Talent):
ProxyState.set_up(False)
app.testing = True
client = app.test_client()
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
assert "" == caplog.text
ProxyState.set_up(True)
response = await client.get('/-/healthy')
assert response.status_code == 503
result = await response.get_data()
assert result == b"I have a problem"
assert "" == caplog.text
@pytest.mark.asyncio
async def test_healthy_exception(self, monkeypatch, caplog):
"""Test the healthy route."""
def result_except(self):
raise ValueError
LOGGER = logging.getLogger("mqtt")
LOGGER.propagate = True
LOGGER.setLevel(logging.INFO)
monkeypatch.setattr(InverterBase, "healthy", result_except)
InverterBase._registry.clear()
reader = FakeReader()
writer = FakeWriter()
with caplog.at_level(logging.INFO) and InverterBase(reader, writer, 'tsun', Talent):
ProxyState.set_up(False)
app.testing = True
client = app.test_client()
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
assert "" == caplog.text
ProxyState.set_up(True)
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
assert "Exception:" in caplog.text

281
app/tests/test_solarman.py Normal file → Executable file
View File

@@ -462,6 +462,39 @@ def inverter_ind_msg800(): # 0x4210 rated Power 800W
msg += b'\x15'
return msg
@pytest.fixture
def inverter_ind_msg900(): # 0x4210 rated Power 900W
msg = b'\xa5\x99\x01\x10\x42\xe6\x9e' +get_sn() +b'\x01\xb0\x02\xbc\xc8'
msg += b'\x24\x32\x6c\x1f\x00\x00\xa0\x47\xe4\x33\x01\x00\x03\x08\x00\x00'
msg += b'\x59\x31\x37\x45\x37\x41\x30\x46\x30\x31\x30\x42\x30\x31\x33\x45'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x01\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x40\x10\x08\xc8\x00\x49\x13\x8d\x00\x36\x00\x00\x03\x84\x06\x7a'
msg += b'\x01\x61\x00\xa8\x02\x54\x01\x5a\x00\x8a\x01\xe4\x01\x5a\x00\xbd'
msg += b'\x02\x8f\x00\x11\x00\x01\x00\x00\x00\x0b\x00\x00\x27\x98\x00\x04'
msg += b'\x00\x00\x0c\x04\x00\x03\x00\x00\x0a\xe7\x00\x05\x00\x00\x0c\x75'
msg += b'\x00\x00\x00\x00\x06\x16\x02\x00\x00\x00\x55\xaa\x00\x01\x00\x00'
msg += b'\x00\x00\x00\x00\xff\xff\x03\x84\x00\x03\x04\x00\x04\x00\x04\x00'
msg += b'\x04\x00\x00\x01\xff\xff\x00\x01\x00\x06\x00\x68\x00\x68\x05\x00'
msg += b'\x09\xcd\x07\xb6\x13\x9c\x13\x24\x00\x01\x07\xae\x04\x0f\x00\x41'
msg += b'\x00\x0f\x0a\x64\x0a\x64\x00\x06\x00\x06\x09\xf6\x12\x8c\x12\x8c'
msg += b'\x00\x10\x00\x10\x14\x52\x14\x52\x00\x10\x00\x10\x01\x51\x00\x05'
msg += b'\x04\x00\x00\x01\x13\x9c\x0f\xa0\x00\x4e\x00\x66\x03\xe8\x04\x00'
msg += b'\x09\xce\x07\xa8\x13\x9c\x13\x26\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x04\x00\x04\x00\x00\x00\x00\x00\xff\xff\x00\x00'
msg += b'\x00\x00\x00\x00'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def inverter_ind_msg_81(): # 0x4210 fcode 0x81
msg = b'\xa5\x99\x01\x10\x42\x02\x03' +get_sn() +b'\x81\xb0\x02\xbc\xc8'
@@ -676,6 +709,19 @@ def msg_modbus_rsp(): # 0x1510
msg += b'\x15'
return msg
@pytest.fixture
def msg_modbus_rsp_mb_4(): # 0x1510, MODBUS Type:4
msg = b'\xa5\x3b\x00\x10\x15\x03\x03' +get_sn() +b'\x02\x01'
msg += total()
msg += hb()
msg += b'\x0a\xe2\xfa\x33\x01\x04\x28\x40\x10\x08\xd8'
msg += b'\x00\x00\x13\x87\x00\x31\x00\x68\x02\x58\x00\x00\x01\x53\x00\x02'
msg += b'\x00\x00\x01\x52\x00\x02\x00\x00\x01\x53\x00\x03\x00\x00\x00\x04'
msg += b'\x00\x01\x00\x00\x9e\xa4'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def msg_modbus_interim_rsp(): # 0x0510
msg = b'\xa5\x3b\x00\x10\x15\x03\x03' +get_sn() +b'\x02\x01'
@@ -812,6 +858,26 @@ def dcu_data_rsp_msg(): # 0x1210
msg += b'\x15'
return msg
@pytest.fixture
def dcu_command_ind_msg(): # 0x4510
msg = b'\xa5\x17\x00\x10\x45\x94\x02' +get_dcu_sn() +b'\x05\x26\x30'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x01\x01\x06\x01\x00\x01\x03\xe8'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def dcu_command_rsp_msg(): # 0x1510
msg = b'\xa5\x11\x00\x10\x15\x94\x03' +get_dcu_sn() +b'\x05\x01'
msg += total()
msg += hb()
msg += b'\x00\x00\x00\x00'
msg += b'\x01\x01\x01'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def config_tsun_allow_all():
Config.act_config = {
@@ -854,7 +920,17 @@ def config_tsun_scan_dcu():
@pytest.fixture
def config_tsun_dcu1():
Config.act_config = {'solarman':{'enabled': True},'batteries':{'4100000000000001':{'monitor_sn': 2070233888, 'node_id':'inv1/', 'modbus_polling': True, 'suggested_area':'roof', 'sensor_list': 0}}}
Config.act_config = {
'ha':{
'auto_conf_prefix': 'homeassistant',
'discovery_prefix': 'homeassistant',
'entity_prefix': 'tsun',
'proxy_node_id': 'test_1',
'proxy_unique_id': ''
},
'solarman':{'enabled': True, 'host': 'test_cloud.local', 'port': 1234},'batteries':{'4100000000000001':{'monitor_sn': 2070233888, 'node_id':'inv1/', 'modbus_polling': True, 'suggested_area':'roof', 'sensor_list': 0}}}
Proxy.class_init()
Proxy.mqtt = Mqtt()
@pytest.mark.asyncio
async def test_read_message(device_ind_msg):
@@ -1405,6 +1481,7 @@ async def test_build_modell_600(my_loop, config_tsun_allow_all, inverter_ind_msg
m.read() # read complete msg, and dispatch msg
assert 2000 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 600 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 4 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS2000(600)' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
assert '02b0' == m.db.get_db_value(Register.SENSOR_LIST, None)
assert 0 == m.sensor_list # must not been set by an inverter data ind
@@ -1424,6 +1501,7 @@ async def test_build_modell_1600(my_loop, config_tsun_allow_all, inverter_ind_ms
m.read() # read complete msg, and dispatch msg
assert 1600 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 1600 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 4 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS1600' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
@@ -1437,6 +1515,7 @@ async def test_build_modell_1800(my_loop, config_tsun_allow_all, inverter_ind_ms
m.read() # read complete msg, and dispatch msg
assert 1800 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 1800 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 4 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS1800' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
@@ -1450,6 +1529,7 @@ async def test_build_modell_2000(my_loop, config_tsun_allow_all, inverter_ind_ms
m.read() # read complete msg, and dispatch msg
assert 2000 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 2000 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 4 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS2000' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
@@ -1463,6 +1543,21 @@ async def test_build_modell_800(my_loop, config_tsun_allow_all, inverter_ind_msg
m.read() # read complete msg, and dispatch msg
assert 800 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 800 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 2 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS800' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
@pytest.mark.asyncio
async def test_build_modell_900(my_loop, config_tsun_allow_all, inverter_ind_msg900):
_ = config_tsun_allow_all
m = MemoryStream(inverter_ind_msg900, (0,))
assert 0 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert None == m.db.get_db_value(Register.RATED_POWER, None)
assert None == m.db.get_db_value(Register.INVERTER_TEMP, None)
m.read() # read complete msg, and dispatch msg
assert 900 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 900 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 2 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MSxx00' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
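Taken together, the build-model tests in these hunks pin the following per-model expectations; the mapping below simply restates the asserted literals as a summary, it adds no new data:

# model string          -> (max designed power, rated power, inputs)
expected = {
    'TSOL-MS2000(600)': (2000,  600, 4),
    'TSOL-MS1600':      (1600, 1600, 4),
    'TSOL-MS1800':      (1800, 1800, 4),
    'TSOL-MS2000':      (2000, 2000, 4),
    'TSOL-MS800':       ( 800,  800, 2),
    'TSOL-MSxx00':      ( 900,  900, 2),
}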
@@ -1529,7 +1624,7 @@ async def test_msg_build_modbus_req(my_loop, config_tsun_inv1, device_ind_msg, d
assert m.ifc.tx_fifo.get()==device_rsp_msg
assert m.ifc.fwd_fifo.get()==device_ind_msg
await m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
assert 0 == m.send_msg_ofs
assert m.ifc.fwd_fifo.get() == b''
assert m.sent_pdu == b'' # modbus command must be ignored, because the connection is still not up
@@ -1547,7 +1642,7 @@ async def test_msg_build_modbus_req(my_loop, config_tsun_inv1, device_ind_msg, d
assert m.ifc.tx_fifo.get()==inverter_rsp_msg
assert m.ifc.fwd_fifo.get()==inverter_ind_msg
await m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
assert 0 == m.send_msg_ofs
assert m.ifc.fwd_fifo.get() == b''
assert m.sent_pdu == msg_modbus_cmd
@@ -2159,6 +2254,61 @@ async def test_modbus_scaning(config_tsun_scan, heartbeat_ind_msg, heartbeat_rsp
assert next(m.mb_timer.exp_count) == 3
m.close()
@pytest.mark.asyncio
async def test_modbus_scaning_inv_rsp(config_tsun_scan, heartbeat_ind_msg, heartbeat_rsp_msg, msg_modbus_rsp_mb_4):
_ = config_tsun_scan
assert asyncio.get_running_loop()
m = MemoryStream(heartbeat_ind_msg, (0x15,0x56,0))
m.append_msg(msg_modbus_rsp_mb_4)
assert m.mb_scan == False
assert asyncio.get_running_loop() == m.mb_timer.loop
m.db.stat['proxy']['Unknown_Ctrl'] = 0
assert m.mb_timer.tim == None
m.read() # read complete msg, and dispatch msg
assert m.mb_scan == True
assert m.mb_start_reg == 0xff80
assert m.mb_step == 0x40
assert m.mb_bytes == 0x14
assert asyncio.get_running_loop() == m.mb_timer.loop
assert not m.header_valid # must be invalid, since msg was handled and buffer flushed
assert m.msg_count == 1
assert m.snr == 2070233889
assert m.control == 0x4710
assert m.msg_recvd[0]['control']==0x4710
assert m.msg_recvd[0]['seq']=='84:11'
assert m.msg_recvd[0]['data_len']==0x1
assert m.ifc.tx_fifo.get()==heartbeat_rsp_msg
assert m.ifc.fwd_fifo.get()==heartbeat_ind_msg
assert m.db.stat['proxy']['Unknown_Ctrl'] == 0
m.ifc.tx_clear() # clear send buffer for next test
assert isclose(m.mb_timeout, 0.5)
assert next(m.mb_timer.exp_count) == 0
await asyncio.sleep(0.5)
assert m.sent_pdu==b'\xa5\x17\x00\x10E\x12\x84!Ce{\x02\xb0\x02\x00\x00\x00\x00\x00\x00' \
b'\x00\x00\x00\x00\x00\x00\x01\x03\xff\xc0\x00\x14\x75\xed\x33\x15'
assert m.ifc.tx_fifo.get()==b''
m.read() # read complete msg, and dispatch msg
assert not m.header_valid # must be invalid, since msg was handled and buffer flushed
assert m.msg_count == 2
assert m.msg_recvd[1]['control']==0x1510
assert m.msg_recvd[1]['seq']=='03:03'
assert m.msg_recvd[1]['data_len']==0x3b
assert m.mb.last_addr == 1
assert m.mb.last_fcode == 3
assert m.mb.last_reg == 0xffc0 # mb_start_reg + mb_step
assert m.mb.last_len == 20
assert m.mb.err == 3
assert next(m.mb_timer.exp_count) == 2
m.close()
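For readers following the scan sequence, the register stepping asserted above reduces to simple arithmetic on the values the test fixes; this is only a restatement of those literals, not additional proxy logic:

# next scanned register = start register + step width, as asserted via mb.last_reg
mb_start_reg, mb_step, mb_bytes = 0xff80, 0x40, 0x14
assert (mb_start_reg + mb_step) & 0xffff == 0xffc0
assert mb_bytes == 20  # matches mb.last_len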
@pytest.mark.asyncio
async def test_start_client_mode(my_loop, config_tsun_inv1, str_test_ip):
_ = config_tsun_inv1
@@ -2168,7 +2318,7 @@ async def test_start_client_mode(my_loop, config_tsun_inv1, str_test_ip):
assert m.no_forwarding == False
assert m.mb_timer.tim == None
assert asyncio.get_running_loop() == m.mb_timer.loop
await m.send_start_cmd(get_sn_int(), str_test_ip, False, m.mb_first_timeout)
m.send_start_cmd(get_sn_int(), str_test_ip, False, m.mb_first_timeout)
assert m.sent_pdu==bytearray(b'\xa5\x17\x00\x10E\x01\x00!Ce{\x02\xb0\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x030\x00\x000J\xde\xf1\x15')
assert m.db.get_db_value(Register.IP_ADDRESS) == str_test_ip
assert isclose(m.db.get_db_value(Register.POLLING_INTERVAL), 0.5)
@@ -2201,7 +2351,7 @@ async def test_start_client_mode_scan(config_tsun_scan_dcu, str_test_ip, dcu_mod
assert m.no_forwarding == False
assert m.mb_timer.tim == None
assert asyncio.get_running_loop() == m.mb_timer.loop
await m.send_start_cmd(get_dcu_sn_int(), str_test_ip, False, m.mb_first_timeout)
m.send_start_cmd(get_dcu_sn_int(), str_test_ip, False, m.mb_first_timeout)
assert m.mb_start_reg == 0x0000
assert m.mb_step == 0x100
assert m.mb_bytes == 0x2d
@@ -2402,3 +2552,124 @@ async def test_proxy_at_blocked(my_loop, config_tsun_inv1, patch_open_connection
assert Proxy.mqtt.key == 'tsun/inv1/at_resp'
assert Proxy.mqtt.data == "+ok"
@pytest.mark.asyncio
async def test_dcu_cmd(my_loop, config_tsun_allow_all, dcu_dev_ind_msg, dcu_dev_rsp_msg, dcu_data_ind_msg, dcu_data_rsp_msg, dcu_command_ind_msg, dcu_command_rsp_msg):
'''test dcu_power command for a DCU device with sensor 0x3026'''
_ = config_tsun_allow_all
m = MemoryStream(dcu_dev_ind_msg, (0,), True)
m.read() # read device ind
assert m.control == 0x4110
assert str(m.seq) == '01:92'
assert m.ifc.tx_fifo.get()==dcu_dev_rsp_msg
assert m.ifc.fwd_fifo.get()==dcu_dev_ind_msg
m.send_dcu_cmd(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
assert m.ifc.tx_fifo.get()==b''
assert m.ifc.fwd_fifo.get()==b''
assert m.sent_pdu == b''
assert str(m.seq) == '01:92'
assert Proxy.mqtt.key == ''
assert Proxy.mqtt.data == ""
m.append_msg(dcu_data_ind_msg)
m.read() # read inverter ind
assert m.control == 0x4210
assert str(m.seq) == '02:93'
assert m.ifc.tx_fifo.get()==dcu_data_rsp_msg
assert m.ifc.fwd_fifo.get()==dcu_data_ind_msg
m.send_dcu_cmd(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
assert m.ifc.fwd_fifo.get() == b''
assert m.ifc.tx_fifo.get()== b''
assert m.sent_pdu == dcu_command_ind_msg
m.sent_pdu = bytearray()
assert str(m.seq) == '02:94'
assert Proxy.mqtt.key == ''
assert Proxy.mqtt.data == ""
m.append_msg(dcu_command_rsp_msg)
m.read() # read at resp
assert m.control == 0x1510
assert str(m.seq) == '03:94'
assert m.ifc.rx_get()==b''
assert m.ifc.tx_fifo.get()==b''
assert m.ifc.fwd_fifo.get()==b''
assert Proxy.mqtt.key == 'tsun/dcu_resp'
assert Proxy.mqtt.data == "+ok"
Proxy.mqtt.clear() # clear last test result
@pytest.mark.asyncio
async def test_dcu_cmd_not_supported(my_loop, config_tsun_allow_all, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg):
'''test that an inverter doesn't accept the dcu_power command'''
_ = config_tsun_allow_all
m = MemoryStream(device_ind_msg, (0,), True)
m.read() # read device ind
assert m.control == 0x4110
assert str(m.seq) == '01:01'
assert m.ifc.tx_fifo.get()==device_rsp_msg
assert m.ifc.fwd_fifo.get()==device_ind_msg
m.send_dcu_cmd(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
assert m.ifc.tx_fifo.get()==b''
assert m.ifc.fwd_fifo.get()==b''
assert m.sent_pdu == b''
assert str(m.seq) == '01:01'
assert Proxy.mqtt.key == ''
assert Proxy.mqtt.data == ""
m.append_msg(inverter_ind_msg)
m.read() # read inverter ind
assert m.control == 0x4210
assert str(m.seq) == '02:02'
assert m.ifc.tx_fifo.get()==inverter_rsp_msg
assert m.ifc.fwd_fifo.get()==inverter_ind_msg
m.send_dcu_cmd(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
assert m.ifc.fwd_fifo.get() == b''
assert m.ifc.tx_fifo.get()== b''
assert m.sent_pdu == b''
Proxy.mqtt.clear() # clear last test result
@pytest.mark.asyncio
async def test_proxy_dcu_cmd(my_loop, config_tsun_dcu1, patch_open_connection, dcu_command_ind_msg, dcu_command_rsp_msg):
_ = config_tsun_dcu1
_ = patch_open_connection
assert asyncio.get_running_loop()
with InverterTest(FakeReader(), FakeWriter(), client_mode=False) as inverter:
await inverter.create_remote()
await asyncio.sleep(0)
r = inverter.remote.stream
l = inverter.local.stream
l.db.stat['proxy']['DCU_Command'] = 0
l.db.stat['proxy']['AT_Command'] = 0
l.db.stat['proxy']['Unknown_Ctrl'] = 0
l.db.stat['proxy']['AT_Command_Blocked'] = 0
l.db.stat['proxy']['Modbus_Command'] = 0
inverter.forward_dcu_cmd_resp = False
r.append_msg(dcu_command_ind_msg)
r.read() # read complete msg, and dispatch msg
assert inverter.forward_dcu_cmd_resp
inverter.forward(r,l)
assert l.ifc.tx_fifo.get()==dcu_command_ind_msg
assert l.db.stat['proxy']['Invalid_Msg_Format'] == 0
assert l.db.stat['proxy']['DCU_Command'] == 1
assert l.db.stat['proxy']['AT_Command'] == 0
assert l.db.stat['proxy']['AT_Command_Blocked'] == 0
assert l.db.stat['proxy']['Modbus_Command'] == 0
assert 2 == l.db.get_db_value(Register.NO_INPUTS, 0)
l.append_msg(dcu_command_rsp_msg)
l.read() # read at resp
assert l.ifc.fwd_fifo.peek()==dcu_command_rsp_msg
inverter.forward(l,r)
assert r.ifc.tx_fifo.get()==dcu_command_rsp_msg
assert Proxy.mqtt.key == ''
assert Proxy.mqtt.data == ""


@@ -144,7 +144,7 @@ async def test_emu_start(my_loop, config_tsun_inv1, msg_modbus_rsp, str_test_ip,
inv = InvStream(msg_modbus_rsp)
assert asyncio.get_running_loop() == inv.mb_timer.loop
await inv.send_start_cmd(get_sn_int(), str_test_ip, True, inv.mb_first_timeout)
inv.send_start_cmd(get_sn_int(), str_test_ip, True, inv.mb_first_timeout)
inv.read() # read complete msg, and dispatch msg
assert not inv.header_valid # must be invalid, since msg was handled and buffer flushed
assert inv.msg_count == 1
@@ -161,7 +161,7 @@ async def test_snd_hb(my_loop, config_tsun_inv1, heartbeat_ind):
inv = InvStream()
cld = CldStream(inv)
# await inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
# inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
cld.send_heartbeat_cb(0)
assert cld.ifc.tx_fifo.peek() == heartbeat_ind
cld.close()
@@ -178,7 +178,7 @@ async def test_snd_inv_data(my_loop, config_tsun_inv1, inverter_ind_msg, inverte
inv.db.set_db_def_value(Register.GRID_FREQUENCY, 50.05)
inv.db.set_db_def_value(Register.PROD_COMPL_TYPE, 6)
assert asyncio.get_running_loop() == inv.mb_timer.loop
await inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.db.set_db_def_value(Register.DATA_UP_INTERVAL, 17) # set test value
cld = CldStream(inv)
@@ -213,7 +213,7 @@ async def test_rcv_invalid(my_loop, config_tsun_inv1, inverter_ind_msg, inverter
_ = config_tsun_inv1
inv = InvStream()
assert asyncio.get_running_loop() == inv.mb_timer.loop
await inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.db.set_db_def_value(Register.DATA_UP_INTERVAL, 17) # set test value
cld = CldStream(inv)


@@ -2411,14 +2411,14 @@ async def test_msg_build_modbus_req(config_tsun_inv1, msg_modbus_cmd):
_ = config_tsun_inv1
m = MemoryStream(b'', (0,), True)
m.id_str = b"R170000000000001"
await m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
assert 0 == m.send_msg_ofs
assert m.ifc.fwd_fifo.get() == b''
assert m.ifc.tx_fifo.get() == b''
assert m.sent_pdu == b''
m.state = State.up
await m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
assert 0 == m.send_msg_ofs
assert m.ifc.fwd_fifo.get() == b''
assert m.ifc.tx_fifo.get() == b''


@@ -1,22 +1,37 @@
# test_with_pytest.py
import pytest
from server import app
from web import Web, web
import logging
import os, errno
import datetime
from os import DirEntry, stat_result
from quart import current_app
from mock import patch
from server import app as my_app
from server import Server
from web import web
from async_stream import AsyncStreamClient
from gen3plus.inverter_g3p import InverterG3P
from web.log_handler import LogHandler
from test_inverter_g3p import FakeReader, FakeWriter, config_conn
from cnf.config import Config
from mock import patch
from proxy import Proxy
import os, errno
from os import DirEntry, stat_result
import datetime
class FakeServer(Server):
def __init__(self):
pass # don't call super().__init__ for unit tests
pytest_plugins = ('pytest_asyncio',)
@pytest.fixture(scope="session")
def app():
yield my_app
@pytest.fixture(scope="session")
def client():
def client(app):
app.secret_key = 'super secret key'
app.testing = True
return app.test_client()
@pytest.fixture
@@ -52,6 +67,7 @@ async def test_home(client):
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b"<title>TSUN Proxy - Connections</title>" in await response.data
@pytest.mark.asyncio
async def test_page(client):
@@ -59,14 +75,17 @@ async def test_page(client):
response = await client.get('/mqtt')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b"<title>TSUN Proxy - MQTT Status</title>" in await response.data
assert b'fetch("/mqtt-fetch")' in await response.data
@pytest.mark.asyncio
async def test_rel_page(client):
"""Test the mqtt route."""
"""Test the mqtt route with relative paths."""
web.build_relative_urls = True
response = await client.get('/mqtt')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'fetch("./mqtt-fetch")' in await response.data
web.build_relative_urls = False
@pytest.mark.asyncio
@@ -75,6 +94,7 @@ async def test_notes(client):
response = await client.get('/notes')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b"<title>TSUN Proxy - Important Messages</title>" in await response.data
@pytest.mark.asyncio
async def test_logging(client):
@@ -82,6 +102,7 @@ async def test_logging(client):
response = await client.get('/logging')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b"<title>TSUN Proxy - Log Files</title>" in await response.data
@pytest.mark.asyncio
async def test_favicon96(client):
@@ -119,37 +140,37 @@ async def test_manifest(client):
assert response.mimetype == 'application/manifest+json'
@pytest.mark.asyncio
async def test_data_fetch(create_inverter):
async def test_data_fetch(client, create_inverter):
"""Test the data-fetch route."""
_ = create_inverter
client = app.test_client()
response = await client.get('/data-fetch')
assert response.status_code == 200
response = await client.get('/data-fetch')
assert response.status_code == 200
assert b'<h5>Connections</h5>' in await response.data
@pytest.mark.asyncio
async def test_data_fetch1(create_inverter_server):
async def test_data_fetch1(client, create_inverter_server):
"""Test the data-fetch route with server connection."""
_ = create_inverter_server
client = app.test_client()
response = await client.get('/data-fetch')
assert response.status_code == 200
response = await client.get('/data-fetch')
assert response.status_code == 200
assert b'<h5>Connections</h5>' in await response.data
@pytest.mark.asyncio
async def test_data_fetch2(create_inverter_client):
async def test_data_fetch2(client, create_inverter_client):
"""Test the data-fetch route with client connection."""
_ = create_inverter_client
client = app.test_client()
response = await client.get('/data-fetch')
assert response.status_code == 200
response = await client.get('/data-fetch')
assert response.status_code == 200
assert b'<h5>Connections</h5>' in await response.data
@pytest.mark.asyncio
async def test_language_en(client):
@@ -159,21 +180,44 @@ async def test_language_en(client):
assert response.content_language.pop() == 'en'
assert response.location == '/index'
assert response.mimetype == 'text/html'
assert b'<html lang=en' in await response.data
assert b'<title>Redirecting...</title>' in await response.data
client.set_cookie('test', key='language', value='de')
response = await client.get('/mqtt')
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'<html lang="en"' in await response.data
assert b'<title>TSUN Proxy - Connections</title>' in await response.data
@pytest.mark.asyncio
async def test_language_de(client):
"""Test the language/de route."""
response = await client.get('/language/de', headers={'referer': '/'})
assert response.status_code == 302
assert response.content_language.pop() == 'de'
assert response.location == '/'
assert response.mimetype == 'text/html'
assert b'<html lang=en>' in await response.data
assert b'<title>Redirecting...</title>' in await response.data
client.set_cookie('test', key='language', value='en')
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'<html lang="de"' in await response.data
# the following assert fails on the GitHub runner, since the translation to German fails
# assert b'<title>TSUN Proxy - Verbindungen</title>' in await response.data
# Switch back to English
response = await client.get('/language/en', headers={'referer': '/index'})
assert response.status_code == 302
assert response.content_language.pop() == 'en'
assert response.location == '/index'
assert response.mimetype == 'text/html'
assert b'<html lang=en>' in await response.data
assert b'<title>Redirecting...</title>' in await response.data
@pytest.mark.asyncio
async def test_language_unknown(client):
@@ -182,6 +226,12 @@ async def test_language_unknown(client):
assert response.status_code == 404
assert response.mimetype == 'text/html'
client.set_cookie('test', key='language', value='en')
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'<title>TSUN Proxy - Connections</title>' in await response.data
@pytest.mark.asyncio
async def test_mqtt_fetch(client, create_inverter):
@@ -191,15 +241,47 @@ async def test_mqtt_fetch(client, create_inverter):
response = await client.get('/mqtt-fetch')
assert response.status_code == 200
assert b'<h5>MQTT devices</h5>' in await response.data
@pytest.mark.asyncio
async def test_notes_fetch(client, config_conn):
"""Test the notes-fetch route."""
_ = config_conn
s = FakeServer()
s.src_dir = 'app/src/'
s.init_logging_system()
# First clear the log and test the "Well done" message
logh = LogHandler()
logh.clear()
response = await client.get('/notes-fetch')
assert response.status_code == 200
assert b'<h2>Well done!</h2>' in await response.data
# Check info logs which must be ignored here
logging.info('config_info')
logh.flush()
response = await client.get('/notes-fetch')
assert response.status_code == 200
assert b'<h2>Well done!</h2>' in await response.data
# Check warning logs which must be added to the note list
logging.warning('config_warning')
logh.flush()
response = await client.get('/notes-fetch')
assert response.status_code == 200
assert b'WARNING' in await response.data
assert b'config_warning' in await response.data
# Check error logs which must be added to the note list
logging.error('config_err')
logh.flush()
response = await client.get('/notes-fetch')
assert response.status_code == 200
assert b'ERROR' in await response.data
assert b'config_err' in await response.data
@pytest.mark.asyncio
@@ -229,6 +311,7 @@ async def test_file_fetch(client, config_conn, monkeypatch):
monkeypatch.delattr(stat_result, "st_birthtime")
response = await client.get('/file-fetch')
assert response.status_code == 200
assert b'<h4>test.txt</h4>' in await response.data
@pytest.mark.asyncio
async def test_send_file(client, config_conn):
@@ -237,6 +320,7 @@ async def test_send_file(client, config_conn):
assert Config.log_path == 'app/tests/log/'
response = await client.get('/send-file/test.txt')
assert response.status_code == 200
assert b'2025-04-30 00:01:23' in await response.data
@pytest.mark.asyncio
@@ -291,3 +375,20 @@ async def test_del_file_err(client, config_conn, patch_os_remove_err):
assert Config.log_path == 'app/tests/log/'
response = await client.delete ('/del-file/test.txt')
assert response.status_code == 404
@pytest.mark.asyncio
async def test_addon_links(client):
"""Test links to HA add-on config/log in UI"""
with patch.dict(os.environ, {'SLUG': 'c676133d', 'HOSTNAME': 'c676133d-tsun-proxy'}):
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'Add-on Config' in await response.data
assert b'href="/hassio/addon/c676133d_tsun-proxy/logs' in await response.data
assert b'href="/hassio/addon/c676133d_tsun-proxy/config' in await response.data
# check that links are not available if env vars SLUG and HOSTNAME are not defined (docker version)
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'Add-on Config' not in await response.data
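The asserted hrefs pair the SLUG and HOSTNAME environment variables exported by the add-on's run.sh (shown further below) with an add-on id of the form c676133d_tsun-proxy. One plausible derivation that reproduces the values used in this test, shown only as an illustration and not as the proxy's actual code:

import os

slug = os.environ.get('SLUG', '')          # e.g. 'c676133d'
hostname = os.environ.get('HOSTNAME', '')  # e.g. 'c676133d-tsun-proxy'
addon_id = hostname.replace('-', '_', 1) if slug and hostname else ''
# -> 'c676133d_tsun-proxy', matching href="/hassio/addon/c676133d_tsun-proxy/..."
# when either variable is missing, addon_id stays empty and no links are rendered,
# which matches the second request in the test above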


@@ -8,7 +8,7 @@ msgid ""
msgstr ""
"Project-Id-Version: tsun-gen3-proxy 0.14.0\n"
"Report-Msgid-Bugs-To: EMAIL@ADDRESS\n"
"POT-Creation-Date: 2025-05-13 20:55+0200\n"
"POT-Creation-Date: 2025-05-13 22:34+0200\n"
"PO-Revision-Date: 2025-04-18 16:24+0200\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: de\n"
@@ -19,27 +19,27 @@ msgstr ""
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.17.0\n"
#: src/web/conn_table.py:52 src/web/templates/base.html.j2:58
#: src/web/conn_table.py:53 src/web/templates/base.html.j2:58
msgid "Connections"
msgstr "Verbindungen"
#: src/web/conn_table.py:59
#: src/web/conn_table.py:60
msgid "Device-IP:Port"
msgstr "Geräte-IP:Port"
#: src/web/conn_table.py:59
#: src/web/conn_table.py:60
msgid "Device-IP"
msgstr "Geräte-IP"
#: src/web/conn_table.py:60 src/web/mqtt_table.py:34
#: src/web/conn_table.py:61 src/web/mqtt_table.py:34
msgid "Serial-No"
msgstr "Seriennummer"
#: src/web/conn_table.py:61
#: src/web/conn_table.py:62
msgid "Cloud-IP:Port"
msgstr "Cloud-IP:Port"
#: src/web/conn_table.py:61
#: src/web/conn_table.py:62
msgid "Cloud-IP"
msgstr "Cloud-IP"
@@ -75,6 +75,14 @@ msgstr "Wichtige Hinweise"
msgid "Log Files"
msgstr "Log Dateien"
#: src/web/templates/base.html.j2:64
msgid "Add-on Config"
msgstr "Add-on Konfiguration"
#: src/web/templates/base.html.j2:65
msgid "Add-on Log"
msgstr "Add-on Protokoll"
#: src/web/templates/page_index.html.j2:3
msgid "TSUN Proxy - Connections"
msgstr "TSUN Proxy - Verbindungen"
@@ -120,12 +128,13 @@ msgid "TSUN Proxy - Log Files"
msgstr "TSUN Proxy - Log Dateien"
#: src/web/templates/page_logging.html.j2:10
msgid "Do you really want to delete the log file"
msgstr "Soll die Datei wirklich gelöscht werden"
#, python-format
msgid "Do you really want to delete the log file: <br>%(file)s ?"
msgstr "Soll die Datei: <br>%(file)s<br>wirklich gelöscht werden?"
#: src/web/templates/page_logging.html.j2:12
msgid "Delete File</button"
msgstr "File löschen"
msgid "Delete File"
msgstr "Datei löschen"
#: src/web/templates/page_logging.html.j2:13
msgid "Abort"


@@ -192,7 +192,7 @@ $(repro_all_subdirs) :
mkdir -p $@
$(repro_all_templates) : $(INST_BASE)/ha_addon_%/config.yaml: $(TEMPL)/config.jinja $(TEMPL)/%_data.json $(SRC)/.version FORCE
$(JINJA) --strict -D AppVersion=$(VERSION)-$* -D BuildID=$(BUILD_ID) $< $(filter %.json,$^) -o $@
$(JINJA) --strict -D AppVersion=$(VERSION)-$*$(RC) -D BuildID=$(BUILD_ID) $< $(filter %.json,$^) -o $@
$(repro_all_apparmor) : $(INST_BASE)/ha_addon_%/apparmor.txt: $(TEMPL)/apparmor.jinja $(TEMPL)/%_data.json
$(JINJA) --strict $< $(filter %.json,$^) -o $@


@@ -29,27 +29,23 @@ target "_common" {
"type =sbom,generator=docker/scout-sbom-indexer:latest"
]
annotations = [
"index:io.hass.version=${VERSION}",
"index:io.hass.type=addon",
"index:io.hass.arch=armhf|aarch64|i386|amd64",
"index:org.opencontainers.image.title=TSUN-Proxy",
"index:org.opencontainers.image.authors=Stefan Allius",
"index:org.opencontainers.image.created=${BUILD_DATE}",
"index:org.opencontainers.image.version=${VERSION}",
"index:org.opencontainers.image.revision=${BRANCH}",
"index:org.opencontainers.image.description=${DESCRIPTION}",
"index:io.hass.arch=aarch64|amd64",
"index,manifest-descriptor:org.opencontainers.image.title=TSUN-Proxy",
"index,manifest-descriptor:org.opencontainers.image.authors=Stefan Allius",
"index,manifest-descriptor:org.opencontainers.image.created=${BUILD_DATE}",
"index,manifest-descriptor:org.opencontainers.image.version=${VERSION}",
"index,manifest-descriptor:org.opencontainers.image.description=${DESCRIPTION}",
"index:org.opencontainers.image.licenses=BSD-3-Clause",
"index:org.opencontainers.image.source=https://github.com/s-allius/tsun-gen3-proxy/ha_addons/ha_addon"
"index:org.opencontainers.image.source=https://github.com/s-allius/tsun-gen3-proxy/ha_addons/ha_addon",
]
labels = {
"io.hass.version" = "${VERSION}"
"io.hass.type" = "addon"
"io.hass.arch" = "armhf|aarch64|i386|amd64"
"io.hass.arch" = "aarch64|amd64"
"org.opencontainers.image.title" = "TSUN-Proxy"
"org.opencontainers.image.authors" = "Stefan Allius"
"org.opencontainers.image.created" = "${BUILD_DATE}"
"org.opencontainers.image.version" = "${VERSION}"
"org.opencontainers.image.revision" = "${BRANCH}"
"org.opencontainers.image.description" = "${DESCRIPTION}"
"org.opencontainers.image.licenses" = "BSD-3-Clause"
"org.opencontainers.image.source" = "https://github.com/s-allius/tsun-gen3-proxy/ha_addonsha_addon"
@@ -59,7 +55,7 @@ target "_common" {
]
no-cache = false
platforms = ["linux/amd64", "linux/arm64", "linux/arm/v7"]
platforms = ["linux/amd64", "linux/arm64"]
}
target "_debug" {


@@ -13,12 +13,12 @@
# 1 Build Base Image #
######################
ARG BUILD_FROM="ghcr.io/hassio-addons/base:17.2.5"
ARG BUILD_FROM="ghcr.io/hassio-addons/base:18.0.3"
# hadolint ignore=DL3006
FROM $BUILD_FROM AS base
# Install Python, pip and virtual environment tools
RUN apk add --no-cache python3=3.12.10-r0 py3-pip=24.3.1-r0 && \
RUN apk add --no-cache python3=3.12.11-r0 py3-pip=25.1.1-r0 && \
python -m venv /opt/venv && \
. /opt/venv/bin/activate


@@ -1,18 +1,46 @@
#!/usr/bin/with-contenv bashio
echo "Add-on environment started"
bashio::log.blue "-----------------------------------------------------------"
bashio::log.blue "run.sh: info: setup Add-on environment"
bashio::cache.flush_all
MQTT_HOST=""
SLUG=""
HOSTNAME=""
if bashio::supervisor.ping; then
bashio::log "run.sh: info: check Home Assistant bashio for config values"
if bashio::services.available mqtt; then
MQTT_HOST=$(bashio::services mqtt "host")
MQTT_PORT=$(bashio::services mqtt "port")
MQTT_USER=$(bashio::services mqtt "username")
MQTT_PASSWORD=$(bashio::services mqtt "password")
else
bashio::log.yellow "run.sh: info: Home Assistant MQTT service not available!"
fi
SLUG=$(bashio::addon.repository)
HOSTNAME=$(bashio::addon.hostname)
else
bashio::log.red "run.sh: error: Home Assistant Supervisor API not available!"
fi
echo "check for Home Assistant MQTT"
MQTT_HOST=$(bashio::services mqtt "host")
MQTT_PORT=$(bashio::services mqtt "port")
MQTT_USER=$(bashio::services mqtt "username")
MQTT_PASSWORD=$(bashio::services mqtt "password")
if [ -z "$SLUG" ]; then
bashio::log.yellow "run.sh: info: addon slug not found"
else
bashio::log.green "run.sh: info: found addon slug: $SLUG"
export SLUG
fi
if [ -z "$HOSTNAME" ]; then
bashio::log.yellow "run.sh: info: addon hostname not found"
else
bashio::log.green "run.sh: info: found addon hostname: $HOSTNAME"
export HOSTNAME
fi
# note whether an MQTT config was found
if [ -z "$MQTT_HOST" ]; then
echo "MQTT not found"
bashio::log.yellow "run.sh: info: MQTT config not found"
else
echo "MQTT found"
bashio::log.green "run.sh: info: found MQTT config"
export MQTT_HOST
export MQTT_PORT
export MQTT_USER
@@ -29,5 +57,6 @@ cd /home/proxy || exit
export VERSION=$(cat /proxy-version.txt)
echo "Start Proxyserver..."
bashio::log.blue "run.sh: info: Start Proxyserver..."
bashio::log.blue "-----------------------------------------------------------"
python3 server.py --rel_urls --json_config=/data/options.json --log_path=/homeassistant/tsun-proxy/logs/ --config_path=/homeassistant/tsun-proxy/ --log_backups=2


@@ -10,8 +10,6 @@ init: false
arch:
- aarch64
- amd64
- armhf
- armv7
startup: services
homeassistant_api: true
map:


@@ -2,7 +2,6 @@
{
"name": "TSUN-Proxy (Release Candidate)",
"description": "MQTT Proxy for TSUN Photovoltaic Inverters",
"version": "rc",
"image": "ghcr.io/s-allius/tsun-gen3-addon",
"slug": "tsun-proxy-rc",
"advanced": true,