Compare commits

..

25 Commits

Author SHA1 Message Date
Stefan Allius
bca026bb64 initial implementation 2025-08-14 17:25:28 +02:00
renovate[bot]
e126f4e780 Update dependency pytest-asyncio to v1.1.0 (#476)
* Update dependency pytest-asyncio to v1.1.0

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-07-16 20:36:52 +02:00
Stefan Allius
7da7d6f15c Save task references (#475)
* Save a task reference

Important: Save a reference of the created task,
to avoid a task disappearing mid-execution. The
event loop only keeps weak references to tasks.
A task that isn’t referenced elsewhere may get
garbage collected at any time, even before it’s
done. For reliable “fire-and-forget” background
tasks, gather them in a collection
2025-07-16 20:15:21 +02:00
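The pattern described in the commit message above can be sketched as follows; `fire_and_forget` is a hypothetical helper name for illustration, not taken from the proxy's code:

```python
import asyncio

background_tasks = set()  # strong references keep pending tasks alive

def fire_and_forget(coro):
    # The event loop keeps only weak references to tasks, so a task
    # without an external reference may be garbage collected before it
    # finishes. Store it in a set and discard it once it is done.
    task = asyncio.create_task(coro)
    background_tasks.add(task)
    task.add_done_callback(background_tasks.discard)
    return task

async def main():
    fire_and_forget(asyncio.sleep(0))
    await asyncio.sleep(0.01)  # give the background task time to finish

asyncio.run(main())
```

The `add_done_callback(set.discard)` step is what keeps the collection from growing without bound.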
Stefan Allius
8c3f3ba827 S allius/issue472 (#473)
* catch socket.gaierror exception

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-15 21:09:29 +02:00
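A minimal sketch of catching `socket.gaierror` (raised when DNS resolution fails) alongside the usual connection errors, as in the commit above; the `connect` helper is illustrative, not the proxy's actual implementation:

```python
import asyncio
import socket

async def connect(host, port):
    try:
        return await asyncio.open_connection(host, port)
    except (ConnectionRefusedError, TimeoutError, socket.gaierror) as err:
        # socket.gaierror signals a failed DNS lookup; log it at info
        # level and carry on instead of letting the exception escape
        print(f'connect to {host}:{port} failed: {err}')
        return None
```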
renovate[bot]
0b05f6cd9a Update dependency coverage to v7.9.2 (#470)
* Update dependency coverage to v7.9.2

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-07-15 20:23:01 +02:00
renovate[bot]
0e35a506e0 Update ghcr.io/hassio-addons/base Docker tag to v18.0.3 (#469)
* update python and pip to compatible versions

* Update ghcr.io/hassio-addons/base Docker tag to v18.0.3

* add-on: remove armhf and armv7 support

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-07-15 20:13:55 +02:00
renovate[bot]
eba2c3e452 Update ghcr.io/hassio-addons/base Docker tag to v18 (#468)
* Update ghcr.io/hassio-addons/base Docker tag to v18

* improve docker annotations

* update python and pip to compatible versions

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-06-29 21:47:37 +02:00
renovate[bot]
118fab8b6c Update dependency python-dotenv to v1.1.1 (#467)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-24 18:24:28 +02:00
Stefan Allius
d25f142e10 add links to add-on urls (#466)
* add links to add-on urls

* Add translations

* set app.testing to get exceptions during test

* improve unit-tests for the web-UI

* update changelog

* extend languages tests

* workaround for github runner
2025-06-22 21:39:31 +02:00
Stefan Allius
eb59e19c0a Fix Sonar Qube errors and warnings (#464)
* replace constructor call with a literal

  https://sonarcloud.io/project/issues?open=AZeMhhlEyR1Wrs09sNyb&id=s-allius_tsun-gen3-proxy

* re-raise cancel error after cleanup

https://sonarcloud.io/project/issues?open=AZeMhhltyR1Wrs09sNyc&id=s-allius_tsun-gen3-proxy

* remove duplicated line

* change send_modbus_cmd into a synchronous function

* make send_start_cmd synchronous

https://sonarcloud.io/project/issues?open=AZeMhhhyyR1Wrs09sNya&id=s-allius_tsun-gen3-proxy

* make more functions synchronous

* update changelog
2025-06-21 12:18:48 +02:00
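The "re-raise cancel error after cleanup" fix follows a general asyncio rule: a coroutine that catches `asyncio.CancelledError` to run cleanup must re-raise it, otherwise the cancellation is silently swallowed and the caller never sees it. A self-contained sketch (names are illustrative):

```python
import asyncio

async def worker(log):
    try:
        await asyncio.sleep(3600)
    except asyncio.CancelledError:
        log.append("cleanup done")  # run cleanup first ...
        raise                       # ... then re-raise for the caller

async def main():
    log = []
    task = asyncio.create_task(worker(log))
    await asyncio.sleep(0)  # let the worker reach its await point
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        log.append("cancellation propagated")
    return log

result = asyncio.run(main())
```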
Stefan Allius
bacebbd649 S allius/issue456 (#462)
* - remove unused 32-bit architectures from the prebuild multiarch containers

* update po file
2025-06-21 10:41:47 +02:00
renovate[bot]
ebbb675e63 Update dependency flake8 to v7.3.0 (#459)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-21 10:27:45 +02:00
Stefan Allius
04fd9ed7f6 S allius/issue460 (#461)
* - Improve Makefile

* - Babel: don't build a new po file if only the pot creation-date was changed
2025-06-21 10:26:17 +02:00
renovate[bot]
f3c22c9853 Update dependency pytest to v8.4.1 (#458)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-20 10:51:02 +02:00
renovate[bot]
460db31fa6 Update python Docker tag to v3.13.5 (#453)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-20 10:46:35 +02:00
renovate[bot]
144c9080cb Update dependency coverage to v7.9.1 (#454)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-15 22:43:00 +02:00
renovate[bot]
dc1a28260e Update dependency coverage to v7.9.0 (#450)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-12 22:16:56 +02:00
renovate[bot]
e59529adc0 Update dependency pytest-cov to v6.2.1 (#449)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-12 22:13:41 +02:00
renovate[bot]
8d93b2a636 Update python Docker tag to v3.13.4 (#446)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-12 22:12:01 +02:00
renovate[bot]
01e9e70957 Update dependency pytest to v8.4.0 (#444)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-12 22:11:37 +02:00
renovate[bot]
1721bbebe2 Update dependency pytest-asyncio to v1 (#433)
* Update dependency pytest-asyncio to v1

* set version to 0.15.0

* Update dependency pytest-asyncio to v1

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-05-31 23:55:50 +02:00
Stefan Allius
41168fbb4d S allius/issue438 (#442)
* Update change log (#436)

* S allius/issue427 (#434)

* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request

* improve test coverage

* S allius/issue427 (#435)

* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request

* improve test coverage

* improve test case

* version 0.14.0

* handle missing MQTT addon

- we have to check whether the Supervisor API and an
MQTT broker add-on are available. If not, we assume
the user has an external MQTT broker

* handle missing MQTT addon

* run also on releases/* branch

* avoid printing the MQTT config incl. the password

* revise the log outputs

* update version 0.14.1

* new version 0.14.1
2025-05-31 23:30:16 +02:00
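Detecting a Supervisor-managed MQTT broker, as described in the "handle missing MQTT addon" entries above, can be sketched as below. This is an assumption-laden illustration: the `http://supervisor/services/mqtt` endpoint and the `SUPERVISOR_TOKEN` variable come from the Home Assistant Supervisor API, and the helper name is hypothetical, not the proxy's code:

```python
import os
import urllib.request
import urllib.error

def mqtt_addon_available() -> bool:
    """Return True if a Supervisor-managed MQTT broker is reachable.

    Outside an add-on there is no SUPERVISOR_TOKEN, so we assume the
    user runs an external MQTT broker.
    """
    token = os.getenv("SUPERVISOR_TOKEN")
    if not token:
        return False  # no Supervisor API -> external MQTT broker
    req = urllib.request.Request(
        "http://supervisor/services/mqtt",
        headers={"Authorization": f"Bearer {token}"})
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```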
Stefan Allius
25ba6ef8f3 version 0.14.0 (#441) 2025-05-31 23:27:49 +02:00
Stefan Allius
2a40bd7b71 S allius/issue427 (#435)
* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request

* improve test coverage

* improve test case
2025-05-26 23:42:13 +02:00
Stefan Allius
95182d2196 S allius/issue427 (#434)
* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request

* improve test coverage
2025-05-26 23:16:33 +02:00
34 changed files with 381 additions and 145 deletions

View File

@@ -5,7 +5,7 @@ name: Python application
on:
push:
branches: [ "main", "dev-*", "*/issue*" ]
branches: [ "main", "dev-*", "*/issue*", "releases/*" ]
paths-ignore:
- '**.md' # Do no build on *.md changes
- '**.yml' # Do no build on *.yml changes
@@ -18,7 +18,7 @@ on:
- '**.dockerfile' # Do no build on *.dockerfile changes
- '**.sh' # Do no build on *.sh changes
pull_request:
branches: [ "main", "dev-*" ]
branches: [ "main", "dev-*", "releases/*" ]
permissions:
contents: read

View File

@@ -1 +1 @@
3.13.2
3.13.5

View File

@@ -7,6 +7,25 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [unreleased]
- Update dependency pytest-asyncio to v1.1.0
- save task references to avoid tasks disappearing mid-execution
- catch socket.gaierror exception and log this with info level
- Update dependency coverage to v7.9.2
- add-on: bump base-image to version 18.0.3
- add-on: remove armhf and armv7 support
- add-on: add links to config and log-file to the web-UI
- fix some SonarQube warnings
- remove unused 32-bit architectures
- Babel: don't build a new po file if only the pot creation-date was changed
- Improve Makefile
- Update dependency pytest-asyncio to v1
## [0.14.1] - 2025-05-31
- handle missing MQTT addon [#438](https://github.com/s-allius/tsun-gen3-proxy/issues/438)
## [0.14.0] - 2025-05-29
- add-on: bump python to version 3.12.10-r1
- set no of pv modules for MS800 GEN3PLUS inverters
- fix the paths to copy the config.example.toml file during proxy start

View File

@@ -1,27 +1,37 @@
.PHONY: build babel clean addon-dev addon-debug addon-rc addon-rel debug dev preview rc rel check-docker-compose install
.PHONY: help build babel clean addon-dev addon-debug addon-rc addon-rel debug dev preview rc rel check-docker-compose install
babel:
help: ## show help message
@awk 'BEGIN {FS = ":.*##"; printf "\nUsage:\n make \033[36m\033[0m\n"} /^[$$()% a-zA-Z0-9_-]+:.*?##/ { printf " \033[36m%-15s\033[0m %s\n", $$1, $$2 } /^##@/ { printf "\n\033[1m%s\033[0m\n", substr($$0, 5) } ' $(MAKEFILE_LIST)
babel: ## build language files
$(MAKE) -C app $@
build:
$(MAKE) -C ha_addons $@
clean:
clean: ## delete all built files
$(MAKE) -C app $@
$(MAKE) -C ha_addons $@
debug dev preview rc rel:
debug dev preview rc rel: ## build docker container in <dev|debg|rc|rel> version
$(MAKE) -C app babel
$(MAKE) -C app $@
addon-dev addon-debug addon-rc addon-rel:
addon-dev addon-debug addon-rc addon-rel: ## build HA add-on in <dev|debg|rc|rel> version
$(MAKE) -C app babel
$(MAKE) -C ha_addons $(patsubst addon-%,%,$@)
check-docker-compose:
check-docker-compose: ## check the docker-compose file
docker-compose config -q
install:
python3 -m pip install --upgrade pip
python3 -m pip install -r requirements.txt
python3 -m pip install -r requirements-test.txt
PY_VER := $(shell cat .python-version)
install: ## install requirements into the pyenv and switch to proper venv
@pyenv local $(PY_VER) || { pyenv install $(PY_VER) && pyenv local $(PY_VER) || exit 1; }
@pyenv exec pip install --upgrade pip
@pyenv exec pip install -r requirements.txt
@pyenv exec pip install -r requirements-test.txt
pyenv exec python --version
run: ## run proxy locally out of the actual venv
pyenv exec python app/src/server.py -c /app/src/cnf

View File

@@ -1 +1 @@
0.14.0
0.15.0

View File

@@ -55,7 +55,7 @@ $(BABEL_TRANSLATIONS)/%.pot : $(SRC)/.babel.cfg $(BABEL_INPUT)
$(BABEL_TRANSLATIONS)/%/LC_MESSAGES/messages.po : $(BABEL_TRANSLATIONS)/messages.pot
@mkdir -p $(@D)
@pybabel update --init-missing -i $< -d $(BABEL_TRANSLATIONS) -l $*
@pybabel update --init-missing --ignore-pot-creation-date -i $< -d $(BABEL_TRANSLATIONS) -l $*
$(BABEL_TRANSLATIONS)/%/LC_MESSAGES/messages.mo : $(BABEL_TRANSLATIONS)/%/LC_MESSAGES/messages.po
@pybabel compile -d $(BABEL_TRANSLATIONS) -l $*

View File

@@ -29,17 +29,17 @@ target "_common" {
"type =sbom,generator=docker/scout-sbom-indexer:latest"
]
annotations = [
"index:org.opencontainers.image.title=TSUN Gen3 Proxy",
"index:org.opencontainers.image.authors=Stefan Allius",
"index:org.opencontainers.image.created=${BUILD_DATE}",
"index:org.opencontainers.image.version=${VERSION}",
"index:org.opencontainers.image.revision=${BRANCH}",
"index:org.opencontainers.image.description=${DESCRIPTION}",
"index,manifest-descriptor:org.opencontainers.image.title=TSUN-Proxy",
"index,manifest-descriptor:org.opencontainers.image.authors=Stefan Allius",
"index,manifest-descriptor:org.opencontainers.image.created=${BUILD_DATE}",
"index,manifest-descriptor:org.opencontainers.image.version=${VERSION}",
"index,manifest-descriptor:org.opencontainers.image.revision=${BRANCH}",
"index,manifest-descriptor:org.opencontainers.image.description=${DESCRIPTION}",
"index:org.opencontainers.image.licenses=BSD-3-Clause",
"index:org.opencontainers.image.source=https://github.com/s-allius/tsun-gen3-proxy"
]
labels = {
"org.opencontainers.image.title" = "TSUN Gen3 Proxy"
"org.opencontainers.image.title" = "TSUN-Proxy"
"org.opencontainers.image.authors" = "Stefan Allius"
"org.opencontainers.image.created" = "${BUILD_DATE}"
"org.opencontainers.image.version" = "${VERSION}"
@@ -53,7 +53,7 @@ target "_common" {
]
no-cache = false
platforms = ["linux/amd64", "linux/arm64", "linux/arm/v7"]
platforms = ["linux/amd64", "linux/arm64"]
}
target "_debug" {

View File

@@ -1,8 +1,8 @@
flake8==7.2.0
pytest==8.3.5
pytest-asyncio==0.26.0
pytest-cov==6.1.1
python-dotenv==1.1.0
flake8==7.3.0
pytest==8.4.1
pytest-asyncio==1.1.0
pytest-cov==6.2.1
python-dotenv==1.1.1
mock==5.2.0
coverage==7.8.2
coverage==7.9.2
jinja2-cli==0.8.2

View File

@@ -102,3 +102,7 @@ class AsyncIfc(ABC):
@abstractmethod
def prot_set_update_header_cb(self, callback):
pass # pragma: no cover
@abstractmethod
def prot_set_disc_cb(self, callback):
pass # pragma: no cover

View File

@@ -29,6 +29,7 @@ class AsyncIfcImpl(AsyncIfc):
self.timeout_cb = None
self.init_new_client_conn_cb = None
self.update_header_cb = None
self.inv_disc_cb = None
def close(self):
self.timeout_cb = None
@@ -106,6 +107,9 @@ class AsyncIfcImpl(AsyncIfc):
def prot_set_update_header_cb(self, callback):
self.update_header_cb = callback
def prot_set_disc_cb(self, callback):
self.inv_disc_cb = callback
class StreamPtr():
'''Descr StreamPtr'''
@@ -330,6 +334,8 @@ class AsyncStreamServer(AsyncStream):
Infos.inc_counter('ServerMode_Cnt')
await self.publish_outstanding_mqtt()
await self.loop()
if self.inv_disc_cb:
self.inv_disc_cb()
Infos.dec_counter('ServerMode_Cnt')
Infos.dec_counter('Inverter_Cnt')
await self.publish_outstanding_mqtt()
@@ -386,6 +392,8 @@ class AsyncStreamClient(AsyncStream):
Infos.inc_counter('ProxyMode_Cnt')
await self.publish_outstanding_mqtt()
await self.loop()
if self.inv_disc_cb:
self.inv_disc_cb()
if self.emu_mode:
Infos.dec_counter('EmuMode_Cnt')
else:

View File

@@ -36,6 +36,7 @@ class Talent(Message):
def __init__(self, inverter, addr, ifc: "AsyncIfc", server_side: bool,
client_mode: bool = False, id_str=b''):
self.db = InfosG3()
super().__init__('G3', ifc, server_side, self.send_modbus_cb,
mb_timeout=15)
_ = inverter
@@ -51,7 +52,6 @@ class Talent(Message):
self.contact_name = b''
self.contact_mail = b''
self.ts_offset = 0 # time offset between tsun cloud and local
self.db = InfosG3()
self.switch = {
0x00: self.msg_contact_info,
0x13: self.msg_ota_update,

View File

@@ -256,11 +256,11 @@ class SolarmanV5(SolarmanBase):
def __init__(self, inverter, addr, ifc: "AsyncIfc",
server_side: bool, client_mode: bool):
self.db = InfosG3P(client_mode)
super().__init__(addr, ifc, server_side, self.send_modbus_cb,
mb_timeout=8)
self.inverter = inverter
self.db = InfosG3P(client_mode)
self.no_forwarding = False
'''not allowed to connect to TSUN cloud by connection type'''
self.establish_inv_emu = False
@@ -327,6 +327,7 @@ class SolarmanV5(SolarmanBase):
self.sensor_list = 0
self.mb_regs = [{'addr': 0x3000, 'len': 48},
{'addr': 0x2000, 'len': 96}]
self.background_tasks = set()
'''
Our puplic methods
@@ -339,11 +340,12 @@ class SolarmanV5(SolarmanBase):
self.inverter = None
self.switch.clear()
self.log_lvl.clear()
self.background_tasks.clear()
super().close()
async def send_start_cmd(self, snr: int, host: str,
forward: bool,
start_timeout=MB_CLIENT_DATA_UP):
def send_start_cmd(self, snr: int, host: str,
forward: bool,
start_timeout=MB_CLIENT_DATA_UP):
self.no_forwarding = True
self.establish_inv_emu = forward
self.snr = snr
@@ -690,8 +692,10 @@ class SolarmanV5(SolarmanBase):
self.__forward_msg()
def publish_mqtt(self, key, data): # pragma: no cover
asyncio.ensure_future(
task = asyncio.ensure_future(
Proxy.mqtt.publish(key, data))
self.background_tasks.add(task)
task.add_done_callback(self.background_tasks.discard)
def get_cmd_rsp_log_lvl(self) -> int:
ftype = self.ifc.rx_peek()[self.header_len]

View File

@@ -31,6 +31,7 @@ class Register(Enum):
GRID_VOLT_CAL_COEF = 29
OUTPUT_COEFFICIENT = 30
PROD_COMPL_TYPE = 31
AVAIL_STATUS = 32
INVERTER_CNT = 50
UNKNOWN_SNR = 51
UNKNOWN_MSG = 52
@@ -577,6 +578,7 @@ class Infos:
__output_coef_val_tpl = "{% if 'Output_Coefficient' in value_json and value_json['Output_Coefficient'] != None %}{{value_json['Output_Coefficient']|string() +' %'}}{% else %}{{ this.state }}{% endif %}" # noqa: E501
__info_defs = {
Register.AVAIL_STATUS: {'name': ['status', 'status']},
# collector values used for device registration:
Register.COLLECTOR_FW_VERSION: {'name': ['collector', 'Collector_Fw_Version'], 'level': logging.INFO, 'unit': ''}, # noqa: E501
Register.CHIP_TYPE: {'name': ['collector', 'Chip_Type'], 'singleton': False, 'level': logging.DEBUG, 'unit': ''}, # noqa: E501
@@ -946,6 +948,9 @@ class Infos:
attr['dev_cla'] = ha['dev_cla']
attr['stat_cla'] = ha['stat_cla']
attr['uniq_id'] = ha['id']+snr
# attr['availability_topic'] = prfx + "status"
# attr['payload_available'] = "online"
# attr['payload_not_available'] = "offline"
if 'val_tpl' in ha:
attr['val_tpl'] = ha['val_tpl']
elif 'fmt' in ha:

View File

@@ -4,6 +4,7 @@ import logging
import traceback
import json
import gc
import socket
from aiomqtt import MqttCodeError
from asyncio import StreamReader, StreamWriter
from ipaddress import ip_address
@@ -38,6 +39,7 @@ class InverterBase(InverterIfc, Proxy):
self.use_emulation = False
self.__ha_restarts = -1
self.remote = StreamPtr(None)
self.background_tasks = set()
ifc = AsyncStreamServer(reader, writer,
self.async_publ_mqtt,
self.create_remote,
@@ -72,6 +74,7 @@ class InverterBase(InverterIfc, Proxy):
if self.remote.ifc:
self.remote.ifc.close()
self.remote.ifc = None
self.background_tasks.clear()
async def disc(self, shutdown_started=False) -> None:
if self.remote.stream:
@@ -136,9 +139,14 @@ class InverterBase(InverterIfc, Proxy):
logging.info(f'[{self.remote.stream.node_id}:'
f'{self.remote.stream.conn_no}] '
f'Connected to {addr}')
asyncio.create_task(self.remote.ifc.client_loop(addr))
task = asyncio.create_task(
self.remote.ifc.client_loop(addr))
self.background_tasks.add(task)
task.add_done_callback(self.background_tasks.discard)
except (ConnectionRefusedError, TimeoutError) as error:
except (ConnectionRefusedError,
TimeoutError,
socket.gaierror) as error:
logging.info(f'{error}')
except Exception:
Infos.inc_counter('SW_Exception')
@@ -159,6 +167,8 @@ class InverterBase(InverterIfc, Proxy):
stream.new_data['batterie'])
or ('collector' in stream.new_data and
stream.new_data['collector'])
or ('status' in stream.new_data and
stream.new_data['status'])
or self.mqtt.ha_restarts != self.__ha_restarts):
await self._register_proxy_stat_home_assistant()
await self.__register_home_assistant(stream)

View File

@@ -98,7 +98,11 @@ class Message(ProtocolIfc):
self.server_side = server_side
self.ifc = ifc
self.node_id = node_id
self.new_data = {}
if server_side:
ifc.prot_set_disc_cb(self._inv_disc)
self.db.set_db_def_value(Register.AVAIL_STATUS, "on")
self.new_data['status'] = True
self.mb = Modbus(send_modbus_cb, mb_timeout)
self.mb_timer = Timer(self.mb_timout_cb, self.node_id)
else:
@@ -110,7 +114,6 @@ class Message(ProtocolIfc):
self.unique_id = 0
self.inv_serial = ''
self.sug_area = ''
self.new_data = {}
self.state = State.init
self.shutdown_started = False
self.modbus_elms = 0 # for unit tests
@@ -193,7 +196,7 @@ class Message(ProtocolIfc):
return
self.mb.build_msg(dev_id, func, addr, val, log_lvl)
async def send_modbus_cmd(self, func, addr, val, log_lvl) -> None:
def send_modbus_cmd(self, func, addr, val, log_lvl) -> None:
self._send_modbus_cmd(Modbus.INV_ADDR, func, addr, val, log_lvl)
def _send_modbus_scan(self):
@@ -220,6 +223,11 @@ class Message(ProtocolIfc):
f'(reg: 0x{self.mb.last_reg:04x}):',
data[hdr_len:], modbus_msg_len)
def _inv_disc(self):
logging.warning(f"Un-Available: [{self.node_id}]")
self.db.set_db_def_value(Register.AVAIL_STATUS, "off")
self.new_data['status'] = True
'''
Our puplic methods
'''
@@ -237,6 +245,7 @@ class Message(ProtocolIfc):
self.ifc.prot_set_timeout_cb(None)
self.ifc.prot_set_init_new_client_conn_cb(None)
self.ifc.prot_set_update_header_cb(None)
self.ifc.prot_set_disc_cb(None)
self.ifc = None
if self.mb:

View File

@@ -35,6 +35,8 @@ class ModbusConn():
async def __aexit__(self, exc_type, exc, tb):
Infos.dec_counter('ClientMode_Cnt')
Infos.dec_counter('Inverter_Cnt')
if self.inverter.local.ifc.inv_disc_cb:
self.inverter.local.ifc.inv_disc_cb()
await self.inverter.local.ifc.publish_outstanding_mqtt()
self.inverter.__exit__(exc_type, exc, tb)
@@ -43,6 +45,7 @@ class ModbusTcp():
def __init__(self, loop, tim_restart=10) -> None:
self.tim_restart = tim_restart
self.background_tasks = set()
inverters = Config.get('inverters')
batteries = Config.get('batteries')
@@ -54,10 +57,13 @@ class ModbusTcp():
and 'client_mode' in inv):
client = inv['client_mode']
logger.info(f"'client_mode' for Monitoring-SN: {inv['monitor_sn']} host: {client['host']}:{client['port']}, forward: {client['forward']}") # noqa: E501
loop.create_task(self.modbus_loop(client['host'],
client['port'],
inv['monitor_sn'],
client['forward']))
task = loop.create_task(
self.modbus_loop(client['host'],
client['port'],
inv['monitor_sn'],
client['forward']))
self.background_tasks.add(task)
task.add_done_callback(self.background_tasks.discard)
async def modbus_loop(self, host, port,
snr: int, forward: bool) -> None:
@@ -66,7 +72,7 @@ class ModbusTcp():
try:
async with ModbusConn(host, port) as inverter:
stream = inverter.local.stream
await stream.send_start_cmd(snr, host, forward)
stream.send_start_cmd(snr, host, forward)
await stream.ifc.loop()
logger.info(f'[{stream.node_id}:{stream.conn_no}] '
f'Connection closed - Shutdown: '

View File

@@ -112,7 +112,7 @@ class Mqtt(metaclass=Singleton):
except asyncio.CancelledError:
logger_mqtt.debug("MQTT task cancelled")
self.__client = None
return
raise
except Exception:
# self.inc_counter('SW_Exception') # fixme
self.ctime = None
@@ -151,7 +151,7 @@ class Mqtt(metaclass=Singleton):
if self.__cb_mqtt_is_up:
await self.__cb_mqtt_is_up()
async def _out_coeff(self, message):
def _out_coeff(self, message):
payload = message.payload.decode("UTF-8")
try:
val = round(float(payload) * 1024/100)
@@ -160,9 +160,9 @@ class Mqtt(metaclass=Singleton):
'the range 0..100,'
f' got: {payload}')
else:
await self._modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
0, 0x202c, val)
self._modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
0, 0x202c, val)
except Exception:
pass
@@ -182,7 +182,7 @@ class Mqtt(metaclass=Singleton):
else:
logger_mqtt.warning(f'Node_id: {node_id} not found')
async def _modbus_cmd(self, message, func, params=0, addr=0, val=0):
def _modbus_cmd(self, message, func, params=0, addr=0, val=0):
payload = message.payload.decode("UTF-8")
for fnc in self.each_inverter(message, "send_modbus_cmd"):
res = payload.split(',')
@@ -195,7 +195,7 @@ class Mqtt(metaclass=Singleton):
elif params == 2:
addr = int(res[0], base=16)
val = int(res[1]) # lenght
await fnc(func, addr, val, logging.INFO)
fnc(func, addr, val, logging.INFO)
async def _at_cmd(self, message):
payload = message.payload.decode("UTF-8")

View File

@@ -60,7 +60,16 @@ class Server():
@app.context_processor
def utility_processor():
return dict(version=self.version)
var = {'version': self.version,
'slug': os.getenv("SLUG"),
'hostname': os.getenv("HOSTNAME"),
}
if var['slug']:
var['hassio'] = True
slug_len = len(var['slug'])
var['addonname'] = var['slug'] + '_' + \
var['hostname'][slug_len+1:]
return var
def parse_args(self, arg_list: list[str] | None):
parser = argparse.ArgumentParser()
@@ -209,6 +218,7 @@ app = Quart(__name__,
static_folder='web/static')
app.secret_key = 'JKLdks.dajlKKKdladkflKwolafallsdfl'
app.jinja_env.globals.update(url_for=url_for)
app.background_tasks = set()
server = Server(app, __name__ == "__main__")
Web(app, server.trans_path, server.rel_urls)
@@ -259,9 +269,13 @@ async def startup_app(): # pragma: no cover
for inv_class, port in [(InverterG3, 5005), (InverterG3P, 10000)]:
logging.info(f'listen on port: {port} for inverters')
loop.create_task(asyncio.start_server(lambda r, w, i=inv_class:
handle_client(r, w, i),
'0.0.0.0', port))
task = loop.create_task(
asyncio.start_server(lambda r, w, i=inv_class:
handle_client(r, w, i),
'0.0.0.0', port))
app.background_tasks.add(task)
task.add_done_callback(app.background_tasks.discard)
ProxyState.set_up(True)
@@ -285,6 +299,7 @@ async def handle_shutdown(): # pragma: no cover
await inverter.disc(True)
logging.info('Proxy disconnecting done')
app.background_tasks.clear()
await Proxy.class_close(loop)

View File

@@ -29,9 +29,9 @@ def get_tz():
@web.context_processor
def utility_processor():
return dict(lang=babel_get_locale(),
lang_str=LANGUAGES.get(str(babel_get_locale()), "English"),
languages=LANGUAGES)
return {'lang': babel_get_locale(),
'lang_str': LANGUAGES.get(str(babel_get_locale()), "English"),
'languages': LANGUAGES}
@web.route('/language/<language>')

View File

@@ -22,3 +22,6 @@ class LogHandler(Handler, metaclass=Singleton):
def get_buffer(self, elms=0) -> list:
return list(self.buffer)[-elms:]
def clear(self):
self.buffer.clear()

View File

@@ -7,3 +7,4 @@
.fa-rotate-right:before{content:"\f01e"}
.fa-cloud-arrow-down-alt:before{content:"\f381"}
.fa-cloud-arrow-up-alt:before{content:"\f382"}
.fa-gear:before{content:"\f013"}

View File

@@ -59,6 +59,11 @@
<a href="{{ url_for('.mqtt')}}" class="w3-bar-item w3-button w3-padding {% block menu2_class %}{% endblock %}"><i class="fa fa-database fa-fw"></i>  MQTT</a>
<a href="{{ url_for('.notes')}}" class="w3-bar-item w3-button w3-padding {% block menu3_class %}{% endblock %}"><i class="fa fa-info fa-fw"></i>  {{_('Important Messages')}}</a>
<a href="{{ url_for('.logging')}}" class="w3-bar-item w3-button w3-padding {% block menu4_class %}{% endblock %}"><i class="fa fa-file-export fa-fw"></i>  {{_('Log Files')}}</a>
{% if hassio is defined %}
<br>
<a href="/hassio/addon/{{addonname}}/config" target="_top" class="w3-bar-item w3-button w3-padding"><i class="fa fa-gear fa-fw"></i>  {{_('Add-on Config')}}</a>
<a href="/hassio/addon/{{addonname}}/logs" target="_top" class="w3-bar-item w3-button w3-padding"><i class="fa fa-file fa-fw"></i>  {{_('Add-on Log')}}</a>
{% endif %}
</div>
</nav>

View File

@@ -1,19 +1,19 @@
2025-04-30 00:01:23 INFO | root | Server "proxy - unknown" will be started
2025-04-30 00:01:23 INFO | root | current dir: /Users/sallius/tsun/tsun-gen3-proxy
2025-04-30 00:01:23 INFO | root | config_path: ./config/
2025-04-30 00:01:23 INFO | root | json_config: None
2025-04-30 00:01:23 INFO | root | toml_config: None
2025-04-30 00:01:23 INFO | root | trans_path: ../translations/
2025-04-30 00:01:23 INFO | root | rel_urls: False
2025-04-30 00:01:23 INFO | root | log_path: ./log/
2025-04-30 00:01:23 INFO | root | log_backups: unlimited
2025-04-30 00:01:23 INFO | root | LOG_LVL : None
2025-04-30 00:01:23 INFO | root | ******
2025-04-30 00:01:23 INFO | root | Read from /Users/sallius/tsun/tsun-gen3-proxy/app/src/cnf/default_config.toml => ok
2025-04-30 00:01:23 INFO | root | Read from environment => ok
2025-04-30 00:01:23 INFO | root | Read from ./config/config.json => n/a
2025-04-30 00:01:23 INFO | root | Read from ./config/config.toml => n/a
2025-04-30 00:01:23 INFO | root | ******
2025-04-30 00:01:23 INFO | root | listen on port: 5005 for inverters
2025-04-30 00:01:23 INFO | root | listen on port: 10000 for inverters
2025-04-30 00:01:23 INFO | root | Start Quart
2025-04-30 00:01:24 INFO | root | current dir: /Users/sallius/tsun/tsun-gen3-proxy
2025-04-30 00:01:25 INFO | root | config_path: ./config/
2025-04-30 00:01:26 INFO | root | json_config: None
2025-04-30 00:01:27 INFO | root | toml_config: None
2025-04-30 00:01:28 INFO | root | trans_path: ../translations/
2025-04-30 00:01:29 INFO | root | rel_urls: False
2025-04-30 00:01:30 INFO | root | log_path: ./log/
2025-04-30 00:01:31 INFO | root | log_backups: unlimited
2025-04-30 00:01:32 INFO | root | LOG_LVL : None
2025-04-30 00:01:33 INFO | root | ******
2025-04-30 00:01:34 INFO | root | Read from /Users/sallius/tsun/tsun-gen3-proxy/app/src/cnf/default_config.toml => ok
2025-04-30 00:01:35 INFO | root | Read from environment => ok
2025-04-30 00:01:36 INFO | root | Read from ./config/config.json => n/a
2025-04-30 00:01:37 INFO | root | Read from ./config/config.toml => n/a
2025-04-30 00:01:38 INFO | root | ******
2025-04-30 00:01:39 INFO | root | listen on port: 5005 for inverters
2025-04-30 00:01:40 INFO | root | listen on port: 10000 for inverters
2025-04-30 00:01:41 INFO | root | Start Quart

View File

@@ -286,23 +286,23 @@ async def test_mqtt_dispatch(config_mqtt_conn, aiomqtt_mock, spy_modbus_cmd):
assert m.ha_restarts == 1
await m.receive(topic= 'tsun/inv_1/rated_load', payload= b'2')
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x2008, 2, logging.INFO)
spy.assert_called_once_with(Modbus.WRITE_SINGLE_REG, 0x2008, 2, logging.INFO)
spy.reset_mock()
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'100')
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 1024, logging.INFO)
spy.assert_called_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 1024, logging.INFO)
spy.reset_mock()
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'50')
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 512, logging.INFO)
spy.assert_called_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 512, logging.INFO)
spy.reset_mock()
await m.receive(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10')
spy.assert_awaited_once_with(Modbus.READ_REGS, 0x3000, 10, logging.INFO)
spy.assert_called_once_with(Modbus.READ_REGS, 0x3000, 10, logging.INFO)
spy.reset_mock()
await m.receive(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10')
spy.assert_awaited_once_with(Modbus.READ_INPUTS, 0x3000, 10, logging.INFO)
spy.assert_called_once_with(Modbus.READ_INPUTS, 0x3000, 10, logging.INFO)
# test dispatching with empty mapping table
m.topic_defs.clear()

View File

@@ -191,6 +191,7 @@ class TestApp:
"""Test the ready route."""
ProxyState.set_up(False)
app.testing = True
client = app.test_client()
response = await client.get('/-/ready')
assert response.status_code == 503
@@ -211,6 +212,7 @@ class TestApp:
with InverterBase(reader, writer, 'tsun', Talent):
ProxyState.set_up(False)
app.testing = True
client = app.test_client()
response = await client.get('/-/healthy')
assert response.status_code == 200
@@ -240,6 +242,7 @@ class TestApp:
with caplog.at_level(logging.INFO) and InverterBase(reader, writer, 'tsun', Talent):
ProxyState.set_up(False)
app.testing = True
client = app.test_client()
response = await client.get('/-/healthy')
assert response.status_code == 200
@@ -271,6 +274,7 @@ class TestApp:
with caplog.at_level(logging.INFO) and InverterBase(reader, writer, 'tsun', Talent):
ProxyState.set_up(False)
app.testing = True
client = app.test_client()
response = await client.get('/-/healthy')
assert response.status_code == 200

View File

@@ -1598,18 +1598,18 @@ async def test_msg_iterator(my_loop, config_tsun_inv1):
@pytest.mark.asyncio
async def test_proxy_counter(my_loop, config_tsun_inv1):
m = SolarmanV5(None, ('test.local', 1234), ifc=AsyncIfcImpl(), server_side=True, client_mode=False)
assert m.new_data == {}
assert m.new_data == {'status': True}
m.db.stat['proxy']['Unknown_Msg'] = 0
Infos.new_stat_data['proxy'] = False
m.inc_counter('Unknown_Msg')
assert m.new_data == {}
assert m.new_data == {'status': True}
assert Infos.new_stat_data == {'proxy': True}
assert 1 == m.db.stat['proxy']['Unknown_Msg']
Infos.new_stat_data['proxy'] = False
m.dec_counter('Unknown_Msg')
assert m.new_data == {}
assert m.new_data == {'status': True}
assert Infos.new_stat_data == {'proxy': True}
assert 0 == m.db.stat['proxy']['Unknown_Msg']
m.close()
@@ -1624,7 +1624,7 @@ async def test_msg_build_modbus_req(my_loop, config_tsun_inv1, device_ind_msg, d
assert m.ifc.tx_fifo.get()==device_rsp_msg
assert m.ifc.fwd_fifo.get()==device_ind_msg
await m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
assert 0 == m.send_msg_ofs
assert m.ifc.fwd_fifo.get() == b''
assert m.sent_pdu == b'' # modbus command must be ignore, cause connection is still not up
@@ -1642,7 +1642,7 @@ async def test_msg_build_modbus_req(my_loop, config_tsun_inv1, device_ind_msg, d
assert m.ifc.tx_fifo.get()==inverter_rsp_msg
assert m.ifc.fwd_fifo.get()==inverter_ind_msg
await m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
assert 0 == m.send_msg_ofs
assert m.ifc.fwd_fifo.get() == b''
assert m.sent_pdu == msg_modbus_cmd
@@ -2318,7 +2318,7 @@ async def test_start_client_mode(my_loop, config_tsun_inv1, str_test_ip):
assert m.no_forwarding == False
assert m.mb_timer.tim == None
assert asyncio.get_running_loop() == m.mb_timer.loop
await m.send_start_cmd(get_sn_int(), str_test_ip, False, m.mb_first_timeout)
m.send_start_cmd(get_sn_int(), str_test_ip, False, m.mb_first_timeout)
assert m.sent_pdu==bytearray(b'\xa5\x17\x00\x10E\x01\x00!Ce{\x02\xb0\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x030\x00\x000J\xde\xf1\x15')
assert m.db.get_db_value(Register.IP_ADDRESS) == str_test_ip
assert isclose(m.db.get_db_value(Register.POLLING_INTERVAL), 0.5)
@@ -2351,7 +2351,7 @@ async def test_start_client_mode_scan(config_tsun_scan_dcu, str_test_ip, dcu_mod
assert m.no_forwarding == False
assert m.mb_timer.tim == None
assert asyncio.get_running_loop() == m.mb_timer.loop
await m.send_start_cmd(get_dcu_sn_int(), str_test_ip, False, m.mb_first_timeout)
m.send_start_cmd(get_dcu_sn_int(), str_test_ip, False, m.mb_first_timeout)
assert m.mb_start_reg == 0x0000
assert m.mb_step == 0x100
assert m.mb_bytes == 0x2d


@@ -144,7 +144,7 @@ async def test_emu_start(my_loop, config_tsun_inv1, msg_modbus_rsp, str_test_ip,
inv = InvStream(msg_modbus_rsp)
assert asyncio.get_running_loop() == inv.mb_timer.loop
await inv.send_start_cmd(get_sn_int(), str_test_ip, True, inv.mb_first_timeout)
inv.send_start_cmd(get_sn_int(), str_test_ip, True, inv.mb_first_timeout)
inv.read() # read complete msg, and dispatch msg
assert not inv.header_valid # must be invalid, since msg was handled and buffer flushed
assert inv.msg_count == 1
@@ -161,7 +161,7 @@ async def test_snd_hb(my_loop, config_tsun_inv1, heartbeat_ind):
inv = InvStream()
cld = CldStream(inv)
# await inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
# inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
cld.send_heartbeat_cb(0)
assert cld.ifc.tx_fifo.peek() == heartbeat_ind
cld.close()
@@ -178,7 +178,7 @@ async def test_snd_inv_data(my_loop, config_tsun_inv1, inverter_ind_msg, inverte
inv.db.set_db_def_value(Register.GRID_FREQUENCY, 50.05)
inv.db.set_db_def_value(Register.PROD_COMPL_TYPE, 6)
assert asyncio.get_running_loop() == inv.mb_timer.loop
await inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.db.set_db_def_value(Register.DATA_UP_INTERVAL, 17) # set test value
cld = CldStream(inv)
@@ -213,7 +213,7 @@ async def test_rcv_invalid(my_loop, config_tsun_inv1, inverter_ind_msg, inverter
_ = config_tsun_inv1
inv = InvStream()
assert asyncio.get_running_loop() == inv.mb_timer.loop
await inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.db.set_db_def_value(Register.DATA_UP_INTERVAL, 17) # set test value
cld = CldStream(inv)


@@ -2070,7 +2070,7 @@ def test_proxy_counter():
m.id_str = b"R170000000000001"
c = m.createClientStream(b'')
assert m.new_data == {}
assert m.new_data == {'status': True}
m.db.stat['proxy']['Unknown_Msg'] = 0
c.db.stat['proxy']['Unknown_Msg'] = 0
Infos.new_stat_data['proxy'] = False
@@ -2079,7 +2079,7 @@ def test_proxy_counter():
m.close()
m = MemoryStream(b'')
assert m.new_data == {}
assert m.new_data == {'status': True}
assert Infos.new_stat_data == {'proxy': True}
assert m.db.new_stat_data == {'proxy': True}
assert c.db.new_stat_data == {'proxy': True}
@@ -2088,7 +2088,7 @@ def test_proxy_counter():
Infos.new_stat_data['proxy'] = False
c.inc_counter('Unknown_Msg')
assert m.new_data == {}
assert m.new_data == {'status': True}
assert Infos.new_stat_data == {'proxy': True}
assert m.db.new_stat_data == {'proxy': True}
assert c.db.new_stat_data == {'proxy': True}
@@ -2097,7 +2097,7 @@ def test_proxy_counter():
Infos.new_stat_data['proxy'] = False
c.inc_counter('Modbus_Command')
assert m.new_data == {}
assert m.new_data == {'status': True}
assert Infos.new_stat_data == {'proxy': True}
assert m.db.new_stat_data == {'proxy': True}
assert c.db.new_stat_data == {'proxy': True}
@@ -2106,7 +2106,7 @@ def test_proxy_counter():
Infos.new_stat_data['proxy'] = False
m.dec_counter('Unknown_Msg')
assert m.new_data == {}
assert m.new_data == {'status': True}
assert Infos.new_stat_data == {'proxy': True}
assert 1 == m.db.stat['proxy']['Unknown_Msg']
m.close()
@@ -2258,7 +2258,7 @@ def test_msg_modbus_rsp2(config_tsun_inv1, msg_modbus_rsp20):
m.mb.req_pend = True
m.mb.err = 0
assert m.db.db == {}
assert m.db.db == {'status': {'status': 'on'}}
m.new_data['inverter'] = False
m.read() # read complete msg, and dispatch msg
@@ -2267,7 +2267,7 @@ def test_msg_modbus_rsp2(config_tsun_inv1, msg_modbus_rsp20):
assert m.msg_count == 2
assert m.ifc.fwd_fifo.get()==msg_modbus_rsp20
assert m.ifc.tx_fifo.get()==b''
assert m.db.db == {'collector': {'Serial_Number': 'R170000000000001'}, 'inverter': {'Version': 'V5.1.09', 'Rated_Power': 300}, 'grid': {'Timestamp': m._utc(), 'Voltage': 225.9, 'Current': 0.41, 'Frequency': 49.99, 'Output_Power': 94.8}, 'env': {'Inverter_Temp': 22}, 'input': {'Timestamp': m._utc(), 'pv1': {'Voltage': 0.8, 'Current': 0.0, 'Power': 0.0}, 'pv2': {'Voltage': 34.5, 'Current': 2.89, 'Power': 99.8}, 'pv3': {'Voltage': 0.0, 'Current': 0.0, 'Power': 0.0}, 'pv4': {'Voltage': 0.0, 'Current': 0.0, 'Power': 0.0}}}
assert m.db.db == {'status': {'status': 'on'}, 'collector': {'Serial_Number': 'R170000000000001'}, 'inverter': {'Version': 'V5.1.09', 'Rated_Power': 300}, 'grid': {'Timestamp': m._utc(), 'Voltage': 225.9, 'Current': 0.41, 'Frequency': 49.99, 'Output_Power': 94.8}, 'env': {'Inverter_Temp': 22}, 'input': {'Timestamp': m._utc(), 'pv1': {'Voltage': 0.8, 'Current': 0.0, 'Power': 0.0}, 'pv2': {'Voltage': 34.5, 'Current': 2.89, 'Power': 99.8}, 'pv3': {'Voltage': 0.0, 'Current': 0.0, 'Power': 0.0}, 'pv4': {'Voltage': 0.0, 'Current': 0.0, 'Power': 0.0}}}
assert m.db.get_db_value(Register.VERSION) == 'V5.1.09'
assert m.db.get_db_value(Register.TS_GRID) == m._utc()
assert m.new_data['inverter'] == True
@@ -2288,7 +2288,7 @@ def test_msg_modbus_rsp3(config_tsun_inv1, msg_modbus_rsp21):
m.mb.req_pend = True
m.mb.err = 0
assert m.db.db == {}
assert m.db.db == {'status': {'status': 'on'}}
m.new_data['inverter'] = False
m.read() # read complete msg, and dispatch msg
@@ -2297,7 +2297,7 @@ def test_msg_modbus_rsp3(config_tsun_inv1, msg_modbus_rsp21):
assert m.msg_count == 2
assert m.ifc.fwd_fifo.get()==msg_modbus_rsp21
assert m.ifc.tx_fifo.get()==b''
assert m.db.db == {'collector': {'Serial_Number': 'R170000000000001'}, 'inverter': {'Version': 'V5.1.0E', 'Rated_Power': 300}, 'grid': {'Timestamp': m._utc(), 'Voltage': 225.9, 'Current': 0.41, 'Frequency': 49.99, 'Output_Power': 94.8}, 'env': {'Inverter_Temp': 22}, 'input': {'Timestamp': m._utc(), 'pv1': {'Voltage': 0.8, 'Current': 0.0, 'Power': 0.0}, 'pv2': {'Voltage': 34.5, 'Current': 2.89, 'Power': 99.8}, 'pv3': {'Voltage': 0.0, 'Current': 0.0, 'Power': 0.0}, 'pv4': {'Voltage': 0.0, 'Current': 0.0, 'Power': 0.0}}}
assert m.db.db == {'status': {'status': 'on'}, 'collector': {'Serial_Number': 'R170000000000001'}, 'inverter': {'Version': 'V5.1.0E', 'Rated_Power': 300}, 'grid': {'Timestamp': m._utc(), 'Voltage': 225.9, 'Current': 0.41, 'Frequency': 49.99, 'Output_Power': 94.8}, 'env': {'Inverter_Temp': 22}, 'input': {'Timestamp': m._utc(), 'pv1': {'Voltage': 0.8, 'Current': 0.0, 'Power': 0.0}, 'pv2': {'Voltage': 34.5, 'Current': 2.89, 'Power': 99.8}, 'pv3': {'Voltage': 0.0, 'Current': 0.0, 'Power': 0.0}, 'pv4': {'Voltage': 0.0, 'Current': 0.0, 'Power': 0.0}}}
assert m.db.get_db_value(Register.VERSION) == 'V5.1.0E'
assert m.db.get_db_value(Register.TS_GRID) == m._utc()
assert m.new_data['inverter'] == True
@@ -2411,14 +2411,14 @@ async def test_msg_build_modbus_req(config_tsun_inv1, msg_modbus_cmd):
_ = config_tsun_inv1
m = MemoryStream(b'', (0,), True)
m.id_str = b"R170000000000001"
await m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
assert 0 == m.send_msg_ofs
assert m.ifc.fwd_fifo.get() == b''
assert m.ifc.tx_fifo.get() == b''
assert m.sent_pdu == b''
m.state = State.up
await m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
m.send_modbus_cmd(Modbus.WRITE_SINGLE_REG, 0x2008, 0, logging.DEBUG)
assert 0 == m.send_msg_ofs
assert m.ifc.fwd_fifo.get() == b''
assert m.ifc.tx_fifo.get() == b''


@@ -1,22 +1,37 @@
# test_with_pytest.py
import pytest
from server import app
from web import Web, web
import logging
import os, errno
import datetime
from os import DirEntry, stat_result
from quart import current_app
from mock import patch
from server import app as my_app
from server import Server
from web import web
from async_stream import AsyncStreamClient
from gen3plus.inverter_g3p import InverterG3P
from web.log_handler import LogHandler
from test_inverter_g3p import FakeReader, FakeWriter, config_conn
from cnf.config import Config
from mock import patch
from proxy import Proxy
import os, errno
from os import DirEntry, stat_result
import datetime
class FakeServer(Server):
def __init__(self):
pass  # don't call super().__init__() for unit tests
pytest_plugins = ('pytest_asyncio',)
@pytest.fixture(scope="session")
def app():
yield my_app
@pytest.fixture(scope="session")
def client():
def client(app):
app.secret_key = 'super secret key'
app.testing = True
return app.test_client()
@pytest.fixture
@@ -52,6 +67,7 @@ async def test_home(client):
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b"<title>TSUN Proxy - Connections</title>" in await response.data
@pytest.mark.asyncio
async def test_page(client):
@@ -59,14 +75,17 @@ async def test_page(client):
response = await client.get('/mqtt')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b"<title>TSUN Proxy - MQTT Status</title>" in await response.data
assert b'fetch("/mqtt-fetch")' in await response.data
@pytest.mark.asyncio
async def test_rel_page(client):
"""Test the mqtt route."""
"""Test the mqtt route with relative paths."""
web.build_relative_urls = True
response = await client.get('/mqtt')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'fetch("./mqtt-fetch")' in await response.data
web.build_relative_urls = False
@pytest.mark.asyncio
@@ -75,6 +94,7 @@ async def test_notes(client):
response = await client.get('/notes')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b"<title>TSUN Proxy - Important Messages</title>" in await response.data
@pytest.mark.asyncio
async def test_logging(client):
@@ -82,6 +102,7 @@ async def test_logging(client):
response = await client.get('/logging')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b"<title>TSUN Proxy - Log Files</title>" in await response.data
@pytest.mark.asyncio
async def test_favicon96(client):
@@ -119,37 +140,37 @@ async def test_manifest(client):
assert response.mimetype == 'application/manifest+json'
@pytest.mark.asyncio
async def test_data_fetch(create_inverter):
async def test_data_fetch(client, create_inverter):
"""Test the data-fetch route."""
_ = create_inverter
client = app.test_client()
response = await client.get('/data-fetch')
assert response.status_code == 200
response = await client.get('/data-fetch')
assert response.status_code == 200
assert b'<h5>Connections</h5>' in await response.data
@pytest.mark.asyncio
async def test_data_fetch1(create_inverter_server):
async def test_data_fetch1(client, create_inverter_server):
"""Test the data-fetch route with server connection."""
_ = create_inverter_server
client = app.test_client()
response = await client.get('/data-fetch')
assert response.status_code == 200
response = await client.get('/data-fetch')
assert response.status_code == 200
assert b'<h5>Connections</h5>' in await response.data
@pytest.mark.asyncio
async def test_data_fetch2(create_inverter_client):
async def test_data_fetch2(client, create_inverter_client):
"""Test the data-fetch route with client connection."""
_ = create_inverter_client
client = app.test_client()
response = await client.get('/data-fetch')
assert response.status_code == 200
response = await client.get('/data-fetch')
assert response.status_code == 200
assert b'<h5>Connections</h5>' in await response.data
@pytest.mark.asyncio
async def test_language_en(client):
@@ -159,21 +180,44 @@ async def test_language_en(client):
assert response.content_language.pop() == 'en'
assert response.location == '/index'
assert response.mimetype == 'text/html'
assert b'<html lang=en' in await response.data
assert b'<title>Redirecting...</title>' in await response.data
client.set_cookie('test', key='language', value='de')
response = await client.get('/mqtt')
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'<html lang="en"' in await response.data
assert b'<title>TSUN Proxy - Connections</title>' in await response.data
@pytest.mark.asyncio
async def test_language_de(client):
"""Test the language/de route."""
response = await client.get('/language/de', headers={'referer': '/'})
assert response.status_code == 302
assert response.content_language.pop() == 'de'
assert response.location == '/'
assert response.mimetype == 'text/html'
assert b'<html lang=en>' in await response.data
assert b'<title>Redirecting...</title>' in await response.data
client.set_cookie('test', key='language', value='en')
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'<html lang="de"' in await response.data
# the following assert fails on github runner, since the translation to german fails
# assert b'<title>TSUN Proxy - Verbindungen</title>' in await response.data
"""Switch back to english"""
response = await client.get('/language/en', headers={'referer': '/index'})
assert response.status_code == 302
assert response.content_language.pop() == 'en'
assert response.location == '/index'
assert response.mimetype == 'text/html'
assert b'<html lang=en>' in await response.data
assert b'<title>Redirecting...</title>' in await response.data
@pytest.mark.asyncio
async def test_language_unknown(client):
@@ -182,6 +226,12 @@ async def test_language_unknown(client):
assert response.status_code == 404
assert response.mimetype == 'text/html'
client.set_cookie('test', key='language', value='en')
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'<title>TSUN Proxy - Connections</title>' in await response.data
@pytest.mark.asyncio
async def test_mqtt_fetch(client, create_inverter):
@@ -191,15 +241,47 @@ async def test_mqtt_fetch(client, create_inverter):
response = await client.get('/mqtt-fetch')
assert response.status_code == 200
assert b'<h5>MQTT devices</h5>' in await response.data
@pytest.mark.asyncio
async def test_notes_fetch(client, config_conn):
"""Test the notes-fetch route."""
_ = create_inverter
_ = config_conn
s = FakeServer()
s.src_dir = 'app/src/'
s.init_logging_system()
# First clear log and test Well done message
logh = LogHandler()
logh.clear()
response = await client.get('/notes-fetch')
assert response.status_code == 200
assert b'<h2>Well done!</h2>' in await response.data
# Check info logs which must be ignored here
logging.info('config_info')
logh.flush()
response = await client.get('/notes-fetch')
assert response.status_code == 200
assert b'<h2>Well done!</h2>' in await response.data
# Check warning logs which must be added to the note list
logging.warning('config_warning')
logh.flush()
response = await client.get('/notes-fetch')
assert response.status_code == 200
assert b'WARNING' in await response.data
assert b'config_warning' in await response.data
# Check error logs which must be added to the note list
logging.error('config_err')
logh.flush()
response = await client.get('/notes-fetch')
assert response.status_code == 200
assert b'ERROR' in await response.data
assert b'config_err' in await response.data
@pytest.mark.asyncio
@@ -229,6 +311,7 @@ async def test_file_fetch(client, config_conn, monkeypatch):
monkeypatch.delattr(stat_result, "st_birthtime")
response = await client.get('/file-fetch')
assert response.status_code == 200
assert b'<h4>test.txt</h4>' in await response.data
@pytest.mark.asyncio
async def test_send_file(client, config_conn):
@@ -237,6 +320,7 @@ async def test_send_file(client, config_conn):
assert Config.log_path == 'app/tests/log/'
response = await client.get('/send-file/test.txt')
assert response.status_code == 200
assert b'2025-04-30 00:01:23' in await response.data
@pytest.mark.asyncio
@@ -291,3 +375,20 @@ async def test_del_file_err(client, config_conn, patch_os_remove_err):
assert Config.log_path == 'app/tests/log/'
response = await client.delete ('/del-file/test.txt')
assert response.status_code == 404
@pytest.mark.asyncio
async def test_addon_links(client):
"""Test links to HA add-on config/log in UI"""
with patch.dict(os.environ, {'SLUG': 'c676133d', 'HOSTNAME': 'c676133d-tsun-proxy'}):
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'Add-on Config' in await response.data
assert b'href="/hassio/addon/c676133d_tsun-proxy/logs' in await response.data
assert b'href="/hassio/addon/c676133d_tsun-proxy/config' in await response.data
# check that links are not available if env vars SLUG and HOSTNAME are not defined (docker version)
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
assert b'Add-on Config' not in await response.data


@@ -75,6 +75,14 @@ msgstr "Wichtige Hinweise"
msgid "Log Files"
msgstr "Log Dateien"
#: src/web/templates/base.html.j2:64
msgid "Add-on Config"
msgstr "Add-on Konfiguration"
#: src/web/templates/base.html.j2:65
msgid "Add-on Log"
msgstr "Add-on Protokoll"
#: src/web/templates/page_index.html.j2:3
msgid "TSUN Proxy - Connections"
msgstr "TSUN Proxy - Verbindungen"
@@ -120,6 +128,7 @@ msgid "TSUN Proxy - Log Files"
msgstr "TSUN Proxy - Log Dateien"
#: src/web/templates/page_logging.html.j2:10
#, python-format
msgid "Do you really want to delete the log file: <br>%(file)s ?"
msgstr "Soll die Datei: <br>%(file)s<br>wirklich gelöscht werden?"


@@ -29,27 +29,23 @@ target "_common" {
"type =sbom,generator=docker/scout-sbom-indexer:latest"
]
annotations = [
"index:io.hass.version=${VERSION}",
"index:io.hass.type=addon",
"index:io.hass.arch=armhf|aarch64|i386|amd64",
"index:org.opencontainers.image.title=TSUN-Proxy",
"index:org.opencontainers.image.authors=Stefan Allius",
"index:org.opencontainers.image.created=${BUILD_DATE}",
"index:org.opencontainers.image.version=${VERSION}",
"index:org.opencontainers.image.revision=${BRANCH}",
"index:org.opencontainers.image.description=${DESCRIPTION}",
"index:io.hass.arch=aarch64|amd64",
"index,manifest-descriptor:org.opencontainers.image.title=TSUN-Proxy",
"index,manifest-descriptor:org.opencontainers.image.authors=Stefan Allius",
"index,manifest-descriptor:org.opencontainers.image.created=${BUILD_DATE}",
"index,manifest-descriptor:org.opencontainers.image.version=${VERSION}",
"index,manifest-descriptor:org.opencontainers.image.description=${DESCRIPTION}",
"index:org.opencontainers.image.licenses=BSD-3-Clause",
"index:org.opencontainers.image.source=https://github.com/s-allius/tsun-gen3-proxy/ha_addons/ha_addon"
"index:org.opencontainers.image.source=https://github.com/s-allius/tsun-gen3-proxy/ha_addons/ha_addon",
]
labels = {
"io.hass.version" = "${VERSION}"
"io.hass.type" = "addon"
"io.hass.arch" = "armhf|aarch64|i386|amd64"
"io.hass.arch" = "aarch64|amd64"
"org.opencontainers.image.title" = "TSUN-Proxy"
"org.opencontainers.image.authors" = "Stefan Allius"
"org.opencontainers.image.created" = "${BUILD_DATE}"
"org.opencontainers.image.version" = "${VERSION}"
"org.opencontainers.image.revision" = "${BRANCH}"
"org.opencontainers.image.description" = "${DESCRIPTION}"
"org.opencontainers.image.licenses" = "BSD-3-Clause"
"org.opencontainers.image.source" = "https://github.com/s-allius/tsun-gen3-proxy/ha_addonsha_addon"
@@ -59,7 +55,7 @@ target "_common" {
]
no-cache = false
platforms = ["linux/amd64", "linux/arm64", "linux/arm/v7"]
platforms = ["linux/amd64", "linux/arm64"]
}
target "_debug" {


@@ -13,12 +13,12 @@
# 1 Build Base Image #
######################
ARG BUILD_FROM="ghcr.io/hassio-addons/base:17.2.5"
ARG BUILD_FROM="ghcr.io/hassio-addons/base:18.0.3"
# hadolint ignore=DL3006
FROM $BUILD_FROM AS base
# Install Python, pip and virtual environment tools
RUN apk add --no-cache python3=3.12.10-r1 py3-pip=24.3.1-r0 && \
RUN apk add --no-cache python3=3.12.11-r0 py3-pip=25.1.1-r0 && \
python -m venv /opt/venv && \
. /opt/venv/bin/activate


@@ -1,18 +1,46 @@
#!/usr/bin/with-contenv bashio
echo "Add-on environment started"
bashio::log.blue "-----------------------------------------------------------"
bashio::log.blue "run.sh: info: setup Add-on environment"
bashio::cache.flush_all
MQTT_HOST=""
SLUG=""
HOSTNAME=""
if bashio::supervisor.ping; then
bashio::log "run.sh: info: check Home Assistant bashio for config values"
if bashio::services.available mqtt; then
MQTT_HOST=$(bashio::services mqtt "host")
MQTT_PORT=$(bashio::services mqtt "port")
MQTT_USER=$(bashio::services mqtt "username")
MQTT_PASSWORD=$(bashio::services mqtt "password")
else
bashio::log.yellow "run.sh: info: Home Assistant MQTT service not available!"
fi
SLUG=$(bashio::addon.repository)
HOSTNAME=$(bashio::addon.hostname)
else
bashio::log.red "run.sh: error: Home Assistant Supervisor API not available!"
fi
echo "check for Home Assistant MQTT"
MQTT_HOST=$(bashio::services mqtt "host")
MQTT_PORT=$(bashio::services mqtt "port")
MQTT_USER=$(bashio::services mqtt "username")
MQTT_PASSWORD=$(bashio::services mqtt "password")
if [ -z "$SLUG" ]; then
bashio::log.yellow "run.sh: info: addon slug not found"
else
bashio::log.green "run.sh: info: found addon slug: $SLUG"
export SLUG
fi
if [ -z "$HOSTNAME" ]; then
bashio::log.yellow "run.sh: info: addon hostname not found"
else
bashio::log.green "run.sh: info: found addon hostname: $HOSTNAME"
export HOSTNAME
fi
# log a note whether an MQTT config was found or not
if [ -z "$MQTT_HOST" ]; then
echo "MQTT not found"
bashio::log.yellow "run.sh: info: MQTT config not found"
else
echo "MQTT found"
bashio::log.green "run.sh: info: found MQTT config"
export MQTT_HOST
export MQTT_PORT
export MQTT_USER
@@ -29,5 +57,6 @@ cd /home/proxy || exit
export VERSION=$(cat /proxy-version.txt)
echo "Start Proxyserver..."
bashio::log.blue "run.sh: info: Start Proxyserver..."
bashio::log.blue "-----------------------------------------------------------"
python3 server.py --rel_urls --json_config=/data/options.json --log_path=/homeassistant/tsun-proxy/logs/ --config_path=/homeassistant/tsun-proxy/ --log_backups=2


@@ -10,8 +10,6 @@ init: false
arch:
- aarch64
- amd64
- armhf
- armv7
startup: services
homeassistant_api: true
map: