Compare commits


22 Commits

Author SHA1 Message Date
Stefan Allius
6d4d32b3ca S allius/update python (#430)
* add-on: bump python to version 3.12.10-r1 (#429)
2025-05-25 03:46:41 +02:00
Stefan Allius
a43e6f85ac add-on: bump python to version 3.12.10-r1 (#429) 2025-05-25 03:15:21 +02:00
Stefan Allius
8e0c6915c7 add-on: bump python to version 3.12.10-r1 2025-05-25 02:23:36 +02:00
Stefan Allius
f69f9c6d63 mock the aiomqtt library and increase coverage (#428)
* mock the aiomqtt library and increase coverage

* test inv response for a mb scan request
2025-05-25 01:34:22 +02:00
Stefan Allius
321c66838d set no of pv modules for MS800 GEN3PLUS inverters (#424)
* set no of pv modules for MS800 GEN3PLUS inverters

* fix unit test

* increase test coverage

* change the PV module handling

- by default we now set the number of modules to
  two, so with the first data from the inverter
  we only register two modules. After we determine
  the inverter model, the number can increase to
  four and more PV modules will be registered.

  With the previous default value of 4, we always
  registered 4 modules and couldn't reduce the
  number of areas when we detect that the inverter
  only supports two PV modules
2025-05-24 23:12:55 +02:00
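
A minimal sketch of the module handling described above, using illustrative helper names; the actual logic lives in the infos_g3p.py and solarman_v5.py diffs further down:

def default_no_inputs() -> int:
    # the default is now 2, so the first data from the inverter only
    # registers two PV modules
    return 2

def no_inputs_for_model(max_power: int) -> int:
    # once the inverter model is known, MS1600/MS1800/MS2000 devices get
    # four inputs; models up to 800 W keep the default of two
    if max_power >= 1600:
        return 4
    return 2

assert default_no_inputs() == 2
assert no_inputs_for_model(2000) == 4
assert no_inputs_for_model(800) == 2
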
renovate[bot]
0a8e708735 Update dependency coverage to v7.8.2 (#426)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-05-23 22:53:24 +02:00
Stefan Allius
bd88647f0b fix the paths to copy the config.example.toml file (#425) 2025-05-22 21:29:41 +02:00
renovate[bot]
bb2250bca1 Update dependency coverage to v7.8.1 (#419)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-05-21 20:51:16 +02:00
Stefan Allius
e25aa5f922 S allius/issue397 (#418)
* change icon for notes
2025-05-20 23:40:26 +02:00
Stefan Allius
46945d55e1 add dcu_power MQTT topic (#416)
* add dcu_power MQTT topic

* add DCU_COMMAND counter

* test invalid dcu_power values

* handle and test DCU Command responses

* test dcu commands from the TSUN cloud

* cleanup MQTT topic handling

* update changelog

* test MQTT error and exception handling

* increase test coverage

* test dispatcher exceptions

* fix full_topic definition in dispatch test
2025-05-20 19:54:24 +02:00
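
For reference, setting the DCU output power over the new topic could look like the following aiomqtt sketch; the broker address and the `tsun/inv_1` prefix are placeholders that depend on the local `mqtt`/`ha` configuration:

import asyncio
import aiomqtt

async def set_dcu_power() -> None:
    # placeholder broker; the real topic is <entity_prefix>/<node_id>/dcu_power
    async with aiomqtt.Client("localhost") as client:
        # payload is the output power in watts (100..800); the proxy
        # multiplies it by 10 before building the DCU command PDU
        await client.publish("tsun/inv_1/dcu_power", payload="600")

asyncio.run(set_dcu_power())
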
Stefan Allius
c1bdec0844 S allius/issue396 (#413)
* improve translation of delete modal
2025-05-13 22:53:37 +02:00
Stefan Allius
4371f3dadb S allius/issue396 (#412)
* add title to table icons

* optimize datetime formatting

* change icons

* translate n/a
2025-05-13 21:38:33 +02:00
Stefan Allius
907dcb1623 S allius/issue409 (#411)
* scan log files for a timestamp to use as the creation timestamp

* increase test coverage

* add an empty file for unit tests

- the empty file is needed for unit tests to force
  an exception when trying to scan the first line
  for a timestamp

* set timezone of scanned creation time
2025-05-13 00:38:06 +02:00
renovate[bot]
2292c5e39e Update ghcr.io/hassio-addons/base Docker tag to v17.2.5 (#407)
* Update ghcr.io/hassio-addons/base Docker tag to v17.2.5

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-05-10 20:26:00 +02:00
Stefan Allius
48965ffda9 S allius/issue398 (#406)
* setup logger for hypercorn and dashboard

* use logger.ini to setup dashboard logger

* workaround: restore the hypercorn logger config

- quart/hypercorn overwrites the logger config;
  as a workaround we restore the config at the
  beginning of a request

* fix the hypercorn log handler only once

* change proxy into an ASGI application

- move Quart init from server.py into app.py
- create Server class for config and logging setup
- restore hypercorn logging configuration after
  start of Quart/Hypercorn

* move get_log_level into Server class

* define config in test_emu_init_close

* remove Web() instance from the testcase

- when importing app.py, the blueprint Web() is
  created automatically, so a second call in test
  cases must be avoided

* add unit tests

* move code from app.py into server.py

* test the init_logging_system() method

* add HypercornLogHndl tests

* fix deprecated pytest async warning

- Cleanup pending async tasks
- fix deprecated warning about event_loop

* add unit test for error handling in build_config()

* coverage: ignore quart template files

* check print output in test_save_and_restore

* update changelog
2025-05-10 19:32:13 +02:00
renovate[bot]
f1628a0629 Update dependency aiomqtt to v2.4.0 (#404)
* Update dependency aiomqtt to v2.4.0

* update changelog

---------

Co-authored-by: Stefan Allius <122395479+s-allius@users.noreply.github.com>
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-05-04 19:20:45 +02:00
Stefan Allius
888e1475e4 S allius/issue397 (#405)
* add Dashboards log handler to all known loggers

* add list of last 3 warnings/errors to page

* add note list to page

* create LogHandler for the dashboard

- simple memory log handler which stores the last
  64 warnings/errors for the dashboard

* render warnings/errors as note list

* add page for warnings and errors

* fix double defined build target

* add well done message if no errors in the logs

* translate page titles

* more translations

* add Notes page and table for important messages

* add unit tests
2025-05-04 18:50:31 +02:00
Stefan Allius
e15db8c92a S allius/issue393 (#403)
* display proxy version on dashboard

* add MQTT page

* styles adjusted on the different pages

- use same colors
- add bordered shadow to all cards and tables

* fix unit tests

* migrate the conn table to a general table

- rename the template file
- get headline from table description

* remove footer from index page

* make version string translatable

* cleanup

* remove stripped table rows

* add mqtt info table

* translate mqtt page

* don't fetch notes list for the log-page

* fix Mqtt init call for unit tests

* add mqtt-fetch test

* check received counter in unit test
2025-05-03 23:45:10 +02:00
Stefan Allius
41515f4be3 S allius/issue401 (#402)
* add route for log file deletion

* add modal for sanity check before file deletion

* add trash icon which unhide the modal

* add more translations

* increase test coverage

* cleanup
2025-05-02 19:47:16 +02:00
Stefan Allius
aadbe6855e S allius/issue394 (#400)
* store logging path in Config class

* rename template files and page files

* jump to referer page

- after changing the language, we jump to
  the referer page, if the attribute exists

* build and send list of log-files

* rename Download page into Log files

* initialize log-path in test config

* improve dashboard unit tests

 - add log file tests
 - check content-languages after language switch

* initialize config structure for log-file tests

* add test log file to project

* add sub_dir to test log path

- non-file entries must be skipped; to test this
  we add a subdirectory to the test log directory

* add german translations

* set quart debug flag for debug versions

* update changelog
2025-05-01 19:34:46 +02:00
Stefan Allius
7542c112f7 S allius/issue395 (#399)
* add button for languages setting

* build a web module for the dashboard

- load all python modules from the local dir
- initialize Blueprint and Babel

* set a default key for secure cookies

* add translations to docker container

* improve translation build  

- clean target erases the *.pot
- don't modify the result of url_for() calls
- don't translate the language description

* translate connection table, fix icon

* build relative urls for HA ingress

* fix unit test, increase coverage
2025-04-29 00:07:59 +02:00
Stefan Allius
093ec03d60 S allius/issue391 (#392)
* design counter on connection board

* display time of last update and add reload button

* change `Updated` field to a real button

* Provide counter values for the dashboard

* change background color of the top bar

- use dark-grey instead of black to reduce the contrast

* change color of counter tiles

* test proxy connection counter handling

* prepare conn-table and notes list building

* remove obsolete menu entries

* store client mode for dashboard

* store inverters serial number for the dashboard

* store inverters serial number

* build connection table for dashboard

* add connection table to dashboard

* fix responsiveness of the tiles

* adapt unit tests

* remove test fake code

* increase test coverage, remove obsolete if statement
2025-04-24 23:12:26 +02:00
58 changed files with 2286 additions and 555 deletions

View File

@@ -1,2 +1,3 @@
[run]
branch = True
omit = app/src/web/templates/*.html.j2

View File

@@ -7,6 +7,18 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [unreleased]
- add-on: bump python to version 3.12.10-r1
- set no of pv modules for MS800 GEN3PLUS inverters
- fix the paths to copy the config.example.toml file during proxy start
- add MQTT topic `dcu_power` for setting output power on DCUs
- Update ghcr.io/hassio-addons/base Docker tag to v17.2.5
- fix a lot of pytest-asyncio problems in the unit tests
- Cleanup startup code for Quart and the Proxy
- Redirect the hypercorn traces to a separate log-file
- Configure the dashboard trace handler by the logging.ini file
- Dashboard: add Notes page and table for important messages
- Dashboard: add Log-File page
- Dashboard: add Connection page
- add web UI to add-on
- allow `Y00` serial numbers for GEN3PLUS devices

View File

@@ -1,11 +1,19 @@
.PHONY: build babel clean addon-dev addon-debug addon-rc addon-rel debug dev preview rc rel check-docker-compose install
babel debug dev preview rc rel:
babel:
$(MAKE) -C app $@
clean build:
build:
$(MAKE) -C ha_addons $@
clean:
$(MAKE) -C app $@
$(MAKE) -C ha_addons $@
debug dev preview rc rel:
$(MAKE) -C app babel
$(MAKE) -C app $@
addon-dev addon-debug addon-rc addon-rel:
$(MAKE) -C app babel
$(MAKE) -C ha_addons $(patsubst addon-%,%,$@)

View File

@@ -60,6 +60,7 @@ RUN python -m pip install --no-cache-dir --no-cache --no-index /root/wheels/* &&
# copy the content of the local src and config directory to the working directory
COPY --chmod=0700 entrypoint.sh /root/entrypoint.sh
COPY src .
COPY translations ./translations
RUN echo ${VERSION} > /proxy-version.txt \
&& date > /build-date.txt
EXPOSE 5005 8127 10000

View File

@@ -21,6 +21,8 @@ export MAJOR := $(shell echo $(VERSION) | cut -f1 -d.)
PUBLIC_URL := $(shell echo $(PUBLIC_CONTAINER_REGISTRY) | cut -f1 -d/)
PUBLIC_USER :=$(shell echo $(PUBLIC_CONTAINER_REGISTRY) | cut -f2 -d/)
clean:
rm -f $(BABEL_TRANSLATIONS)/*.pot
dev debug:
@echo version: $(VERSION) build-date: $(BUILD_DATE) image: $(PRIVAT_CONTAINER_REGISTRY)$(IMAGE)
@@ -58,4 +60,4 @@ $(BABEL_TRANSLATIONS)/%/LC_MESSAGES/messages.po : $(BABEL_TRANSLATIONS)/messages
$(BABEL_TRANSLATIONS)/%/LC_MESSAGES/messages.mo : $(BABEL_TRANSLATIONS)/%/LC_MESSAGES/messages.po
@pybabel compile -d $(BABEL_TRANSLATIONS) -l $*
.PHONY: babel debug dev preview rc rel
.PHONY: babel clean debug dev preview rc rel

View File

@@ -23,7 +23,7 @@ if [ "$user" = '0' ]; then
echo "######################################################"
echo "#"
exec su-exec $SERVICE_NAME "$@"
exec su-exec $SERVICE_NAME "$@" -tr './translations/'
else
exec "$@"
fi

View File

@@ -4,5 +4,5 @@
pytest-cov==6.1.1
python-dotenv==1.1.0
mock==5.2.0
coverage==7.8.0
coverage==7.8.2
jinja2-cli==0.8.2

View File

@@ -1,4 +1,4 @@
aiomqtt==2.3.2
aiomqtt==2.4.0
schema==0.7.7
aiocron==2.1
quart==0.20

View File

@@ -162,22 +162,25 @@ class Config():
)
@classmethod
def init(cls, def_reader: ConfigIfc) -> None | str:
def init(cls, def_reader: ConfigIfc, log_path: str = '',
cnf_path: str = 'config') -> None | str:
'''Initialise the Proxy-Config
Copy the internal default config file into the config directory
and initialise the Config with the default configuration '''
cls.err = None
cls.log_path = log_path
cls.def_config = {}
try:
# make the default config transparent by copying it
# in the config.example file
logging.debug('Copy Default Config to config.example.toml')
logging.info(
f'Copy Default Config to {cnf_path}config.example.toml')
shutil.copy2("default_config.toml",
"config/config.example.toml")
except Exception:
pass
shutil.copy2("cnf/default_config.toml",
cnf_path + "config.example.toml")
except Exception as e:
logging.error(e)
# read example config file as default configuration
try:
@@ -247,3 +250,7 @@ here. The default config reader is handled in the Config.init method'''
'''Check if the member is the default value'''
return cls.act_config.get(member) == cls.def_config.get(member)
@classmethod
def get_log_path(cls) -> str:
return cls.log_path
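
The extended Config.init() signature is consumed by the new Server.build_config() shown further down in the server.py diff; a reduced sketch of that call with illustrative paths:

from cnf.config import Config
from cnf.config_read_toml import ConfigReadToml

# cnf_path is concatenated directly with the file name, so it needs the
# trailing slash; log_path is later returned by Config.get_log_path()
Config.init(ConfigReadToml("cnf/default_config.toml"),
            log_path="./log/",
            cnf_path="./config/")
if Config.get_error() is not None:
    raise SystemExit(Config.get_error())
print(Config.get_log_path())   # -> "./log/"
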

View File

@@ -216,7 +216,7 @@ class InfosG3P(Infos):
self.set_db_def_value(Register.MANUFACTURER, 'TSUN')
self.set_db_def_value(Register.EQUIPMENT_MODEL, 'TSOL-MSxx00')
self.set_db_def_value(Register.CHIP_TYPE, 'IGEN TECH')
self.set_db_def_value(Register.NO_INPUTS, 4)
self.set_db_def_value(Register.NO_INPUTS, 2)
def __hide_topic(self, row: dict) -> bool:
if 'dep' in row:

app/src/gen3plus/solarman_v5.py Normal file → Executable file
View File

@@ -247,6 +247,7 @@ class SolarmanBase(Message):
class SolarmanV5(SolarmanBase):
AT_CMD = 1
MB_RTU_CMD = 2
DCU_CMD = 5
AT_CMD_RSP = 8
MB_CLIENT_DATA_UP = 30
'''Data up time in client mode'''
@@ -532,6 +533,26 @@ class SolarmanV5(SolarmanBase):
except Exception:
self.ifc.tx_clear()
def send_dcu_cmd(self, pdu: bytearray):
if self.sensor_list != 0x3026:
logger.debug(f'[{self.node_id}] DCU CMD not allowed,'
f' for sensor: {self.sensor_list:#04x}')
return
if self.state != State.up:
logger.warning(f'[{self.node_id}] ignore DCU CMD,'
' cause the state is not UP anymore')
return
self.inverter.forward_dcu_cmd_resp = False
self._build_header(0x4510)
self.ifc.tx_add(struct.pack('<BHLLL', self.DCU_CMD,
self.sensor_list, 0, 0, 0))
self.ifc.tx_add(pdu)
self._finish_send_msg()
self.ifc.tx_log(logging.INFO, f'Send DCU CMD :{self.addr}:')
self.ifc.tx_flush()
def __forward_msg(self):
self.forward(self.ifc.rx_peek(), self.header_len+self.data_len+2)
@@ -541,12 +562,17 @@ class SolarmanV5(SolarmanBase):
rated = db.get_db_value(Register.RATED_POWER, 0)
model = None
if max_pow == 2000:
db.set_db_def_value(Register.NO_INPUTS, 4)
if rated == 800 or rated == 600:
model = f'TSOL-MS{max_pow}({rated})'
else:
model = f'TSOL-MS{max_pow}'
elif max_pow == 1800 or max_pow == 1600:
db.set_db_def_value(Register.NO_INPUTS, 4)
model = f'TSOL-MS{max_pow}'
elif max_pow <= 800:
model = f'TSOL-MS{max_pow}'
if model:
logger.info(f'Model: {model}')
self.db.set_db_def_value(Register.EQUIPMENT_MODEL, model)
@@ -647,6 +673,10 @@ class SolarmanV5(SolarmanBase):
self.inc_counter('AT_Command')
self.inverter.forward_at_cmd_resp = True
if ftype == self.DCU_CMD:
self.inc_counter('DCU_Command')
self.inverter.forward_dcu_cmd_resp = True
elif ftype == self.MB_RTU_CMD:
rstream = self.ifc.remote.stream
if rstream.mb.recv_req(data[15:],
@@ -670,6 +700,10 @@ class SolarmanV5(SolarmanBase):
if self.inverter.forward_at_cmd_resp:
return logging.INFO
return logging.DEBUG
elif ftype == self.DCU_CMD:
if self.inverter.forward_dcu_cmd_resp:
return logging.INFO
return logging.DEBUG
elif ftype == self.MB_RTU_CMD \
and self.server_side:
return self.mb.last_log_lvl
@@ -689,6 +723,16 @@ class SolarmanV5(SolarmanBase):
logger.info(f'{key}: {data_json}')
self.publish_mqtt(f'{Proxy.entity_prfx}{node_id}{key}', data_json) # noqa: E501
return
elif ftype == self.DCU_CMD:
if not self.inverter.forward_dcu_cmd_resp:
data_json = '+ok'
node_id = self.node_id
key = 'dcu_resp'
logger.info(f'{key}: {data_json}')
self.publish_mqtt(f'{Proxy.entity_prfx}{node_id}{key}', data_json) # noqa: E501
return
elif ftype == self.MB_RTU_CMD:
self.__modbus_command_rsp(data)
return
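
To illustrate the DCU command path end to end: Mqtt._dcu_cmd() (see the mqtt.py diff below) packs the request PDU, and send_dcu_cmd() above prepends a sub-header inside the 0x4510 SolarmanV5 frame; a standalone sketch of the byte layout for 600 W:

import struct

watts = 600
val = round(watts * 10)              # 6000, as packed by Mqtt._dcu_cmd()
pdu = struct.pack('>BBBBBBH', 1, 1, 6, 1, 0, 1, val)

DCU_CMD = 5
sensor_list = 0x3026                 # the only sensor set that accepts DCU commands
sub_hdr = struct.pack('<BHLLL', DCU_CMD, sensor_list, 0, 0, 0)

# send_dcu_cmd() transmits sub_hdr followed by pdu inside the 0x4510 frame
print(len(sub_hdr), len(pdu))        # 15 8
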

View File

@@ -44,6 +44,7 @@ class Register(Enum):
MODBUS_COMMAND = 60
AT_COMMAND_BLOCKED = 61
CLOUD_CONN_CNT = 62
DCU_COMMAND = 63
OUTPUT_POWER = 83
RATED_POWER = 84
INVERTER_TEMP = 85
@@ -625,6 +626,7 @@ class Infos:
Register.INVALID_MSG_FMT: {'name': ['proxy', 'Invalid_Msg_Format'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'inv_msg_fmt_', 'fmt': FMT_INT, 'name': 'Invalid Message Format', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
Register.AT_COMMAND: {'name': ['proxy', 'AT_Command'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'at_cmd_', 'fmt': FMT_INT, 'name': 'AT Command', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
Register.AT_COMMAND_BLOCKED: {'name': ['proxy', 'AT_Command_Blocked'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'at_cmd_blocked_', 'fmt': FMT_INT, 'name': 'AT Command Blocked', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
Register.DCU_COMMAND: {'name': ['proxy', 'DCU_Command'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'dcu_cmd_', 'fmt': FMT_INT, 'name': 'DCU Command', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
Register.MODBUS_COMMAND: {'name': ['proxy', 'Modbus_Command'], 'singleton': True, 'ha': {'dev': 'proxy', 'comp': 'sensor', 'dev_cla': None, 'stat_cla': None, 'id': 'modbus_cmd_', 'fmt': FMT_INT, 'name': 'Modbus Command', 'icon': COUNTER, 'ent_cat': 'diagnostic'}}, # noqa: E501
# 0xffffff03: {'name':['proxy', 'Voltage'], 'level': logging.DEBUG, 'unit': 'V', 'ha':{'dev':'proxy', 'dev_cla': 'voltage', 'stat_cla': 'measurement', 'id':'proxy_volt_', 'fmt':FMT_FLOAT,'name': 'Grid Voltage'}}, # noqa: E501

View File

@@ -1,16 +1,15 @@
[loggers]
keys=root,tracer,mesg,conn,data,mqtt,asyncio
keys=root,tracer,mesg,conn,data,mqtt,asyncio,hypercorn_access,hypercorn_error
[handlers]
keys=console_handler,file_handler_name1,file_handler_name2
keys=console_handler,file_handler_name1,file_handler_name2,file_handler_name3,dashboard
[formatters]
keys=console_formatter,file_formatter
[logger_root]
level=DEBUG
handlers=console_handler,file_handler_name1
handlers=console_handler,file_handler_name1,dashboard
[logger_conn]
level=DEBUG
@@ -20,13 +19,13 @@ qualname=conn
[logger_mqtt]
level=INFO
handlers=console_handler,file_handler_name1
handlers=console_handler,file_handler_name1,dashboard
propagate=0
qualname=mqtt
[logger_asyncio]
level=INFO
handlers=console_handler,file_handler_name1
handlers=console_handler,file_handler_name1,dashboard
propagate=0
qualname=asyncio
@@ -49,6 +48,18 @@ handlers=file_handler_name2
propagate=0
qualname=tracer
[logger_hypercorn_access]
level=INFO
handlers=file_handler_name3
propagate=0
qualname=hypercorn.access
[logger_hypercorn_error]
level=INFO
handlers=file_handler_name1,dashboard
propagate=0
qualname=hypercorn.error
[handler_console_handler]
class=StreamHandler
level=DEBUG
@@ -66,6 +77,16 @@ level=NOTSET
formatter=file_formatter
args=(handlers.log_path + 'trace.log', when:='midnight', backupCount:=handlers.log_backups)
[handler_file_handler_name3]
class=handlers.TimedRotatingFileHandler
level=NOTSET
formatter=file_formatter
args=(handlers.log_path + 'access.log', when:='midnight', backupCount:=handlers.log_backups)
[handler_dashboard]
level=WARNING
class=web.log_handler.LogHandler
[formatter_console_formatter]
format=%(asctime)s %(levelname)5s | %(name)4s | %(message)s
datefmt=%Y-%m-%d %H:%M:%S
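
The handlers.log_path and handlers.log_backups names used in the handler args above are attributes attached to the logging.handlers module before fileConfig() runs (see Server.init_logging_system() in the server.py diff); a condensed sketch:

import logging
import logging.config
import logging.handlers
import os

log_path = './log/'

# fileConfig() evaluates handler args in the logging namespace, so values
# attached to logging.handlers are reachable there as handlers.log_path
setattr(logging.handlers, "log_path", log_path)
setattr(logging.handlers, "log_backups", 7)
os.makedirs(log_path, exist_ok=True)
logging.config.fileConfig('logging.ini')
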

app/src/mqtt.py Normal file → Executable file
View File

@@ -2,18 +2,25 @@ import asyncio
import logging
import aiomqtt
import traceback
import struct
import inspect
from modbus import Modbus
from messages import Message
from cnf.config import Config
from singleton import Singleton
from datetime import datetime
logger_mqtt = logging.getLogger('mqtt')
class Mqtt(metaclass=Singleton):
__client = None
__client: aiomqtt.Client = None
__cb_mqtt_is_up = None
ctime = None
published: int = 0
received: int = 0
def __init__(self, cb_mqtt_is_up):
logger_mqtt.debug('MQTT: __init__')
@@ -22,14 +29,27 @@ class Mqtt(metaclass=Singleton):
loop = asyncio.get_event_loop()
self.task = loop.create_task(self.__loop())
self.ha_restarts = 0
self.topic_defs = [
{'prefix': 'auto_conf_prefix', 'topic': '/status',
'fnc': self._ha_status, 'args': []},
{'prefix': 'entity_prefix', 'topic': '/+/rated_load',
'fnc': self._modbus_cmd,
'args': [Modbus.WRITE_SINGLE_REG, 1, 0x2008]},
{'prefix': 'entity_prefix', 'topic': '/+/out_coeff',
'fnc': self._out_coeff, 'args': []},
{'prefix': 'entity_prefix', 'topic': '/+/dcu_power',
'fnc': self._dcu_cmd, 'args': []},
{'prefix': 'entity_prefix', 'topic': '/+/modbus_read_regs',
'fnc': self._modbus_cmd, 'args': [Modbus.READ_REGS, 2]},
{'prefix': 'entity_prefix', 'topic': '/+/modbus_read_inputs',
'fnc': self._modbus_cmd, 'args': [Modbus.READ_INPUTS, 2]},
{'prefix': 'entity_prefix', 'topic': '/+/at_cmd',
'fnc': self._at_cmd, 'args': []},
]
ha = Config.get('ha')
self.ha_status_topic = f"{ha['auto_conf_prefix']}/status"
self.mb_rated_topic = f"{ha['entity_prefix']}/+/rated_load"
self.mb_out_coeff_topic = f"{ha['entity_prefix']}/+/out_coeff"
self.mb_reads_topic = f"{ha['entity_prefix']}/+/modbus_read_regs"
self.mb_inputs_topic = f"{ha['entity_prefix']}/+/modbus_read_inputs"
self.mb_at_cmd_topic = f"{ha['entity_prefix']}/+/at_cmd"
for entry in self.topic_defs:
entry['full_topic'] = f"{ha[entry['prefix']]}{entry['topic']}"
@property
def ha_restarts(self):
@@ -52,6 +72,7 @@ class Mqtt(metaclass=Singleton):
| int | float | None = None) -> None:
if self.__client:
await self.__client.publish(topic, payload)
self.published += 1
async def __loop(self) -> None:
mqtt = Config.get('mqtt')
@@ -69,21 +90,14 @@ class Mqtt(metaclass=Singleton):
try:
async with self.__client:
logger_mqtt.info('MQTT broker connection established')
if self.__cb_mqtt_is_up:
await self.__cb_mqtt_is_up()
await self.__client.subscribe(self.ha_status_topic)
await self.__client.subscribe(self.mb_rated_topic)
await self.__client.subscribe(self.mb_out_coeff_topic)
await self.__client.subscribe(self.mb_reads_topic)
await self.__client.subscribe(self.mb_inputs_topic)
await self.__client.subscribe(self.mb_at_cmd_topic)
await self._init_new_conn()
async for message in self.__client.messages:
await self.dispatch_msg(message)
except aiomqtt.MqttError:
self.ctime = None
if Config.is_default('mqtt'):
logger_mqtt.info(
"MQTT is unconfigured; Check your config.toml!")
@@ -101,49 +115,56 @@ class Mqtt(metaclass=Singleton):
return
except Exception:
# self.inc_counter('SW_Exception') # fixme
self.ctime = None
logger_mqtt.error(
f"Exception:\n"
f"{traceback.format_exc()}")
async def _init_new_conn(self):
self.ctime = datetime.now()
self.published = 0
self.received = 0
if self.__cb_mqtt_is_up:
await self.__cb_mqtt_is_up()
for entry in self.topic_defs:
await self.__client.subscribe(entry['full_topic'])
async def dispatch_msg(self, message):
if message.topic.matches(self.ha_status_topic):
status = message.payload.decode("UTF-8")
logger_mqtt.info('Home-Assistant Status:'
f' {status}')
if status == 'online':
self.ha_restarts += 1
self.received += 1
for entry in self.topic_defs:
if message.topic.matches(entry['full_topic']) \
and 'fnc' in entry:
fnc = entry['fnc']
if inspect.iscoroutinefunction(fnc):
await entry['fnc'](message, *entry['args'])
elif callable(fnc):
entry['fnc'](message, *entry['args'])
async def _ha_status(self, message):
status = message.payload.decode("UTF-8")
logger_mqtt.info('Home-Assistant Status:'
f' {status}')
if status == 'online':
self.ha_restarts += 1
if self.__cb_mqtt_is_up:
await self.__cb_mqtt_is_up()
if message.topic.matches(self.mb_rated_topic):
await self.modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
1, 0x2008)
if message.topic.matches(self.mb_out_coeff_topic):
payload = message.payload.decode("UTF-8")
try:
val = round(float(payload) * 1024/100)
if val < 0 or val > 1024:
logger_mqtt.error('out_coeff: value must be in'
'the range 0..100,'
f' got: {payload}')
else:
await self.modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
0, 0x202c, val)
except Exception:
pass
if message.topic.matches(self.mb_reads_topic):
await self.modbus_cmd(message,
Modbus.READ_REGS, 2)
if message.topic.matches(self.mb_inputs_topic):
await self.modbus_cmd(message,
Modbus.READ_INPUTS, 2)
if message.topic.matches(self.mb_at_cmd_topic):
await self.at_cmd(message)
async def _out_coeff(self, message):
payload = message.payload.decode("UTF-8")
try:
val = round(float(payload) * 1024/100)
if val < 0 or val > 1024:
logger_mqtt.error('out_coeff: value must be in '
'the range 0..100,'
f' got: {payload}')
else:
await self._modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
0, 0x202c, val)
except Exception:
pass
def each_inverter(self, message, func_name: str):
topic = str(message.topic)
@@ -161,7 +182,7 @@ class Mqtt(metaclass=Singleton):
else:
logger_mqtt.warning(f'Node_id: {node_id} not found')
async def modbus_cmd(self, message, func, params=0, addr=0, val=0):
async def _modbus_cmd(self, message, func, params=0, addr=0, val=0):
payload = message.payload.decode("UTF-8")
for fnc in self.each_inverter(message, "send_modbus_cmd"):
res = payload.split(',')
@@ -176,7 +197,22 @@ class Mqtt(metaclass=Singleton):
val = int(res[1]) # length
await fnc(func, addr, val, logging.INFO)
async def at_cmd(self, message):
async def _at_cmd(self, message):
payload = message.payload.decode("UTF-8")
for fnc in self.each_inverter(message, "send_at_cmd"):
await fnc(payload)
def _dcu_cmd(self, message):
payload = message.payload.decode("UTF-8")
try:
val = round(float(payload) * 10)
if val < 1000 or val > 8000:
logger_mqtt.error('dcu_power: value must be in '
'the range 100..800,'
f' got: {payload}')
else:
pdu = struct.pack('>BBBBBBH', 1, 1, 6, 1, 0, 1, val)
for fnc in self.each_inverter(message, "send_dcu_cmd"):
fnc(pdu)
except Exception:
pass
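
The refactoring above replaces the hard-coded topic checks with a dispatch table; the pattern boils down to the following standalone sketch (plain string comparison here, while the proxy uses message.topic.matches() with MQTT wildcards):

import asyncio
import inspect

async def ha_status(msg):
    print("status:", msg)

def dcu_cmd(msg):
    print("dcu_power:", msg)

topic_defs = [
    {'full_topic': 'homeassistant/status', 'fnc': ha_status, 'args': []},
    {'full_topic': 'tsun/inv_1/dcu_power', 'fnc': dcu_cmd, 'args': []},
]

async def dispatch(topic, msg):
    for entry in topic_defs:
        if topic == entry['full_topic'] and 'fnc' in entry:
            fnc = entry['fnc']
            if inspect.iscoroutinefunction(fnc):
                await fnc(msg, *entry['args'])
            elif callable(fnc):
                fnc(msg, *entry['args'])

asyncio.run(dispatch('homeassistant/status', 'online'))
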

View File

@@ -1,26 +1,162 @@
import logging
import asyncio
import logging.handlers
from logging import config # noqa F401
import asyncio
from asyncio import StreamReader, StreamWriter
import os
import argparse
from asyncio import StreamReader, StreamWriter
from quart import Quart, Response, request
from quart_babel import Babel
from quart_babel.locale import get_locale
from logging import config # noqa F401
from quart import Quart, Response
from cnf.config import Config
from cnf.config_read_env import ConfigReadEnv
from cnf.config_read_toml import ConfigReadToml
from cnf.config_read_json import ConfigReadJson
from web import Web
from web.wrapper import url_for
from proxy import Proxy
from inverter_ifc import InverterIfc
from gen3.inverter_g3 import InverterG3
from gen3plus.inverter_g3p import InverterG3P
from scheduler import Schedule
from cnf.config import Config
from cnf.config_read_env import ConfigReadEnv
from cnf.config_read_toml import ConfigReadToml
from cnf.config_read_json import ConfigReadJson
from web.routes import web_routes
from modbus_tcp import ModbusTcp
class Server():
serv_name = ''
version = ''
src_dir = ''
####
# The following default values are used for the unit tests only, since
# `Server.parse_args()' will not be called during test setup.
# Of course, we can call `Server.parse_args()' in a test case explicitly
# to overwrite these values
config_path = './config/'
json_config = ''
toml_config = ''
trans_path = '../translations/'
rel_urls = False
log_path = './log/'
log_backups = 0
log_level = None
def __init__(self, app, parse_args: bool):
''' Application Setup
1. Read cli arguments
2. Init the logging system by the ini file
3. Log the config params
4. Set the log-levels
5. Read and build the config for the app
'''
self.serv_name = os.getenv('SERVICE_NAME', 'proxy')
self.version = os.getenv('VERSION', 'unknown')
self.src_dir = os.path.dirname(__file__) + '/'
if parse_args: # pragma: no cover
self.parse_args(None)
self.init_logging_system()
self.build_config()
@app.context_processor
def utility_processor():
return dict(version=self.version)
def parse_args(self, arg_list: list[str] | None):
parser = argparse.ArgumentParser()
parser.add_argument('-c', '--config_path', type=str,
default='./config/',
help='set path for the configuration files')
parser.add_argument('-j', '--json_config', type=str,
help='read user config from json-file')
parser.add_argument('-t', '--toml_config', type=str,
help='read user config from toml-file')
parser.add_argument('-l', '--log_path', type=str,
default='./log/',
help='set path for the logging files')
parser.add_argument('-b', '--log_backups', type=int,
default=0,
help='set max number of daily log-files')
parser.add_argument('-tr', '--trans_path', type=str,
default='../translations/',
help='set path for the translations files')
parser.add_argument('-r', '--rel_urls', action="store_true",
help='use relative dashboard urls')
args = parser.parse_args(arg_list)
self.config_path = args.config_path
self.json_config = args.json_config
self.toml_config = args.toml_config
self.trans_path = args.trans_path
self.rel_urls = args.rel_urls
self.log_path = args.log_path
self.log_backups = args.log_backups
def init_logging_system(self):
setattr(logging.handlers, "log_path", self.log_path)
setattr(logging.handlers, "log_backups", self.log_backups)
os.makedirs(self.log_path, exist_ok=True)
logging.config.fileConfig(self.src_dir + 'logging.ini')
logging.info(
f'Server "{self.serv_name} - {self.version}" will be started')
logging.info(f'current dir: {os.getcwd()}')
logging.info(f"config_path: {self.config_path}")
logging.info(f"json_config: {self.json_config}")
logging.info(f"toml_config: {self.toml_config}")
logging.info(f"trans_path: {self.trans_path}")
logging.info(f"rel_urls: {self.rel_urls}")
logging.info(f"log_path: {self.log_path}")
if self.log_backups == 0:
logging.info("log_backups: unlimited")
else:
logging.info(f"log_backups: {self.log_backups} days")
self.log_level = self.get_log_level()
logging.info('******')
if self.log_level:
# set lowest-severity for 'root', 'msg', 'conn' and 'data' logger
logging.getLogger().setLevel(self.log_level)
logging.getLogger('msg').setLevel(self.log_level)
logging.getLogger('conn').setLevel(self.log_level)
logging.getLogger('data').setLevel(self.log_level)
logging.getLogger('tracer').setLevel(self.log_level)
logging.getLogger('asyncio').setLevel(self.log_level)
# logging.getLogger('mqtt').setLevel(self.log_level)
def build_config(self):
# read config file
Config.init(ConfigReadToml(self.src_dir + "cnf/default_config.toml"),
log_path=self.log_path,
cnf_path=self.config_path)
ConfigReadEnv()
ConfigReadJson(self.config_path + "config.json")
ConfigReadToml(self.config_path + "config.toml")
ConfigReadJson(self.json_config)
ConfigReadToml(self.toml_config)
config_err = Config.get_error()
if config_err is not None:
logging.info(f'config_err: {config_err}')
return
logging.info('******')
def get_log_level(self) -> int | None:
'''checks if LOG_LVL is set in the environment and returns the
corresponding logging.LOG_LEVEL'''
switch = {
'DEBUG': logging.DEBUG,
'WARN': logging.WARNING,
'INFO': logging.INFO,
'ERROR': logging.ERROR,
}
log_lvl = os.getenv('LOG_LVL', None)
logging.info(f"LOG_LVL : {log_lvl}")
return switch.get(log_lvl, None)
class ProxyState:
_is_up = False
@@ -33,31 +169,48 @@ class ProxyState:
ProxyState._is_up = value
def my_get_locale():
# check how to get the locale form for the add-on - hass.selectedLanguage
# logging.info("get_locale(%s)", request.accept_languages)
return request.accept_languages.best_match(
['de', 'en']
)
class HypercornLogHndl:
access_hndl = []
error_hndl = []
must_fix = False
HYPERC_ERR = 'hypercorn.error'
HYPERC_ACC = 'hypercorn.access'
@classmethod
def save(cls):
cls.access_hndl = logging.getLogger(
cls.HYPERC_ACC).handlers
cls.error_hndl = logging.getLogger(
cls.HYPERC_ERR).handlers
cls.must_fix = True
def my_get_tz():
return 'CET'
@classmethod
def restore(cls):
if not cls.must_fix:
return
cls.must_fix = False
access_hndl = logging.getLogger(
cls.HYPERC_ACC).handlers
if access_hndl != cls.access_hndl:
print(' * Fix hypercorn.access setting')
logging.getLogger(
cls.HYPERC_ACC).handlers = cls.access_hndl
error_hndl = logging.getLogger(
cls.HYPERC_ERR).handlers
if error_hndl != cls.error_hndl:
print(' * Fix hypercorn.error setting')
logging.getLogger(
cls.HYPERC_ERR).handlers = cls.error_hndl
app = Quart(__name__,
template_folder='web/templates',
static_folder='web/static')
babel = Babel(app,
locale_selector=my_get_locale,
timezone_selector=my_get_tz,
default_translation_directories='../translations')
app.register_blueprint(web_routes)
@app.context_processor
def utility_processor():
return dict(lang=get_locale())
app.secret_key = 'JKLdks.dajlKKKdladkflKwolafallsdfl'
app.jinja_env.globals.update(url_for=url_for)
server = Server(app, __name__ == "__main__")
Web(app, server.trans_path, server.rel_urls)
@app.route('/-/ready')
@@ -87,13 +240,36 @@ async def healthy():
return Response(status=200, response="I'm fine")
async def handle_client(reader: StreamReader, writer: StreamWriter, inv_class):
async def handle_client(reader: StreamReader,
writer: StreamWriter,
inv_class): # pragma: no cover
'''Handles a new incoming connection and starts an async loop'''
with inv_class(reader, writer) as inv:
await inv.local.ifc.server_loop()
@app.before_serving
async def startup_app(): # pragma: no cover
HypercornLogHndl.save()
loop = asyncio.get_event_loop()
Proxy.class_init()
Schedule.start()
ModbusTcp(loop)
for inv_class, port in [(InverterG3, 5005), (InverterG3P, 10000)]:
logging.info(f'listen on port: {port} for inverters')
loop.create_task(asyncio.start_server(lambda r, w, i=inv_class:
handle_client(r, w, i),
'0.0.0.0', port))
ProxyState.set_up(True)
@app.before_request
async def startup_request():
HypercornLogHndl.restore()
@app.after_serving
async def handle_shutdown(): # pragma: no cover
'''Close all TCP connections and stop the event loop'''
@@ -110,121 +286,15 @@ async def handle_shutdown(): # pragma: no cover
logging.info('Proxy disconnecting done')
#
# now cancel all remaining (pending) tasks
#
for task in asyncio.all_tasks():
if task == asyncio.current_task():
continue
task.cancel()
logging.info('Proxy cancelling done')
await Proxy.class_close(loop)
def get_log_level() -> int | None:
'''checks if LOG_LVL is set in the environment and returns the
corresponding logging.LOG_LEVEL'''
switch = {
'DEBUG': logging.DEBUG,
'WARN': logging.WARNING,
'INFO': logging.INFO,
'ERROR': logging.ERROR,
}
log_level = os.getenv('LOG_LVL', None)
logging.info(f"LOG_LVL : {log_level}")
if __name__ == "__main__": # pragma: no cover
return switch.get(log_level, None)
def main(): # pragma: no cover
parser = argparse.ArgumentParser()
parser.add_argument('-c', '--config_path', type=str,
default='./config/',
help='set path for the configuration files')
parser.add_argument('-j', '--json_config', type=str,
help='read user config from json-file')
parser.add_argument('-t', '--toml_config', type=str,
help='read user config from toml-file')
parser.add_argument('-l', '--log_path', type=str,
default='./log/',
help='set path for the logging files')
parser.add_argument('-b', '--log_backups', type=int,
default=0,
help='set max number of daily log-files')
args = parser.parse_args()
#
# Setup our daily, rotating logger
#
serv_name = os.getenv('SERVICE_NAME', 'proxy')
version = os.getenv('VERSION', 'unknown')
setattr(logging.handlers, "log_path", args.log_path)
setattr(logging.handlers, "log_backups", args.log_backups)
os.makedirs(args.log_path, exist_ok=True)
src_dir = os.path.dirname(__file__) + '/'
logging.config.fileConfig(src_dir + 'logging.ini')
logging.info(f'Server "{serv_name} - {version}" will be started')
logging.info(f'current dir: {os.getcwd()}')
logging.info(f"config_path: {args.config_path}")
logging.info(f"json_config: {args.json_config}")
logging.info(f"toml_config: {args.toml_config}")
logging.info(f"log_path: {args.log_path}")
if args.log_backups == 0:
logging.info("log_backups: unlimited")
else:
logging.info(f"log_backups: {args.log_backups} days")
log_level = get_log_level()
logging.info('******')
if log_level:
# set lowest-severity for 'root', 'msg', 'conn' and 'data' logger
logging.getLogger().setLevel(log_level)
logging.getLogger('msg').setLevel(log_level)
logging.getLogger('conn').setLevel(log_level)
logging.getLogger('data').setLevel(log_level)
logging.getLogger('tracer').setLevel(log_level)
logging.getLogger('asyncio').setLevel(log_level)
# logging.getLogger('mqtt').setLevel(log_level)
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
# read config file
Config.init(ConfigReadToml(src_dir + "cnf/default_config.toml"))
ConfigReadEnv()
ConfigReadJson(args.config_path + "config.json")
ConfigReadToml(args.config_path + "config.toml")
ConfigReadJson(args.json_config)
ConfigReadToml(args.toml_config)
config_err = Config.get_error()
if config_err is not None:
logging.info(f'config_err: {config_err}')
return
logging.info('******')
Proxy.class_init()
Schedule.start()
ModbusTcp(loop)
#
# Create tasks for our listening servers. These must be tasks! If we call
# start_server directly out of our main task, the eventloop will be blocked
# and we can't receive and handle the UNIX signals!
#
for inv_class, port in [(InverterG3, 5005), (InverterG3P, 10000)]:
logging.info(f'listen on port: {port} for inverters')
loop.create_task(asyncio.start_server(lambda r, w, i=inv_class:
handle_client(r, w, i),
'0.0.0.0', port))
loop.set_debug(log_level == logging.DEBUG)
try:
ProxyState.set_up(True)
logging.info("Start Quart")
app.run(host='0.0.0.0', port=8127, use_reloader=False, loop=loop)
app.run(host='0.0.0.0', port=8127, use_reloader=False,
debug=server.log_level == logging.DEBUG)
logging.info("Quart stopped")
except KeyboardInterrupt:
@@ -233,10 +303,4 @@ def main(): # pragma: no cover
logging.info("Quart cancelled")
finally:
logging.debug('Close event loop')
loop.close()
logging.info(f'Finally, exit Server "{serv_name}"')
if __name__ == "__main__": # pragma: no cover
main()
logging.info(f'Finally, exit Server "{server.serv_name}"')
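
Since the startup code now lives in the Server class, the comment block above suggests calling parse_args() explicitly in tests; a hedged sketch, assuming it runs from app/src with logging.ini and the default config in place:

import os
os.environ['LOG_LVL'] = 'DEBUG'          # mapped by Server.get_log_level()

from server import server               # module-level Server instance

server.parse_args(['-c', './config/',   # config path
                   '-l', './log/',      # log path
                   '-b', '7',           # keep 7 daily log files
                   '-tr', '../translations/',
                   '-r'])               # use relative dashboard urls
print(server.rel_urls, server.log_backups)   # -> True 7
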

app/src/utils/__init__.py Normal file
View File

@@ -0,0 +1,25 @@
import mimetypes
from importlib import import_module
from pathlib import Path
from collections.abc import Callable
class SourceFileLoader:
""" Represents a SouceFileLoader (__loader__)"""
name: str
get_resource_reader: Callable
def load_modules(loader: SourceFileLoader):
"""Load the entire modules from a SourceFileLoader (__loader__)"""
pkg = loader.name
for load in loader.get_resource_reader().contents():
if "python" not in str(mimetypes.guess_type(load)[0]):
continue
mod = Path(load).stem
if mod == "__init__":
continue
import_module(pkg + "." + mod, pkg)

app/src/web/__init__.py Normal file
View File

@@ -0,0 +1,32 @@
'''Quart blueprint for the proxy webserver with the dashboard
Usage:
app = Quart(__name__, ...)
Web(app)
'''
from quart import Quart, Blueprint
from quart_babel import Babel
from utils import load_modules
web = Blueprint('web', __name__)
load_modules(__loader__)
class Web:
'''Helper Class to register the Blueprint at Quart and
initializing Babel'''
def __init__(self,
app: Quart,
translation_directories: str | list[str],
rel_urls: bool):
web.build_relative_urls = rel_urls
app.register_blueprint(web)
from .i18n import get_locale, get_tz
global babel
babel = Babel(
app,
locale_selector=get_locale,
timezone_selector=get_tz,
default_translation_directories=translation_directories)
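
Per the docstring, wiring the blueprint into an application takes two calls; a minimal sketch mirroring server.py, assuming app/src is importable and with an illustrative translation path:

from quart import Quart
from web import Web

app = Quart(__name__,
            template_folder='web/templates',
            static_folder='web/static')
# registers the blueprint and initialises Babel with the given translations
Web(app, '../translations/', rel_urls=False)
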

View File

@@ -1,57 +1,65 @@
from inverter_base import InverterBase
from quart import render_template
from quart_babel import format_datetime, _
from infos import Infos
from . import web
from .log_handler import LogHandler
def _get_device_icon(client_mode: bool):
'''returns the icon for the device connection'''
if client_mode:
return 'fa-download fa-rotate-180'
return 'fa-download fa-rotate-180', 'Server Mode'
return 'fa-upload fa-rotate-180'
return 'fa-upload fa-rotate-180', 'Client Mode'
def _get_cloud_icon(emu_mode: bool):
'''returns the icon for the cloud connection'''
if emu_mode:
return 'fa-cloud-arrow-down-alt'
return 'fa-cloud-arrow-up-alt', 'Emu Mode'
return 'fa-cloud'
return 'fa-cloud', 'Proxy Mode'
def _get_row(inv: InverterBase):
'''build one row for the connection table'''
client_mode = inv.client_mode
inv_serial = inv.local.stream.inv_serial
icon1 = _get_device_icon(client_mode)
icon1, descr1 = _get_device_icon(client_mode)
ip1, port1 = inv.addr
icon2 = ''
descr2 = ''
ip2 = '--'
port2 = '--'
if inv.remote.ifc:
ip2, port2 = inv.remote.ifc.r_addr
icon2 = _get_cloud_icon(client_mode)
icon2, descr2 = _get_cloud_icon(client_mode)
row = []
row.append(f'<i class="fa {icon1}"></i> {ip1}:{port1}')
row.append(f'<i class="fa {icon1}"></i> {ip1}')
row.append(f'<i class="fa {icon1}" title="{_(descr1)}"></i> {ip1}:{port1}')
row.append(f'<i class="fa {icon1}" title="{_(descr1)}"></i> {ip1}')
row.append(inv_serial)
row.append(f'<i class="fa {icon2}"></i> {ip2}:{port2}')
row.append(f'<i class="fa {icon2}"></i> {ip2}')
row.append(f'<i class="fa {icon2}" title="{_(descr2)}"></i> {ip2}:{port2}')
row.append(f'<i class="fa {icon2}" title="{_(descr2)}"></i> {ip2}')
return row
def get_table_data():
'''build the connection table'''
table = {
"headline": _('Connections'),
"col_classes": [
"w3-hide-small w3-hide-medium", "w3-hide-large",
"",
"w3-hide-small w3-hide-medium", "w3-hide-large",
],
"thead": [[
'Device-IP:Port', 'Device-IP',
"Serial-No",
"Cloud-IP:Port", "Cloud-IP"
_('Device-IP:Port'), _('Device-IP'),
_("Serial-No"),
_("Cloud-IP:Port"), _("Cloud-IP")
]],
"tbody": []
}
@@ -59,3 +67,22 @@ def get_table_data():
table['tbody'].append(_get_row(inverter))
return table
@web.route('/data-fetch')
async def data_fetch():
data = {
"update-time": format_datetime(format="medium"),
"server-cnt": f"<h3>{Infos.get_counter('ServerMode_Cnt')}</h3>",
"client-cnt": f"<h3>{Infos.get_counter('ClientMode_Cnt')}</h3>",
"proxy-cnt": f"<h3>{Infos.get_counter('ProxyMode_Cnt')}</h3>",
"emulation-cnt": f"<h3>{Infos.get_counter('EmuMode_Cnt')}</h3>",
}
data["conn-table"] = await render_template('templ_table.html.j2',
table=get_table_data())
data["notes-list"] = await render_template(
'templ_notes_list.html.j2',
notes=LogHandler().get_buffer(3),
hide_if_empty=True)
return data

app/src/web/favicon.py Normal file
View File

@@ -0,0 +1,37 @@
import os
from quart import send_from_directory
from . import web
async def get_icon(file: str, mime: str = 'image/png'):
return await send_from_directory(
os.path.join(web.root_path, 'static/images'),
file,
mimetype=mime)
@web.route('/favicon-96x96.png')
async def favicon():
return await get_icon('favicon-96x96.png')
@web.route('/favicon.ico')
async def favicon_ico():
return await get_icon('favicon.ico', 'image/x-icon')
@web.route('/favicon.svg')
async def favicon_svg():
return await get_icon('favicon.svg', 'image/svg+xml')
@web.route('/apple-touch-icon.png')
async def apple_touch():
return await get_icon('apple-touch-icon.png')
@web.route('/site.webmanifest')
async def webmanifest():
return await get_icon('site.webmanifest', 'application/manifest+json')

app/src/web/i18n.py Normal file
View File

@@ -0,0 +1,45 @@
from quart import request, session, redirect, abort
from quart_babel.locale import get_locale as babel_get_locale
from . import web
LANGUAGES = {
'en': 'English',
'de': 'Deutsch',
# 'fr': 'Français'
}
def get_locale():
try:
language = session['language']
except KeyError:
language = None
if language is not None:
return language
# check how to get the locale from the add-on - hass.selectedLanguage
# logging.info("get_locale(%s)", request.accept_languages)
return request.accept_languages.best_match(LANGUAGES.keys())
def get_tz():
return 'CET'
@web.context_processor
def utility_processor():
return dict(lang=babel_get_locale(),
lang_str=LANGUAGES.get(str(babel_get_locale()), "English"),
languages=LANGUAGES)
@web.route('/language/<language>')
async def set_language(language=None):
if language in LANGUAGES:
session['language'] = language
rsp = redirect(request.referrer if request.referrer else '../#')
rsp.content_language = language
return rsp
return abort(404)
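
The /language/<language> route stores the choice in the session cookie and redirects back to the referring page; a quick check with Quart's test client could look like this (assuming the app from server.py is importable):

import asyncio
from server import app

async def switch_language():
    client = app.test_client()
    rsp = await client.get('/language/de')
    # expect a redirect plus the Content-Language header set by the route
    print(rsp.status_code, rsp.headers.get('Content-Language'))

asyncio.run(switch_language())
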

app/src/web/log_files.py Normal file
View File

@@ -0,0 +1,92 @@
from quart import render_template
from quart_babel import format_datetime, format_decimal, _
from quart.helpers import send_from_directory
from werkzeug.utils import secure_filename
from cnf.config import Config
from datetime import datetime
from os import DirEntry
import os
from dateutil import tz
from . import web
def _get_birth_from_log(path: str) -> None | datetime:
'''read timestamp from the first line of a log file'''
dt = None
try:
with open(path) as f:
first_line = f.readline()
first_line = first_line.lstrip("'")
fmt = "%Y-%m-%d %H:%M:%S" if first_line[4] == '-' \
else "%d-%m-%Y %H:%M:%S"
dt = datetime.strptime(first_line[0:19], fmt). \
replace(tzinfo=tz.tzlocal())
except Exception:
pass
return dt
def _get_file(file: DirEntry) -> dict:
'''build one entry for the log file list'''
entry = {}
entry['name'] = file.name
stat = file.stat()
entry['size'] = format_decimal(stat.st_size)
try:
dt = stat.st_birthtime
except Exception:
dt = _get_birth_from_log(file.path)
if dt:
entry['created'] = format_datetime(dt, format="short")
# sort by creating date, if available
entry['date'] = dt if isinstance(dt, float) else dt.timestamp()
else:
entry['created'] = _('n/a')
entry['date'] = stat.st_mtime
entry['modified'] = format_datetime(stat.st_mtime, format="short")
return entry
def get_list_data() -> list:
'''build the list of log files'''
file_list = []
with os.scandir(Config.get_log_path()) as it:
for entry in it:
if entry.is_file():
file_list.append(_get_file(entry))
file_list.sort(key=lambda x: x['date'], reverse=True)
return file_list
@web.route('/file-fetch')
async def file_fetch():
data = {
"update-time": format_datetime(format="medium"),
}
data["file-list"] = await render_template('templ_log_files_list.html.j2',
dir_list=get_list_data())
return data
@web.route('/send-file/<file>')
async def send(file):
return await send_from_directory(
directory=Config.get_log_path(),
file_name=secure_filename(file),
as_attachment=True)
@web.route('/del-file/<file>', methods=['DELETE'])
async def delete(file):
try:
os.remove(Config.get_log_path() + secure_filename(file))
except OSError:
return 'File not found', 404
return '', 204
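
The creation-time fallback parses the timestamp at the start of the first log line and accepts both date orders; a standalone sketch of the two formats _get_birth_from_log() recognises:

from datetime import datetime
from dateutil import tz

def parse_first_line(first_line: str) -> datetime:
    # same switch as above: ISO-like if the fifth character is '-',
    # otherwise day-first
    first_line = first_line.lstrip("'")
    fmt = "%Y-%m-%d %H:%M:%S" if first_line[4] == '-' else "%d-%m-%Y %H:%M:%S"
    return datetime.strptime(first_line[0:19], fmt).replace(tzinfo=tz.tzlocal())

print(parse_first_line("2025-05-13 00:38:06 INFO | proxy started"))
print(parse_first_line("13-05-2025 00:38:06 INFO | proxy started"))
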

View File

@@ -0,0 +1,24 @@
from logging import Handler
from logging import LogRecord
import logging
from collections import deque
from singleton import Singleton
class LogHandler(Handler, metaclass=Singleton):
def __init__(self, capacity=64):
super().__init__(logging.WARNING)
self.capacity = capacity
self.buffer = deque(maxlen=capacity)
def emit(self, record: LogRecord):
self.buffer.append({
'ctime': record.created,
'level': record.levelno,
'lname': record.levelname,
'msg': record.getMessage()
})
def get_buffer(self, elms=0) -> list:
return list(self.buffer)[-elms:]
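
The dashboard handler is a plain logging.Handler singleton, so it can be attached like any other handler (in production this happens through the handler_dashboard section of logging.ini); a minimal sketch, assuming the proxy sources are importable:

import logging
from web.log_handler import LogHandler

handler = LogHandler(capacity=64)           # keeps the last 64 records
logging.getLogger().addHandler(handler)
logging.getLogger().warning("MQTT broker unreachable")

# the dashboard fetch routes render the newest entries of this buffer
for note in LogHandler().get_buffer(3):
    print(note['lname'], note['msg'])
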

app/src/web/mqtt_table.py Normal file
View File

@@ -0,0 +1,67 @@
from inverter_base import InverterBase
from quart import render_template
from quart_babel import format_datetime, _
from mqtt import Mqtt
from . import web
from .log_handler import LogHandler
def _get_row(inv: InverterBase):
'''build one row for the MQTT device table'''
entity_prfx = inv.entity_prfx
inv_serial = inv.local.stream.inv_serial
node_id = inv.local.stream.node_id
sug_area = inv.local.stream.sug_area
row = []
row.append(inv_serial)
row.append(entity_prfx+node_id)
row.append(sug_area)
return row
def get_table_data():
'''build the MQTT device table'''
table = {
"headline": _('MQTT devices'),
"col_classes": [
"",
"",
"",
],
"thead": [[
_("Serial-No"),
_('Node-ID'),
_('HA-Area'),
]],
"tbody": []
}
for inverter in InverterBase:
table['tbody'].append(_get_row(inverter))
return table
@web.route('/mqtt-fetch')
async def mqtt_fetch():
mqtt = Mqtt(None)
cdatetime = format_datetime(dt=mqtt.ctime, format='d.MM. HH:mm')
data = {
"update-time": format_datetime(format="medium"),
"mqtt-ctime": f"""
<h3 class="w3-hide-small w3-hide-medium">{cdatetime}</h3>
<h4 class="w3-hide-large">{cdatetime}</h4>
""",
"mqtt-tx": f"<h3>{mqtt.published}</h3>",
"mqtt-rx": f"<h3>{mqtt.received}</h3>",
}
data["mqtt-table"] = await render_template('templ_table.html.j2',
table=get_table_data())
data["notes-list"] = await render_template(
'templ_notes_list.html.j2',
notes=LogHandler().get_buffer(3),
hide_if_empty=True)
return data

app/src/web/notes_list.py Normal file
View File

@@ -0,0 +1,19 @@
from quart import render_template
from quart_babel import format_datetime
from . import web
from .log_handler import LogHandler
@web.route('/notes-fetch')
async def notes_fetch():
data = {
"update-time": format_datetime(format="medium"),
}
data["notes-list"] = await render_template(
'templ_notes_list.html.j2',
notes=LogHandler().get_buffer(),
hide_if_empty=False)
return data

app/src/web/pages.py Normal file
View File

@@ -0,0 +1,32 @@
from quart import render_template
from .wrapper import url_for
from . import web
@web.route('/')
async def index():
return await render_template(
'page_index.html.j2',
fetch_url=url_for('.data_fetch'))
@web.route('/mqtt')
async def mqtt():
return await render_template(
'page_mqtt.html.j2',
fetch_url=url_for('.mqtt_fetch'))
@web.route('/notes')
async def notes():
return await render_template(
'page_notes.html.j2',
fetch_url=url_for('.notes_fetch'))
@web.route('/logging')
async def logging():
return await render_template(
'page_logging.html.j2',
fetch_url=url_for('.file_fetch'))

View File

@@ -1,69 +0,0 @@
from quart import Blueprint
from quart import render_template, url_for
from quart import send_from_directory
from quart_babel import format_datetime
from infos import Infos
from web.conn_table import get_table_data
import os
web_routes = Blueprint('web_routes', __name__)
async def get_icon(file: str, mime: str = 'image/png'):
return await send_from_directory(
os.path.join(web_routes.root_path, 'static/images'),
file,
mimetype=mime)
@web_routes.route('/')
async def index():
return await render_template(
'index.html.j2',
fetch_url='.'+url_for('web_routes.data_fetch'))
@web_routes.route('/page')
async def empty():
return await render_template('empty.html.j2')
@web_routes.route('/data-fetch')
async def data_fetch():
data = {
"update-time": format_datetime(format="medium"),
"server-cnt": f"<h3>{Infos.get_counter('ServerMode_Cnt')}</h3>",
"client-cnt": f"<h3>{Infos.get_counter('ClientMode_Cnt')}</h3>",
"proxy-cnt": f"<h3>{Infos.get_counter('ProxyMode_Cnt')}</h3>",
"emulation-cnt": f"<h3>{Infos.get_counter('EmuMode_Cnt')}</h3>",
}
data["conn-table"] = await render_template('conn_table.html.j2',
table=get_table_data())
data["notes-list"] = await render_template('notes_list.html.j2')
return data
@web_routes.route('/favicon-96x96.png')
async def favicon():
return await get_icon('favicon-96x96.png')
@web_routes.route('/favicon.ico')
async def favicon_ico():
return await get_icon('favicon.ico', 'image/x-icon')
@web_routes.route('/favicon.svg')
async def favicon_svg():
return await get_icon('favicon.svg', 'image/svg+xml')
@web_routes.route('/apple-touch-icon.png')
async def apple_touch():
return await get_icon('apple-touch-icon.png')
@web_routes.route('/site.webmanifest')
async def webmanifest():
return await get_icon('site.webmanifest', 'application/manifest+json')

View File

@@ -4,8 +4,8 @@
<title>{% block title %}{% endblock title %}</title>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="stylesheet" href=".{{ url_for('static', filename= 'css/style.css') }}">
<link rel="stylesheet" href=".{{ url_for('static', filename= 'font-awesome/css/all.min.css') }}">
<link rel="stylesheet" href="{{ url_for('static', filename= 'css/style.css') }}">
<link rel="stylesheet" href="{{ url_for('static', filename= 'font-awesome/css/all.min.css') }}">
<link rel="icon" type="image/png" href="/favicon-96x96.png" sizes="96x96" />
<link rel="icon" type="image/svg+xml" href="/favicon.svg" />
<link rel="shortcut icon" href="/favicon.ico" />
@@ -14,7 +14,7 @@
<style>
@font-face {
font-family: Roboto;
src: url(".{{ url_for('static', filename= 'font/roboto-light.ttf') }}");
src: url("{{ url_for('static', filename= 'font/roboto-light.ttf') }}");
}
html,body,h1,h2,h3,h4,h5 {font-family: Roboto, sans-serif}
</style>
@@ -22,23 +22,31 @@
<body class="w3-light-grey">
<!-- Top container -->
<div class="w3-bar w3-top w3-dark-grey w3-large" style="z-index:4">
<button class="w3-bar-item w3-button w3-hide-large w3-hover-none w3-hover-text-light-grey" onclick="w3_open();"><i class="fa fa-bars"></i>  Menu</button>
<div class="w3-bar w3-dark-grey w3-large" style="z-index:4">
<button class="w3-bar-item w3-button w3-hide-large" onclick="w3_open();"><i class="fa fa-bars"></i>  Menu</button>
<div class="w3-dropdown-hover w3-right">
<button class="w3-button">{{lang_str}}</button>
<div class="w3-dropdown-content w3-bar-block w3-card-4" style="right:0">
{% for language in languages %}
<a href="{{url_for('web.set_language', language=language)}}" class="w3-bar-item w3-button">{{languages[language]}}</a>
{% endfor %}
</div>
</div>
{% if fetch_url is defined %}
<button class="w3-bar-item w3-button w3-hover-none w3-hover-text-light-grey w3-right" onclick="fetch_data();"><span class="w3-hide-small">{{_('Updated:')}}  </span><span id="update-time"></span>  <i class="fa fa-rotate-right w3-medium"></i></button>
{% else %}
<span class="w3-bar-item w3-right">Logo</span>
<button class="w3-bar-item w3-button w3-right" onclick="fetch_data();"><span class="w3-hide-small">{{_('Updated:')}}  </span><span id="update-time"></span>  <i class="w3-hover fa fa-rotate-right w3-medium"></i></button>
{% endif %}
<div class="w3-clear"></div>
</div>
<!-- Sidebar/menu -->
<nav class="w3-sidebar w3-collapse w3-white" style="z-index:3;width:250px;" id="mySidebar"><br>
<div class="w3-container w3-row">
<div class="w3-col s4">
<img src=".{{ url_for('static', filename= 'images/favicon.svg') }}" alt="" class="w3-circle w3-margin-right" style="width:60px">
<div class="w3-container w3-cell-row">
<div class="w3-cell w3-cell-middle">
<img src="{{url_for('static', filename= 'images/favicon.svg') }}" alt="" class="w3-circle w3-margin-right" style="width:60px">
</div>
<div class="w3-col s8 w3-bar">
<h3>TSUN-Proxy</h3><br>
<div class="w3-cell">
<span><b class="w3-xlarge">TSUN-Proxy</b><br>{{_('Version:')}} {{version}}</span>
</div>
</div>
<hr>
@@ -47,16 +55,17 @@
</div>
<div class="w3-bar-block">
<button href="#" class="w3-bar-item w3-button w3-padding-16 w3-hide-large w3-dark-grey w3-hover-black" onclick="w3_close()" title="close menu"><i class="fa fa-remove fa-fw"></i>  Close Menu</button>
<a href=".{{ url_for('web_routes.index')}}" class="w3-bar-item w3-button w3-padding {% block menu1_class %}{% endblock %}"><i class="fa fa-network-wired fa-fw"></i>  {{_('Connections')}}</a>
<a href=".{{ url_for('web_routes.empty')}}" class="w3-bar-item w3-button w3-padding {% block menu2_class %}{% endblock %}"><i class="fa fa-database fa-fw"></i>  MQTT</a>
<a href=".{{ url_for('web_routes.empty')}}" class="w3-bar-item w3-button w3-padding"><i class="fa fa-file-export fa-fw {% block menu3_class %}{% endblock %}"></i>  Downloads</a>
<a href="{{ url_for('.index')}}" class="w3-bar-item w3-button w3-padding {% block menu1_class %}{% endblock %}"><i class="fa fa-network-wired fa-fw"></i>  {{_('Connections')}}</a>
<a href="{{ url_for('.mqtt')}}" class="w3-bar-item w3-button w3-padding {% block menu2_class %}{% endblock %}"><i class="fa fa-database fa-fw"></i>  MQTT</a>
<a href="{{ url_for('.notes')}}" class="w3-bar-item w3-button w3-padding {% block menu3_class %}{% endblock %}"><i class="fa fa-info fa-fw"></i>  {{_('Important Messages')}}</a>
<a href="{{ url_for('.logging')}}" class="w3-bar-item w3-button w3-padding {% block menu4_class %}{% endblock %}"><i class="fa fa-file-export fa-fw"></i>  {{_('Log Files')}}</a>
</div>
</nav>
<!-- Overlay effect when opening sidebar on small screens -->
<button class="w3-overlay w3-hide-large w3-animate-opacity" onclick="w3_close()" style="cursor:pointer" title="close side menu" id="myOverlay"></button>
<!-- !PAGE CONTENT! -->
<div class="w3-main" style="margin-left:250px;margin-top:43px;">

View File

@@ -1,9 +0,0 @@
{% extends 'base.html.j2' %}
{% block title %} TSUN Proxy - View {% endblock title%}
{% block menu2_class %}w3-blue{% endblock %}
{% block content %}
{% endblock content%}
{% block footer %}{% endblock footer %}

View File

@@ -1,12 +1,13 @@
{% extends 'base.html.j2' %}
{% block title %} TSUN Proxy - Connections {% endblock title%}
{% block title %}{{_("TSUN Proxy - Connections")}}{% endblock title %}
{% block menu1_class %}w3-blue{% endblock %}
{% block headline %}<i class="fa fa-network-wired"></i>  {{_('Proxy Connection Overview')}}{% endblock headline %}
{% block content %}
<div class="w3-row-padding w3-margin-bottom">
<div class="w3-quarter">
<div class="w3-card-4">
<div class="w3-container w3-indigo w3-padding-16">
<div class="w3-left"><i class="fa fa-upload w3-xxxlarge fa-rotate-180"></i></div>
<div id = "server-cnt" class="w3-right">
@@ -16,8 +17,10 @@
<h4>{{_('Server Mode')}}</h4>
<div class="w3-hide-small w3-hide-medium" style="min-height:50px">{{_('Established from device to proxy')}}</div>
</div>
</div>
</div>
<div class="w3-quarter">
<div class="w3-card-4">
<div class="w3-container w3-purple w3-padding-16">
<div class="w3-left"><i class="fa fa-download w3-xxxlarge fa-rotate-180"></i></div>
<div id = "client-cnt" class="w3-right">
@@ -27,8 +30,10 @@
<h4>{{_('Client Mode')}}</h4>
<div class="w3-hide-small w3-hide-medium" style="min-height:50px">{{_('Established from proxy to device')}}</div>
</div>
</div>
</div>
<div class="w3-quarter">
<div class="w3-card-4">
<div class="w3-container w3-orange w3-text-white w3-padding-16">
<div class="w3-left"><i class="fa fa-cloud w3-xxxlarge"></i></div>
<div id = "proxy-cnt" class="w3-right">
@@ -38,8 +43,10 @@
<h4>{{_('Proxy Mode')}}</h4>
<div class="w3-hide-small w3-hide-medium" style="min-height:50px">{{_('Forwarding data to cloud')}}</div>
</div>
</div>
</div>
<div class="w3-quarter">
<div class="w3-card-4">
<div class="w3-container w3-teal w3-padding-16">
<div class="w3-left"><i class="fa fa-cloud-arrow-up-alt w3-xxxlarge"></i></div>
<div id = "emulation-cnt" class="w3-right">
@@ -49,9 +56,12 @@
<h4>{{_('Emu Mode')}}</h4>
<div class="w3-hide-small w3-hide-medium" style="min-height:50px">{{_('Emulation sends data to cloud')}}</div>
</div>
</div>
</div>
</div>
<div class="w3-container" id="notes-list"></div>
<div class="w3-container" id="conn-table"></div>
<div id="notes-list"></div>
<div id="conn-table"></div>
{% endblock content%}
{% block footer %}{% endblock footer %}

View File

@@ -0,0 +1,30 @@
{% extends 'base.html.j2' %}
{% block title %}{{_("TSUN Proxy - Log Files")}}{% endblock title %}
{% block menu4_class %}w3-blue{% endblock %}
{% block headline %}<i class="fa fa-file-export fa-fw"></i>  {{_('Log Files')}}{% endblock headline %}
{% block content %}
<div id="id01" class="w3-modal">
<div class="w3-modal-content" style="width:600px">
<div class="w3-container w3-padding-24">
<h2>{{_('Do you really want to delete the log file: <br>%(file)s ?', file='<b><span id="id03"></span></b>')}}</h2>
<div class="w3-bar">
<button id="id02" class="w3-button w3-red" onclick="deleteFile(); document.getElementById('id01').style.display='none'">{{_('Delete File')}}</button>
<button class="w3-button w3-grey w3-right" onclick="document.getElementById('id01').style.display='none'">{{_('Abort')}}</button>
</div>
</div>
</div>
</div>
<div id="file-list"></div>
<script>
function deleteFile() {
const fname = document.getElementById('id02').href;
fetch(fname, {method: 'DELETE'})
.then(() => fetch_data())
}
</script>
{% endblock content%}
{% block footer %}{% endblock footer %}

View File

@@ -0,0 +1,52 @@
{% extends 'base.html.j2' %}
{% block title %}{{_("TSUN Proxy - MQTT Status")}}{% endblock title %}
{% block menu2_class %}w3-blue{% endblock %}
{% block headline %}<i class="fa fa-database"></i>  {{_('MQTT Overview')}}{% endblock headline %}
{% block content %}
<div class="w3-row-padding w3-margin-bottom">
<div class="w3-third">
<div class="w3-card-4">
<div class="w3-container w3-indigo w3-padding-16">
<div class="w3-left"><i class="fa fa-business-time w3-xxxlarge"></i></div>
<div id = "mqtt-ctime" class="w3-right">
<h3>-</h3>
</div>
<div class="w3-clear"></div>
<h4>{{_('Connection Time')}}</h4>
<div class="w3-hide-small w3-hide-medium" style="min-height:50px">{{_('Time at which the connection was established')}}</div>
</div>
</div>
</div>
<div class="w3-third">
<div class="w3-card-4">
<div class="w3-container w3-purple w3-padding-16">
<div class="w3-left"><i class="fa fa-angle-double-right w3-xxxlarge"></i></div>
<div id = "mqtt-tx" class="w3-right">
<h3>-</h3>
</div>
<div class="w3-clear"></div>
<h4>{{_('Published Topics')}}</h4>
<div class="w3-hide-small w3-hide-medium" style="min-height:50px">{{_('Number of published topics')}}</div>
</div>
</div>
</div>
<div class="w3-third">
<div class="w3-card-4">
<div class="w3-container w3-orange w3-text-white w3-padding-16">
<div class="w3-left"><i class="fa fa-angle-double-left w3-xxxlarge"></i></div>
<div id = "mqtt-rx" class="w3-right">
<h3>-</h3>
</div>
<div class="w3-clear"></div>
<h4>{{_('Received Topics')}}</h4>
<div class="w3-hide-small w3-hide-medium" style="min-height:50px">{{_('Number of topics received')}}</div>
</div>
</div>
</div>
</div>
<div id="notes-list"></div>
<div id="mqtt-table"></div>
{% endblock content%}
{% block footer %}{% endblock footer %}

View File

@@ -0,0 +1,10 @@
{% extends 'base.html.j2' %}
{% block title %}{{_("TSUN Proxy - Important Messages")}}{% endblock title %}
{% block menu3_class %}w3-blue{% endblock %}
{% block headline %}<i class="fa fa-info fa-fw"></i>  {{_('Important Messages')}}{% endblock headline %}
{% block content %}
<div id="notes-list"></div>
{% endblock content%}
{% block footer %}{% endblock footer %}

View File

@@ -0,0 +1,33 @@
<div class="w3-row-padding w3-margin-bottom">
{% for file in dir_list %}
<div class="w3-quarter w3-margin-bottom">
<div class="w3-card-4">
<header class="w3-container w3-teal" style="min-height:80px">
<h4>{{file.name}}</h4>
</header>
<table class="w3-table">
{% for idx, name in [('created',_('Created')), ('modified', _('Modified')), ('size', _('Size'))]%}
<tr>
<td>{{_(name)}}:</td>
<td>{{file[idx]}}</td>
</tr>
{% endfor %}
</table>
<footer class="w3-teal">
<a href="{{ url_for('.send',file=file.name)}}" class="w3-button w3-hover-teal w3-hover-text-black"><i class="fa fa-file-download"></i>  {{_('Download File')}}</a>
<a class="w3-button w3-right w3-hover-teal w3-hover-text-black"
onclick="document.getElementById('id03').innerHTML='{{file.name}}'; document.getElementById('id02').href='{{ url_for('.delete',file=file.name)}}'; document.getElementById('id01').style.display='block';"><i class="fa fa-trash"></i></a>
</footer>
</div>
</div>
{% if 0 == (loop.index%4) and not last %}
</div>
<div class="w3-row-padding w3-margin-bottom">
{% endif %}
{% endfor %}
</div>
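The (loop.index % 4) check above starts a new w3-row-padding row after every fourth card. A minimal Python sketch of the same grouping, with placeholder file names:
files = ['proxy.log', 'proxy.log.1', 'proxy.log.2', 'proxy.log.3', 'proxy.log.4']  # placeholder names
rows = [files[i:i + 4] for i in range(0, len(files), 4)]  # four cards per row
assert rows == [files[:4], ['proxy.log.4']]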

View File

@@ -0,0 +1,23 @@
{% if notes|length > 0 %}
<div class="w3-container w3-margin-bottom">
<h5>{{_("Warnings and error messages")}} </h5>
<ul class="w3-ul w3-card-4">
{% for note in notes %}
<li class="{% if note.level is le(30) %}w3-leftbar w3-rightbar w3-pale-blue w3-border-blue{% else %}w3-leftbar w3-rightbar w3-pale-red w3-border-red{% endif %}">
<span class="w3-col" style="width:150px">{{note.ctime|datetimeformat(format='short')}}</span>
<span class="w3-col w3-hide-small" style="width:100px">{{note.lname|e}}</span>
<span class="w3-rest">{{note.msg|e}}</span>
</li>
{% endfor %}
</ul>
</div>
{% elif not hide_if_empty %}
<div class="w3-container w3-margin-bottom">
<div class="w3-leftbar w3-rightbar w3-pale-green w3-border-green">
<div class="w3-container">
<h2>{{_("Well done!")}}</h2>
<p>{{_("No warnings or errors have been logged since the last proxy start.")}}</p>
</div>
</div>
</div>
{% endif %}
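The template reads note.ctime, note.lname, note.msg and note.level, rendering levels at or below 30 with the blue bar and anything higher with the red one. A hypothetical record with that shape (the values are examples only):
from datetime import datetime, timezone

note = {
    'level': 30,  # logging.WARNING; values <= 30 get the blue bar
    'ctime': datetime.now(timezone.utc),  # shown via the datetimeformat filter
    'lname': 'conn',  # logger name, hidden on small screens
    'msg': 'Connection lost; Reconnecting in 5 seconds',
}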

View File

@@ -12,8 +12,11 @@
{% endif %}
{%- endmacro%}
<h5>Connections</h5>
<table class="w3-table w3-striped w3-bordered w3-border w3-hoverable w3-white">
<div class="w3-container w3-margin-bottom">
<h5>{{table.headline}}</h5>
<div class="w3-card-4">
<table class="w3-table w3-bordered w3-hoverable w3-white">
{% if table.thead is defined%}
<thead>
{% for row in table.thead %}
@@ -35,3 +38,5 @@
{% endfor %}
</tbody>
</table>
</div>
</div>

26
app/src/web/wrapper.py Normal file
View File

@@ -0,0 +1,26 @@
from quart import url_for as quart_url_for
from . import web
def url_for(*args, **kwargs):
"""Return the url for a specific endpoint.
This wrapper optionally converts it into a relative URL.
This is most useful in templates and redirects to create a URL
that can be used in the browser.
Arguments:
endpoint: The endpoint to build a url for, if prefixed with
``.`` it targets endpoints in the current blueprint.
_anchor: Additional anchor text to append (e.g. #text).
_external: Return an absolute url for external (to app) usage.
_method: The method to consider alongside the endpoint.
_scheme: A specific scheme to use.
values: The values to build into the URL, as specified in
the endpoint rule.
"""
url = quart_url_for(*args, **kwargs)
if '/' == url[0] and web.build_relative_urls:
url = '.' + url
return url
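A minimal, self-contained sketch of the rewrite this wrapper performs; build_relative_urls stands in for web.build_relative_urls and exists only for illustration:
build_relative_urls = True  # assumption: mirrors web.build_relative_urls

def make_relative(url: str) -> str:
    # absolute paths get a leading '.' so they resolve relative to the current page
    if build_relative_urls and url.startswith('/'):
        return '.' + url
    return url

assert make_relative('/static/images/favicon.svg') == './static/images/favicon.svg'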

View File

@@ -0,0 +1 @@
mqtt.port = ":1883"

20
app/tests/conftest.py Normal file
View File

@@ -0,0 +1,20 @@
import pytest_asyncio
import asyncio
@pytest_asyncio.fixture
async def my_loop():
event_loop = asyncio.get_running_loop()
yield event_loop
# Collect all tasks and cancel those that are not 'done'.
tasks = asyncio.all_tasks(event_loop)
tasks = [t for t in tasks if not t.done()]
for task in tasks:
task.cancel()
# Wait for all tasks to complete, ignoring any CancelledErrors
try:
await asyncio.wait(tasks)
except asyncio.exceptions.CancelledError:
pass
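A hypothetical test that requests the fixture, so any task it leaves running is cancelled during teardown:
import asyncio
import pytest

@pytest.mark.asyncio
async def test_leftover_task_is_cancelled(my_loop):
    _ = my_loop
    task = asyncio.create_task(asyncio.sleep(3600))  # would never finish on its own
    await asyncio.sleep(0)  # let the task start
    assert not task.done()  # the fixture teardown cancels it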

View File

19
app/tests/log/test.txt Normal file
View File

@@ -0,0 +1,19 @@
2025-04-30 00:01:23 INFO | root | Server "proxy - unknown" will be started
2025-04-30 00:01:23 INFO | root | current dir: /Users/sallius/tsun/tsun-gen3-proxy
2025-04-30 00:01:23 INFO | root | config_path: ./config/
2025-04-30 00:01:23 INFO | root | json_config: None
2025-04-30 00:01:23 INFO | root | toml_config: None
2025-04-30 00:01:23 INFO | root | trans_path: ../translations/
2025-04-30 00:01:23 INFO | root | rel_urls: False
2025-04-30 00:01:23 INFO | root | log_path: ./log/
2025-04-30 00:01:23 INFO | root | log_backups: unlimited
2025-04-30 00:01:23 INFO | root | LOG_LVL : None
2025-04-30 00:01:23 INFO | root | ******
2025-04-30 00:01:23 INFO | root | Read from /Users/sallius/tsun/tsun-gen3-proxy/app/src/cnf/default_config.toml => ok
2025-04-30 00:01:23 INFO | root | Read from environment => ok
2025-04-30 00:01:23 INFO | root | Read from ./config/config.json => n/a
2025-04-30 00:01:23 INFO | root | Read from ./config/config.toml => n/a
2025-04-30 00:01:23 INFO | root | ******
2025-04-30 00:01:23 INFO | root | listen on port: 5005 for inverters
2025-04-30 00:01:23 INFO | root | listen on port: 10000 for inverters
2025-04-30 00:01:23 INFO | root | Start Quart

View File

@@ -17,13 +17,13 @@ def test_statistic_counter():
assert val == None or val == 0
i.static_init() # initialize counter
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 0, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "Modbus_Command": 0}})
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 0, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "DCU_Command": 0, "Modbus_Command": 0}})
val = i.dev_value(Register.INVERTER_CNT) # valid and initialized addr
assert val == 0
i.inc_counter('Inverter_Cnt')
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 1, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "Modbus_Command": 0}})
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 1, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "DCU_Command": 0, "Modbus_Command": 0}})
val = i.dev_value(Register.INVERTER_CNT)
assert val == 1

View File

@@ -109,7 +109,7 @@ def test_default_db():
i = InfosG3P(client_mode=False)
assert json.dumps(i.db) == json.dumps({
"inverter": {"Manufacturer": "TSUN", "Equipment_Model": "TSOL-MSxx00", "No_Inputs": 4},
"inverter": {"Manufacturer": "TSUN", "Equipment_Model": "TSOL-MSxx00", "No_Inputs": 2},
"collector": {"Chip_Type": "IGEN TECH"},
})
@@ -271,7 +271,7 @@ def test_build_ha_conf1():
elif id == 'inv_count_456':
assert False
assert tests==7
assert tests==5
def test_build_ha_conf2():
i = InfosG3P(client_mode=False)
@@ -346,7 +346,7 @@ def test_build_ha_conf3():
elif id == 'inv_count_456':
assert False
assert tests==7
assert tests==5
def test_build_ha_conf4():
i = InfosG3P(client_mode=True)

View File

@@ -113,7 +113,9 @@ def patch_unhealthy_remote():
with patch.object(AsyncStreamClient, 'healthy', new_healthy) as conn:
yield conn
def test_inverter_iter():
@pytest.mark.asyncio
async def test_inverter_iter(my_loop):
_ = my_loop
InverterBase._registry.clear()
cnt = 0
reader = FakeReader()
@@ -216,7 +218,8 @@ def test_unhealthy_remote(patch_unhealthy_remote):
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_conn(config_conn, patch_open_connection):
async def test_remote_conn(my_loop, config_conn, patch_open_connection):
_ = my_loop
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -242,8 +245,9 @@ async def test_remote_conn(config_conn, patch_open_connection):
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_conn_to_private(config_conn, patch_open_connection):
async def test_remote_conn_to_private(my_loop, config_conn, patch_open_connection):
'''check DNS resolving of the TSUN FQDN to a local address'''
_ = my_loop
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -280,8 +284,9 @@ async def test_remote_conn_to_private(config_conn, patch_open_connection):
@pytest.mark.asyncio
async def test_remote_conn_to_loopback(config_conn, patch_open_connection):
async def test_remote_conn_to_loopback(my_loop, config_conn, patch_open_connection):
'''check DNS resolving of the TSUN FQDN to the loopback address'''
_ = my_loop
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -317,8 +322,9 @@ async def test_remote_conn_to_loopback(config_conn, patch_open_connection):
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_conn_to_none(config_conn, patch_open_connection):
async def test_remote_conn_to_none(my_loop, config_conn, patch_open_connection):
'''check if get_extra_info() returns None in case of an error'''
_ = my_loop
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -354,7 +360,8 @@ async def test_remote_conn_to_none(config_conn, patch_open_connection):
assert cnt == 0
@pytest.mark.asyncio
async def test_unhealthy_remote(config_conn, patch_open_connection, patch_unhealthy_remote):
async def test_unhealthy_remote(my_loop, config_conn, patch_open_connection, patch_unhealthy_remote):
_ = my_loop
_ = config_conn
_ = patch_open_connection
_ = patch_unhealthy_remote
@@ -391,10 +398,10 @@ async def test_unhealthy_remote(config_conn, patch_open_connection, patch_unheal
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_disc(config_conn, patch_open_connection):
async def test_remote_disc(my_loop, config_conn, patch_open_connection):
_ = my_loop
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
reader = FakeReader()
writer = FakeWriter()

View File

@@ -99,7 +99,8 @@ def patch_healthy():
with patch.object(AsyncStream, 'healthy') as conn:
yield conn
def test_method_calls(patch_healthy):
@pytest.mark.asyncio
async def test_method_calls(my_loop, patch_healthy):
spy = patch_healthy
reader = FakeReader()
writer = FakeWriter()
@@ -119,7 +120,7 @@ def test_method_calls(patch_healthy):
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_conn(config_conn, patch_open_connection):
async def test_remote_conn(my_loop, config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -137,7 +138,7 @@ async def test_remote_conn(config_conn, patch_open_connection):
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_except(config_conn, patch_open_connection):
async def test_remote_except(my_loop, config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -164,7 +165,7 @@ async def test_remote_except(config_conn, patch_open_connection):
assert cnt == 0
@pytest.mark.asyncio
async def test_mqtt_publish(config_conn, patch_open_connection):
async def test_mqtt_publish(my_loop, config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -191,7 +192,7 @@ async def test_mqtt_publish(config_conn, patch_open_connection):
assert Infos.new_stat_data['proxy'] == False
@pytest.mark.asyncio
async def test_mqtt_err(config_conn, patch_open_connection, patch_mqtt_err):
async def test_mqtt_err(my_loop, config_conn, patch_open_connection, patch_mqtt_err):
_ = config_conn
_ = patch_open_connection
_ = patch_mqtt_err
@@ -208,7 +209,7 @@ async def test_mqtt_err(config_conn, patch_open_connection, patch_mqtt_err):
assert stream.new_data['inverter'] == True
@pytest.mark.asyncio
async def test_mqtt_except(config_conn, patch_open_connection, patch_mqtt_except):
async def test_mqtt_except(my_loop, config_conn, patch_open_connection, patch_mqtt_except):
_ = config_conn
_ = patch_open_connection
_ = patch_mqtt_except

View File

@@ -37,6 +37,7 @@ def config_conn():
},
'solarman':{'enabled': True, 'host': 'test_cloud.local', 'port': 1234}, 'inverters':{'allow_all':True}
}
Config.log_path='app/tests/log/'
@pytest.fixture(scope="module", autouse=True)
def module_init():
@@ -93,7 +94,8 @@ def patch_open_connection():
with patch.object(asyncio, 'open_connection', new_open) as conn:
yield conn
def test_method_calls(config_conn):
@pytest.mark.asyncio
async def test_method_calls(my_loop, config_conn):
_ = config_conn
reader = FakeReader()
writer = FakeWriter()
@@ -104,7 +106,7 @@ def test_method_calls(config_conn):
assert inverter.local.ifc
@pytest.mark.asyncio
async def test_remote_conn(config_conn, patch_open_connection):
async def test_remote_conn(my_loop, config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -115,7 +117,7 @@ async def test_remote_conn(config_conn, patch_open_connection):
assert inverter.remote.stream
@pytest.mark.asyncio
async def test_remote_except(config_conn, patch_open_connection):
async def test_remote_except(my_loop, config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -137,7 +139,7 @@ async def test_remote_except(config_conn, patch_open_connection):
@pytest.mark.asyncio
async def test_mqtt_publish(config_conn, patch_open_connection):
async def test_mqtt_publish(my_loop, config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -164,7 +166,7 @@ async def test_mqtt_publish(config_conn, patch_open_connection):
assert Infos.new_stat_data['proxy'] == False
@pytest.mark.asyncio
async def test_mqtt_err(config_conn, patch_open_connection, patch_mqtt_err):
async def test_mqtt_err(my_loop, config_conn, patch_open_connection, patch_mqtt_err):
_ = config_conn
_ = patch_open_connection
_ = patch_mqtt_err
@@ -181,7 +183,7 @@ async def test_mqtt_err(config_conn, patch_open_connection, patch_mqtt_err):
assert stream.new_data['inverter'] == True
@pytest.mark.asyncio
async def test_mqtt_except(config_conn, patch_open_connection, patch_mqtt_except):
async def test_mqtt_except(my_loop, config_conn, patch_open_connection, patch_mqtt_except):
_ = config_conn
_ = patch_open_connection
_ = patch_mqtt_except

View File

@@ -19,7 +19,8 @@ class ModbusTestHelper(Modbus):
def resp_handler(self):
self.recv_responses += 1
def test_modbus_crc():
@pytest.mark.asyncio
async def test_modbus_crc():
'''Check CRC-16 calculation'''
mb = Modbus(None)
assert 0x0b02 == mb._Modbus__calc_crc(b'\x01\x06\x20\x08\x00\x04')
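The expected value 0x0b02 matches the standard MODBUS CRC-16 (init 0xFFFF, reflected polynomial 0xA001). A reference sketch of that algorithm, not necessarily the project's private __calc_crc implementation:
def crc16_modbus(data: bytes) -> int:
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

assert crc16_modbus(b'\x01\x06\x20\x08\x00\x04') == 0x0b02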
@@ -37,7 +38,8 @@ def test_modbus_crc():
msg += b'\x00\x00\x00\x00\x00\x00\x00\xe6\xef'
assert 0 == mb._Modbus__calc_crc(msg)
def test_build_modbus_pdu():
@pytest.mark.asyncio
async def test_build_modbus_pdu():
'''Check building and sending a MODBUS RTU'''
mb = ModbusTestHelper()
mb.build_msg(1,6,0x2000,0x12)
@@ -49,7 +51,8 @@ def test_build_modbus_pdu():
assert mb.last_len == 18
assert mb.err == 0
def test_recv_req():
@pytest.mark.asyncio
async def test_recv_req():
'''Receive a valid request, which must be transmitted'''
mb = ModbusTestHelper()
assert mb.recv_req(b'\x01\x06\x20\x00\x00\x12\x02\x07')
@@ -58,7 +61,8 @@ def test_recv_req():
assert mb.last_len == 0x12
assert mb.err == 0
def test_recv_req_crc_err():
@pytest.mark.asyncio
async def test_recv_req_crc_err():
'''Receive a request with invalid CRC, which must be dropped'''
mb = ModbusTestHelper()
assert not mb.recv_req(b'\x01\x06\x20\x00\x00\x12\x02\x08')
@@ -68,7 +72,8 @@ def test_recv_req_crc_err():
assert mb.last_len == 0
assert mb.err == 1
def test_recv_resp_crc_err():
@pytest.mark.asyncio
async def test_recv_resp_crc_err():
'''Receive a response with invalid CRC, which must be dropped'''
mb = ModbusTestHelper()
# simulate a transmitted request
@@ -89,7 +94,8 @@ def test_recv_resp_crc_err():
mb._Modbus__stop_timer()
assert not mb.req_pend
def test_recv_resp_invalid_addr():
@pytest.mark.asyncio
async def test_recv_resp_invalid_addr():
'''Receive a response with wrong server addr, which must be dropped'''
mb = ModbusTestHelper()
mb.req_pend = True
@@ -113,7 +119,8 @@ def test_recv_resp_invalid_addr():
mb._Modbus__stop_timer()
assert not mb.req_pend
def test_recv_recv_fcode():
@pytest.mark.asyncio
async def test_recv_recv_fcode():
'''Receive a response with wrong function code, which must be dropped'''
mb = ModbusTestHelper()
mb.build_msg(1,4,0x300e,2)
@@ -135,7 +142,8 @@ def test_recv_recv_fcode():
mb._Modbus__stop_timer()
assert not mb.req_pend
def test_recv_resp_len():
@pytest.mark.asyncio
async def test_recv_resp_len():
'''Receive a response with wrong data length, which must be dropped'''
mb = ModbusTestHelper()
mb.build_msg(1,3,0x300e,3)
@@ -158,7 +166,8 @@ def test_recv_resp_len():
mb._Modbus__stop_timer()
assert not mb.req_pend
def test_recv_unexpect_resp():
@pytest.mark.asyncio
async def test_recv_unexpect_resp():
'''Receive a response when we haven't sent a request'''
mb = ModbusTestHelper()
assert not mb.req_pend
@@ -174,7 +183,8 @@ def test_recv_unexpect_resp():
assert mb.req_pend == False
assert mb.que.qsize() == 0
def test_parse_resp():
@pytest.mark.asyncio
async def test_parse_resp():
'''Receive matching response and parse the values'''
mb = ModbusTestHelper()
mb.build_msg(1,3,0x3007,6)
@@ -200,7 +210,8 @@ def test_parse_resp():
assert mb.que.qsize() == 0
assert not mb.req_pend
def test_queue():
@pytest.mark.asyncio
async def test_queue():
mb = ModbusTestHelper()
mb.build_msg(1,3,0x3022,4)
assert mb.que.qsize() == 0
@@ -218,7 +229,8 @@ def test_queue():
mb._Modbus__stop_timer()
assert not mb.req_pend
def test_queue2():
@pytest.mark.asyncio
async def test_queue2():
'''Check queue handling for build_msg() calls'''
mb = ModbusTestHelper()
mb.build_msg(1,3,0x3007,6)
@@ -267,7 +279,8 @@ def test_queue2():
assert mb.que.qsize() == 0
assert not mb.req_pend
def test_queue3():
@pytest.mark.asyncio
async def test_queue3():
'''Check queue handling for recv_req() calls'''
mb = ModbusTestHelper()
assert mb.recv_req(b'\x01\x03\x30\x07\x00\x06{\t', mb.resp_handler)
@@ -324,7 +337,7 @@ def test_queue3():
assert not mb.req_pend
@pytest.mark.asyncio
async def test_timeout():
async def test_timeout(my_loop):
'''Test MODBUS response timeout and RTU retransmitting'''
assert asyncio.get_running_loop()
mb = ModbusTestHelper()
@@ -371,7 +384,8 @@ async def test_timeout():
assert mb.retry_cnt == 0
assert mb.send_calls == 4
def test_recv_unknown_data():
@pytest.mark.asyncio
async def test_recv_unknown_data():
'''Receive a response with an unknown register'''
mb = ModbusTestHelper()
assert 0x9000 not in mb.mb_reg_mapping
@@ -390,7 +404,8 @@ def test_recv_unknown_data():
del mb.mb_reg_mapping[0x9000]
def test_close():
@pytest.mark.asyncio
async def test_close():
'''Check queue handling for build_msg() calls'''
mb = ModbusTestHelper()
mb.build_msg(1,3,0x3007,6)

216
app/tests/test_mqtt.py Normal file → Executable file
View File

@@ -3,8 +3,10 @@ import pytest
import asyncio
import aiomqtt
import logging
from aiomqtt import MqttError, MessagesIterator
from aiomqtt import Message as AiomqttMessage
from mock import patch, Mock
from async_stream import AsyncIfcImpl
from singleton import Singleton
from mqtt import Mqtt
@@ -17,7 +19,7 @@ NO_MOSQUITTO_TEST = False
pytest_plugins = ('pytest_asyncio',)
@pytest.fixture(scope="module", autouse=True)
@pytest.fixture(scope="function", autouse=True)
def module_init():
Singleton._instances.clear()
yield
@@ -33,6 +35,26 @@ def test_hostname():
# else:
return 'test.mosquitto.org'
@pytest.fixture(scope="function")
def aiomqtt_mock(monkeypatch):
recv_que = asyncio.Queue()
async def my_aenter(self):
return self
async def my_subscribe(self, *arg):
return
async def my_anext(self):
return await recv_que.get()
async def my_receive(self, topic: str, payload: bytes):
msg = AiomqttMessage(topic, payload, qos=0, retain=False, mid=0, properties=None)
await recv_que.put(msg)
await asyncio.sleep(0)  # dispatch the msg
monkeypatch.setattr(aiomqtt.Client, "__aenter__", my_aenter)
monkeypatch.setattr(aiomqtt.Client, "subscribe", my_subscribe)
monkeypatch.setattr(MessagesIterator, "__anext__", my_anext)
monkeypatch.setattr(Mqtt, "receive", my_receive, False)
@pytest.fixture
def config_mqtt_conn(test_hostname, test_port):
Config.act_config = {'mqtt':{'host': test_hostname, 'port': test_port, 'user': '', 'passwd': ''},
@@ -44,6 +66,14 @@ def config_no_conn(test_port):
Config.act_config = {'mqtt':{'host': "", 'port': test_port, 'user': '', 'passwd': ''},
'ha':{'auto_conf_prefix': 'homeassistant','discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun'}
}
Config.def_config = {}
@pytest.fixture
def config_def_conn(test_port):
Config.act_config = {'mqtt':{'host': "unknown_url", 'port': test_port, 'user': '', 'passwd': ''},
'ha':{'auto_conf_prefix': 'homeassistant','discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun'}
}
Config.def_config = Config.act_config
@pytest.fixture
def spy_at_cmd():
@@ -69,6 +99,14 @@ def spy_modbus_cmd_client():
yield wrapped_conn
conn.close()
@pytest.fixture
def spy_dcu_cmd():
conn = SolarmanV5(None, ('test.local', 1234), server_side=True, client_mode= False, ifc=AsyncIfcImpl())
conn.node_id = 'inv_3/'
with patch.object(conn, 'send_dcu_cmd', wraps=conn.send_dcu_cmd) as wrapped_conn:
yield wrapped_conn
conn.close()
def test_native_client(test_hostname, test_port):
"""Sanity check: Make sure the paho-mqtt client can connect to the test
MQTT server. Otherwise the test sets NO_MOSQUITTO_TEST to True and disables
@@ -140,6 +178,7 @@ async def test_ha_reconnect(config_mqtt_conn):
assert on_connect.is_set()
finally:
assert m.received == 2
await m.close()
@pytest.mark.asyncio
@@ -167,65 +206,162 @@ async def test_mqtt_no_config(config_no_conn):
await m.close()
@pytest.mark.asyncio
async def test_msg_dispatch(config_mqtt_conn, spy_modbus_cmd):
async def test_mqtt_except_no_config(config_no_conn, monkeypatch, caplog):
_ = config_no_conn
assert asyncio.get_running_loop()
async def my_aenter(self):
raise MqttError('TestException') from None
monkeypatch.setattr(aiomqtt.Client, "__aenter__", my_aenter)
LOGGER = logging.getLogger("mqtt")
LOGGER.propagate = True
LOGGER.setLevel(logging.INFO)
with caplog.at_level(logging.INFO):
m = Mqtt(None)
assert m.task
await asyncio.sleep(0)
try:
await m.publish('homeassistant/status', 'online')
assert False
except MqttError:
pass
except Exception:
assert False
finally:
await m.close()
assert 'Connection lost; Reconnecting in 5 seconds' in caplog.text
@pytest.mark.asyncio
async def test_mqtt_except_def_config(config_def_conn, monkeypatch, caplog):
_ = config_def_conn
assert asyncio.get_running_loop()
on_connect = asyncio.Event()
async def cb():
on_connect.set()
async def my_aenter(self):
raise MqttError('TestException') from None
monkeypatch.setattr(aiomqtt.Client, "__aenter__", my_aenter)
LOGGER = logging.getLogger("mqtt")
LOGGER.propagate = True
LOGGER.setLevel(logging.INFO)
with caplog.at_level(logging.INFO):
m = Mqtt(cb)
assert m.task
await asyncio.sleep(0)
assert not on_connect.is_set()
try:
await m.publish('homeassistant/status', 'online')
assert False
except MqttError:
pass
except Exception:
assert False
finally:
await m.close()
assert 'MQTT is unconfigured; Check your config.toml!' in caplog.text
@pytest.mark.asyncio
async def test_mqtt_dispatch(config_mqtt_conn, aiomqtt_mock, spy_modbus_cmd):
_ = config_mqtt_conn
_ = aiomqtt_mock
spy = spy_modbus_cmd
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_1/rated_load', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
assert m.ha_restarts == 0
await m.receive('homeassistant/status', b'online') # send the message
assert m.ha_restarts == 1
await m.receive(topic= 'tsun/inv_1/rated_load', payload= b'2')
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x2008, 2, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'100', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'100')
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 1024, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'50', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'50')
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 512, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10')
spy.assert_awaited_once_with(Modbus.READ_REGS, 0x3000, 10, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10')
spy.assert_awaited_once_with(Modbus.READ_INPUTS, 0x3000, 10, logging.INFO)
# test dispatching with empty mapping table
m.topic_defs.clear()
spy.reset_mock()
await m.receive(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10')
spy.assert_not_called()
# test dispatching with incomplete mapping table - invalid fnc defined
m.topic_defs.append(
{'prefix': 'entity_prefix', 'topic': '/+/modbus_read_inputs',
'full_topic': 'tsun/+/modbus_read_inputs', 'fnc': 'addr'}
)
spy.reset_mock()
await m.receive(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10')
spy.assert_not_called()
except MqttError:
assert False
except Exception:
assert False
finally:
await m.close()
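The two out_coeff vectors above (100 -> 1024, 50 -> 512) suggest the percentage is mapped linearly onto a 0..1024 register range. A sketch derived only from those vectors:
def scale_out_coeff(percent: float) -> int:
    # assumption: linear mapping of 0..100 % onto 0..1024, as the two vectors imply
    return int(percent * 1024 / 100)

assert scale_out_coeff(100) == 1024
assert scale_out_coeff(50) == 512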
@pytest.mark.asyncio
async def test_msg_dispatch_err(config_mqtt_conn, spy_modbus_cmd):
async def test_mqtt_dispatch_err(config_mqtt_conn, aiomqtt_mock, spy_modbus_cmd, caplog):
_ = config_mqtt_conn
_ = aiomqtt_mock
spy = spy_modbus_cmd
LOGGER = logging.getLogger("mqtt")
LOGGER.propagate = True
LOGGER.setLevel(logging.INFO)
try:
m = Mqtt(None)
# test out of range param
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'-1', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'-1')
spy.assert_not_called()
# test unknown node_id
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_2/out_coeff', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_2/out_coeff', payload= b'2')
spy.assert_not_called()
# test invalid float param
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'2, 3', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/out_coeff', payload= b'2, 3')
spy.assert_not_called()
await m.receive(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10, 7')
spy.assert_not_called()
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10, 7', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
await m.receive(topic= 'tsun/inv_1/dcu_power', payload= b'100W')
spy.assert_not_called()
with caplog.at_level(logging.INFO):
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
for _ in m.each_inverter(msg, "addr"):
pass # do nothing here
assert 'Cmd not supported by: inv_1/' in caplog.text
except MqttError:
assert False
except Exception:
assert False
finally:
await m.close()
@@ -266,3 +402,31 @@ async def test_at_cmd_dispatch(config_mqtt_conn, spy_at_cmd):
finally:
await m.close()
@pytest.mark.asyncio
async def test_dcu_dispatch(config_mqtt_conn, spy_dcu_cmd):
_ = config_mqtt_conn
spy = spy_dcu_cmd
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_3/dcu_power', payload= b'100.0', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_called_once_with(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
finally:
await m.close()
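The asserted command ends in 0x03e8 (1000) for a payload of 100.0, which points to a factor-of-ten scaling of the wattage. A sketch of just that conversion, derived only from this test vector:
def dcu_power_value(watts: float) -> bytes:
    # assumption: watts scaled by ten and encoded big-endian in two bytes
    return int(watts * 10).to_bytes(2, 'big')

assert dcu_power_value(100.0) == b'\x03\xe8'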
@pytest.mark.asyncio
async def test_dcu_inv_value(config_mqtt_conn, spy_dcu_cmd):
_ = config_mqtt_conn
spy = spy_dcu_cmd
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_3/dcu_power', payload= b'99.9', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_not_called()
msg = aiomqtt.Message(topic= 'tsun/inv_3/dcu_power', payload= b'800.1', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_not_called()
finally:
await m.close()

View File

@@ -3,66 +3,216 @@ import pytest
import logging
import os
from mock import patch
from server import get_log_level, app, ProxyState
from server import app, Server, ProxyState, HypercornLogHndl
pytest_plugins = ('pytest_asyncio',)
def test_get_log_level():
with patch.dict(os.environ, {}):
log_lvl = get_log_level()
assert log_lvl == None
class TestServerClass:
class FakeServer(Server):
def __init__(self):
pass  # don't call the super().__init__ for unit tests
with patch.dict(os.environ, {'LOG_LVL': 'DEBUG'}):
log_lvl = get_log_level()
assert log_lvl == logging.DEBUG
def test_get_log_level(self):
s = self.FakeServer()
with patch.dict(os.environ, {'LOG_LVL': 'INFO'}):
log_lvl = get_log_level()
assert log_lvl == logging.INFO
with patch.dict(os.environ, {}):
log_lvl = s.get_log_level()
assert log_lvl == None
with patch.dict(os.environ, {'LOG_LVL': 'WARN'}):
log_lvl = get_log_level()
assert log_lvl == logging.WARNING
with patch.dict(os.environ, {'LOG_LVL': 'DEBUG'}):
log_lvl = s.get_log_level()
assert log_lvl == logging.DEBUG
with patch.dict(os.environ, {'LOG_LVL': 'ERROR'}):
log_lvl = get_log_level()
assert log_lvl == logging.ERROR
with patch.dict(os.environ, {'LOG_LVL': 'INFO'}):
log_lvl = s.get_log_level()
assert log_lvl == logging.INFO
with patch.dict(os.environ, {'LOG_LVL': 'UNKNOWN'}):
log_lvl = get_log_level()
assert log_lvl == None
with patch.dict(os.environ, {'LOG_LVL': 'WARN'}):
log_lvl = s.get_log_level()
assert log_lvl == logging.WARNING
@pytest.mark.asyncio
async def test_ready():
"""Test the ready route."""
with patch.dict(os.environ, {'LOG_LVL': 'ERROR'}):
log_lvl = s.get_log_level()
assert log_lvl == logging.ERROR
ProxyState.set_up(False)
client = app.test_client()
response = await client.get('/-/ready')
assert response.status_code == 503
result = await response.get_data()
assert result == b"Not ready"
with patch.dict(os.environ, {'LOG_LVL': 'UNKNOWN'}):
log_lvl = s.get_log_level()
assert log_lvl == None
ProxyState.set_up(True)
response = await client.get('/-/ready')
assert response.status_code == 200
result = await response.get_data()
assert result == b"Is ready"
def test_default_args(self):
s = self.FakeServer()
assert s.config_path == './config/'
assert s.json_config == ''
assert s.toml_config == ''
assert s.trans_path == '../translations/'
assert s.rel_urls == False
assert s.log_path == './log/'
assert s.log_backups == 0
@pytest.mark.asyncio
async def test_healthy():
"""Test the healthy route."""
def test_parse_args_empty(self):
s = self.FakeServer()
s.parse_args([])
assert s.config_path == './config/'
assert s.json_config == None
assert s.toml_config == None
assert s.trans_path == '../translations/'
assert s.rel_urls == False
assert s.log_path == './log/'
assert s.log_backups == 0
ProxyState.set_up(False)
client = app.test_client()
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
def test_parse_args_short(self):
s = self.FakeServer()
s.parse_args(['-r', '-c', '/tmp/my-config', '-j', 'cnf.jsn', '-t', 'cnf.tml', '-tr', '/my/trans/', '-l', '/my_logs/', '-b', '3'])
assert s.config_path == '/tmp/my-config'
assert s.json_config == 'cnf.jsn'
assert s.toml_config == 'cnf.tml'
assert s.trans_path == '/my/trans/'
assert s.rel_urls == True
assert s.log_path == '/my_logs/'
assert s.log_backups == 3
def test_parse_args_long(self):
s = self.FakeServer()
s.parse_args(['--rel_urls', '--config_path', '/tmp/my-config', '--json_config', 'cnf.jsn',
'--toml_config', 'cnf.tml', '--trans_path', '/my/trans/', '--log_path', '/my_logs/',
'--log_backups', '3'])
assert s.config_path == '/tmp/my-config'
assert s.json_config == 'cnf.jsn'
assert s.toml_config == 'cnf.tml'
assert s.trans_path == '/my/trans/'
assert s.rel_urls == True
assert s.log_path == '/my_logs/'
assert s.log_backups == 3
def test_parse_args_invalid(self):
s = self.FakeServer()
with pytest.raises(SystemExit) as exc_info:
s.parse_args(['--inalid', '/tmp/my-config'])
assert exc_info.value.code == 2
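The exit code 2 comes from argparse itself, which reports unrecognized arguments via SystemExit(2). A standalone illustration with a throwaway parser:
import argparse
import pytest

def test_argparse_unknown_arg_exits_with_2():
    parser = argparse.ArgumentParser()  # throwaway parser, not the Server one
    parser.add_argument('--config_path')
    with pytest.raises(SystemExit) as exc_info:
        parser.parse_args(['--inalid', '/tmp/my-config'])
    assert exc_info.value.code == 2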
def test_init_logging_system(self):
s = self.FakeServer()
s.src_dir = 'app/src/'
s.init_logging_system()
assert s.log_backups == 0
assert s.log_level == None
assert logging.handlers.log_path == './log/'
assert logging.handlers.log_backups == 0
assert logging.getLogger().level == logging.DEBUG
assert logging.getLogger('msg').level == logging.DEBUG
assert logging.getLogger('conn').level == logging.DEBUG
assert logging.getLogger('data').level == logging.DEBUG
assert logging.getLogger('tracer').level == logging.INFO
assert logging.getLogger('asyncio').level == logging.INFO
assert logging.getLogger('hypercorn.access').level == logging.INFO
assert logging.getLogger('hypercorn.error').level == logging.INFO
os.environ["LOG_LVL"] = "WARN"
s.parse_args(['--log_backups', '3'])
s.init_logging_system()
assert s.log_backups == 3
assert s.log_level == logging.WARNING
assert logging.handlers.log_backups == 3
assert logging.getLogger().level == s.log_level
assert logging.getLogger('msg').level == s.log_level
assert logging.getLogger('conn').level == s.log_level
assert logging.getLogger('data').level == s.log_level
assert logging.getLogger('tracer').level == s.log_level
assert logging.getLogger('asyncio').level == s.log_level
assert logging.getLogger('hypercorn.access').level == logging.INFO
assert logging.getLogger('hypercorn.error').level == logging.INFO
def test_build_config_error(self, caplog):
s = self.FakeServer()
s.src_dir = 'app/src/'
s.toml_config = 'app/tests/cnf/invalid_config.toml'
with caplog.at_level(logging.ERROR):
s.build_config()
assert "Can't read from app/tests/cnf/invalid_config.toml" in caplog.text
assert "Key 'port' error:" in caplog.text
class TestHypercornLogHndl:
class FakeServer(Server):
def __init__(self):
pass  # don't call the super().__init__ for unit tests
def test_save_and_restore(self, capsys):
s = self.FakeServer()
s.src_dir = 'app/src/'
s.init_logging_system()
h = HypercornLogHndl()
assert h.must_fix == False
assert len(h.access_hndl) == 0
assert len(h.error_hndl) == 0
h.save()
assert h.must_fix == True
assert len(h.access_hndl) == 1
assert len(h.error_hndl) == 2
assert h.access_hndl == logging.getLogger('hypercorn.access').handlers
assert h.error_hndl == logging.getLogger('hypercorn.error').handlers
logging.getLogger('hypercorn.access').handlers = []
logging.getLogger('hypercorn.error').handlers = []
h.restore()
assert h.must_fix == False
assert h.access_hndl == logging.getLogger('hypercorn.access').handlers
assert h.error_hndl == logging.getLogger('hypercorn.error').handlers
output = capsys.readouterr().out.rstrip()
assert "* Fix hypercorn.access setting" in output
assert "* Fix hypercorn.error setting" in output
h.restore() # a second restore does nothing
assert h.must_fix == False
output = capsys.readouterr().out.rstrip()
assert output == ''
h.save() # save the same values a second time
assert h.must_fix == True
h.restore() # restore without changing the handlers
assert h.must_fix == False
output = capsys.readouterr().out.rstrip()
assert output == ''
class TestApp:
@pytest.mark.asyncio
async def test_ready(self):
"""Test the ready route."""
ProxyState.set_up(False)
client = app.test_client()
response = await client.get('/-/ready')
assert response.status_code == 503
result = await response.get_data()
assert result == b"Not ready"
ProxyState.set_up(True)
response = await client.get('/-/ready')
assert response.status_code == 200
result = await response.get_data()
assert result == b"Is ready"
@pytest.mark.asyncio
async def test_healthy(self):
"""Test the healthy route."""
ProxyState.set_up(False)
client = app.test_client()
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
ProxyState.set_up(True)
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"
ProxyState.set_up(True)
response = await client.get('/-/healthy')
assert response.status_code == 200
result = await response.get_data()
assert result == b"I'm fine"

444
app/tests/test_solarman.py Normal file → Executable file
View File

@@ -79,6 +79,7 @@ class MemoryStream(SolarmanV5):
self.key = ''
self.data = ''
self.msg_recvd = []
def write_cb(self):
if self.test_exception_async_write:
@@ -461,6 +462,39 @@ def inverter_ind_msg800(): # 0x4210 rated Power 800W
msg += b'\x15'
return msg
@pytest.fixture
def inverter_ind_msg900(): # 0x4210 rated Power 900W
msg = b'\xa5\x99\x01\x10\x42\xe6\x9e' +get_sn() +b'\x01\xb0\x02\xbc\xc8'
msg += b'\x24\x32\x6c\x1f\x00\x00\xa0\x47\xe4\x33\x01\x00\x03\x08\x00\x00'
msg += b'\x59\x31\x37\x45\x37\x41\x30\x46\x30\x31\x30\x42\x30\x31\x33\x45'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x01\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x40\x10\x08\xc8\x00\x49\x13\x8d\x00\x36\x00\x00\x03\x84\x06\x7a'
msg += b'\x01\x61\x00\xa8\x02\x54\x01\x5a\x00\x8a\x01\xe4\x01\x5a\x00\xbd'
msg += b'\x02\x8f\x00\x11\x00\x01\x00\x00\x00\x0b\x00\x00\x27\x98\x00\x04'
msg += b'\x00\x00\x0c\x04\x00\x03\x00\x00\x0a\xe7\x00\x05\x00\x00\x0c\x75'
msg += b'\x00\x00\x00\x00\x06\x16\x02\x00\x00\x00\x55\xaa\x00\x01\x00\x00'
msg += b'\x00\x00\x00\x00\xff\xff\x03\x84\x00\x03\x04\x00\x04\x00\x04\x00'
msg += b'\x04\x00\x00\x01\xff\xff\x00\x01\x00\x06\x00\x68\x00\x68\x05\x00'
msg += b'\x09\xcd\x07\xb6\x13\x9c\x13\x24\x00\x01\x07\xae\x04\x0f\x00\x41'
msg += b'\x00\x0f\x0a\x64\x0a\x64\x00\x06\x00\x06\x09\xf6\x12\x8c\x12\x8c'
msg += b'\x00\x10\x00\x10\x14\x52\x14\x52\x00\x10\x00\x10\x01\x51\x00\x05'
msg += b'\x04\x00\x00\x01\x13\x9c\x0f\xa0\x00\x4e\x00\x66\x03\xe8\x04\x00'
msg += b'\x09\xce\x07\xa8\x13\x9c\x13\x26\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x04\x00\x04\x00\x00\x00\x00\x00\xff\xff\x00\x00'
msg += b'\x00\x00\x00\x00'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def inverter_ind_msg_81(): # 0x4210 fcode 0x81
msg = b'\xa5\x99\x01\x10\x42\x02\x03' +get_sn() +b'\x81\xb0\x02\xbc\xc8'
@@ -675,6 +709,19 @@ def msg_modbus_rsp(): # 0x1510
msg += b'\x15'
return msg
@pytest.fixture
def msg_modbus_rsp_mb_4(): # 0x1510, MODBUS Type:4
msg = b'\xa5\x3b\x00\x10\x15\x03\x03' +get_sn() +b'\x02\x01'
msg += total()
msg += hb()
msg += b'\x0a\xe2\xfa\x33\x01\x04\x28\x40\x10\x08\xd8'
msg += b'\x00\x00\x13\x87\x00\x31\x00\x68\x02\x58\x00\x00\x01\x53\x00\x02'
msg += b'\x00\x00\x01\x52\x00\x02\x00\x00\x01\x53\x00\x03\x00\x00\x00\x04'
msg += b'\x00\x01\x00\x00\x9e\xa4'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def msg_modbus_interim_rsp(): # 0x0510
msg = b'\xa5\x3b\x00\x10\x15\x03\x03' +get_sn() +b'\x02\x01'
@@ -811,6 +858,26 @@ def dcu_data_rsp_msg(): # 0x1210
msg += b'\x15'
return msg
@pytest.fixture
def dcu_command_ind_msg(): # 0x4510
msg = b'\xa5\x17\x00\x10\x45\x94\x02' +get_dcu_sn() +b'\x05\x26\x30'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x01\x01\x06\x01\x00\x01\x03\xe8'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def dcu_command_rsp_msg(): # 0x1510
msg = b'\xa5\x11\x00\x10\x15\x94\x03' +get_dcu_sn() +b'\x05\x01'
msg += total()
msg += hb()
msg += b'\x00\x00\x00\x00'
msg += b'\x01\x01\x01'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def config_tsun_allow_all():
Config.act_config = {
@@ -853,9 +920,20 @@ def config_tsun_scan_dcu():
@pytest.fixture
def config_tsun_dcu1():
Config.act_config = {'solarman':{'enabled': True},'batteries':{'4100000000000001':{'monitor_sn': 2070233888, 'node_id':'inv1/', 'modbus_polling': True, 'suggested_area':'roof', 'sensor_list': 0}}}
Config.act_config = {
'ha':{
'auto_conf_prefix': 'homeassistant',
'discovery_prefix': 'homeassistant',
'entity_prefix': 'tsun',
'proxy_node_id': 'test_1',
'proxy_unique_id': ''
},
'solarman':{'enabled': True, 'host': 'test_cloud.local', 'port': 1234},'batteries':{'4100000000000001':{'monitor_sn': 2070233888, 'node_id':'inv1/', 'modbus_polling': True, 'suggested_area':'roof', 'sensor_list': 0}}}
Proxy.class_init()
Proxy.mqtt = Mqtt()
def test_read_message(device_ind_msg):
@pytest.mark.asyncio
async def test_read_message(device_ind_msg):
Config.act_config = {'solarman':{'enabled': True}}
m = MemoryStream(device_ind_msg, (0,))
m.read() # read complete msg, and dispatch msg
@@ -873,10 +951,12 @@ def test_read_message(device_ind_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_invalid_start_byte(invalid_start_byte, device_ind_msg):
@pytest.mark.asyncio
async def test_invalid_start_byte(invalid_start_byte, device_ind_msg):
# received a message with wrong start byte plus a valid message
# the complete receive buffer must be cleared to
# find the next valid message
Config.act_config = {'solarman':{'enabled': True}}
m = MemoryStream(invalid_start_byte, (0,))
m.append_msg(device_ind_msg)
m.read() # read complete msg, and dispatch msg
@@ -894,10 +974,12 @@ def test_invalid_start_byte(invalid_start_byte, device_ind_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 1
m.close()
def test_invalid_stop_byte(invalid_stop_byte):
@pytest.mark.asyncio
async def test_invalid_stop_byte(invalid_stop_byte):
# received a message with wrong stop byte
# the complete receive buffer must be cleared to
# find the next valid message
Config.act_config = {'solarman':{'enabled': True}}
m = MemoryStream(invalid_stop_byte, (0,))
m.read() # read complete msg, and dispatch msg
assert not m.header_valid # must be invalid, since start byte is wrong
@@ -914,9 +996,11 @@ def test_invalid_stop_byte(invalid_stop_byte):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 1
m.close()
def test_invalid_stop_byte2(invalid_stop_byte, device_ind_msg):
@pytest.mark.asyncio
async def test_invalid_stop_byte2(invalid_stop_byte, device_ind_msg):
# received a message with wrong stop byte plus a valid message
# only the first message must be discarded
Config.act_config = {'solarman':{'enabled': True}}
m = MemoryStream(invalid_stop_byte, (0,))
m.append_msg(device_ind_msg)
@@ -939,11 +1023,13 @@ def test_invalid_stop_byte2(invalid_stop_byte, device_ind_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 1
m.close()
def test_invalid_stop_start_byte(invalid_stop_byte, invalid_start_byte):
@pytest.mark.asyncio
async def test_invalid_stop_start_byte(invalid_stop_byte, invalid_start_byte):
# received a message with wrong stop byte plus an invalid message
# with a wrong start byte
# the complete receive buffer must be cleared to
# find the next valid message
Config.act_config = {'solarman':{'enabled': True}}
m = MemoryStream(invalid_stop_byte, (0,))
m.append_msg(invalid_start_byte)
m.read() # read complete msg, and dispatch msg
@@ -961,9 +1047,11 @@ def test_invalid_stop_start_byte(invalid_stop_byte, invalid_start_byte):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 1
m.close()
def test_invalid_checksum(invalid_checksum, device_ind_msg):
@pytest.mark.asyncio
async def test_invalid_checksum(invalid_checksum, device_ind_msg):
# received a message with wrong checksum plus a valid message
# only the first message must be discarded
Config.act_config = {'solarman':{'enabled': True}}
m = MemoryStream(invalid_checksum, (0,))
m.append_msg(device_ind_msg)
@@ -985,7 +1073,8 @@ def test_invalid_checksum(invalid_checksum, device_ind_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 1
m.close()
def test_read_message_twice(config_no_tsun_inv1, device_ind_msg, device_rsp_msg):
@pytest.mark.asyncio
async def test_read_message_twice(config_no_tsun_inv1, device_ind_msg, device_rsp_msg):
_ = config_no_tsun_inv1
m = MemoryStream(device_ind_msg, (0,))
m.append_msg(device_ind_msg)
@@ -1006,7 +1095,9 @@ def test_read_message_twice(config_no_tsun_inv1, device_ind_msg, device_rsp_msg)
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_read_message_in_chunks(device_ind_msg):
@pytest.mark.asyncio
async def test_read_message_in_chunks(device_ind_msg):
Config.act_config = {'solarman':{'enabled': True}}
m = MemoryStream(device_ind_msg, (4,11,0))
m.read() # read 4 bytes, header incomplete
assert not m.header_valid # must be invalid, since header not complete
@@ -1027,7 +1118,8 @@ def test_read_message_in_chunks(device_ind_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_read_message_in_chunks2(config_tsun_inv1, device_ind_msg):
@pytest.mark.asyncio
async def test_read_message_in_chunks2(my_loop, config_tsun_inv1, device_ind_msg):
_ = config_tsun_inv1
m = MemoryStream(device_ind_msg, (4,10,0))
m.read() # read 4 bytes, header incomplete
@@ -1052,7 +1144,8 @@ def test_read_message_in_chunks2(config_tsun_inv1, device_ind_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_read_two_messages(config_tsun_allow_all, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg):
@pytest.mark.asyncio
async def test_read_two_messages(my_loop, config_tsun_allow_all, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg):
_ = config_tsun_allow_all
m = MemoryStream(device_ind_msg, (0,))
m.append_msg(inverter_ind_msg)
@@ -1080,7 +1173,8 @@ def test_read_two_messages(config_tsun_allow_all, device_ind_msg, device_rsp_msg
assert m.ifc.tx_fifo.get()==b''
m.close()
def test_read_two_messages2(config_tsun_allow_all, inverter_ind_msg, inverter_ind_msg_81, inverter_rsp_msg, inverter_rsp_msg_81):
@pytest.mark.asyncio
async def test_read_two_messages2(my_loop, config_tsun_allow_all, inverter_ind_msg, inverter_ind_msg_81, inverter_rsp_msg, inverter_rsp_msg_81):
_ = config_tsun_allow_all
m = MemoryStream(inverter_ind_msg, (0,))
m.append_msg(inverter_ind_msg_81)
@@ -1105,7 +1199,8 @@ def test_read_two_messages2(config_tsun_allow_all, inverter_ind_msg, inverter_in
assert m.ifc.tx_fifo.get()==b''
m.close()
def test_read_two_messages3(config_tsun_allow_all, device_ind_msg2, device_rsp_msg2, inverter_ind_msg, inverter_rsp_msg):
@pytest.mark.asyncio
async def test_read_two_messages3(my_loop, config_tsun_allow_all, device_ind_msg2, device_rsp_msg2, inverter_ind_msg, inverter_rsp_msg):
# test device message received after the inverter msg
_ = config_tsun_allow_all
m = MemoryStream(inverter_ind_msg, (0,))
@@ -1134,7 +1229,8 @@ def test_read_two_messages3(config_tsun_allow_all, device_ind_msg2, device_rsp_m
assert m.ifc.tx_fifo.get()==b''
m.close()
def test_read_two_messages4(config_tsun_dcu1, dcu_dev_ind_msg, dcu_dev_rsp_msg, dcu_data_ind_msg, dcu_data_rsp_msg):
@pytest.mark.asyncio
async def test_read_two_messages4(my_loop, config_tsun_dcu1, dcu_dev_ind_msg, dcu_dev_rsp_msg, dcu_data_ind_msg, dcu_data_rsp_msg):
_ = config_tsun_dcu1
m = MemoryStream(dcu_dev_ind_msg, (0,))
m.append_msg(dcu_data_ind_msg)
@@ -1162,7 +1258,8 @@ def test_read_two_messages4(config_tsun_dcu1, dcu_dev_ind_msg, dcu_dev_rsp_msg,
assert m.ifc.tx_fifo.get()==b''
m.close()
def test_unkown_frame_code(config_tsun_inv1, inverter_ind_msg_81, inverter_rsp_msg_81):
@pytest.mark.asyncio
async def test_unkown_frame_code(my_loop, config_tsun_inv1, inverter_ind_msg_81, inverter_rsp_msg_81):
_ = config_tsun_inv1
m = MemoryStream(inverter_ind_msg_81, (0,))
m.read() # read complete msg, and dispatch msg
@@ -1180,7 +1277,8 @@ def test_unkown_frame_code(config_tsun_inv1, inverter_ind_msg_81, inverter_rsp_m
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_unkown_message(config_tsun_inv1, unknown_msg):
@pytest.mark.asyncio
async def test_unkown_message(my_loop, config_tsun_inv1, unknown_msg):
_ = config_tsun_inv1
m = MemoryStream(unknown_msg, (0,))
m.read() # read complete msg, and dispatch msg
@@ -1198,7 +1296,8 @@ def test_unkown_message(config_tsun_inv1, unknown_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_device_rsp(config_tsun_inv1, device_rsp_msg):
@pytest.mark.asyncio
async def test_device_rsp(my_loop, config_tsun_inv1, device_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(device_rsp_msg, (0,), False)
m.read() # read complete msg, and dispatch msg
@@ -1216,7 +1315,8 @@ def test_device_rsp(config_tsun_inv1, device_rsp_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_inverter_rsp(config_tsun_inv1, inverter_rsp_msg):
@pytest.mark.asyncio
async def test_inverter_rsp(my_loop, config_tsun_inv1, inverter_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(inverter_rsp_msg, (0,), False)
m.read() # read complete msg, and dispatch msg
@@ -1234,7 +1334,8 @@ def test_inverter_rsp(config_tsun_inv1, inverter_rsp_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_heartbeat_ind(config_tsun_inv1, heartbeat_ind_msg, heartbeat_rsp_msg):
@pytest.mark.asyncio
async def test_heartbeat_ind(my_loop, config_tsun_inv1, heartbeat_ind_msg, heartbeat_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(heartbeat_ind_msg, (0,))
m.read() # read complete msg, and dispatch msg
@@ -1251,7 +1352,8 @@ def test_heartbeat_ind(config_tsun_inv1, heartbeat_ind_msg, heartbeat_rsp_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_heartbeat_ind2(config_tsun_inv1, heartbeat_ind_msg, heartbeat_rsp_msg):
@pytest.mark.asyncio
async def test_heartbeat_ind2(my_loop, config_tsun_inv1, heartbeat_ind_msg, heartbeat_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(heartbeat_ind_msg, (0,))
m.no_forwarding = True
@@ -1269,7 +1371,8 @@ def test_heartbeat_ind2(config_tsun_inv1, heartbeat_ind_msg, heartbeat_rsp_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_heartbeat_rsp(config_tsun_inv1, heartbeat_rsp_msg):
@pytest.mark.asyncio
async def test_heartbeat_rsp(my_loop, config_tsun_inv1, heartbeat_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(heartbeat_rsp_msg, (0,), False)
m.read() # read complete msg, and dispatch msg
@@ -1287,7 +1390,8 @@ def test_heartbeat_rsp(config_tsun_inv1, heartbeat_rsp_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_sync_start_ind(config_tsun_inv1, sync_start_ind_msg, sync_start_rsp_msg, sync_start_fwd_msg):
@pytest.mark.asyncio
async def test_sync_start_ind(my_loop, config_tsun_inv1, sync_start_ind_msg, sync_start_rsp_msg, sync_start_fwd_msg):
_ = config_tsun_inv1
m = MemoryStream(sync_start_ind_msg, (0,))
m.read() # read complete msg, and dispatch msg
@@ -1310,7 +1414,8 @@ def test_sync_start_ind(config_tsun_inv1, sync_start_ind_msg, sync_start_rsp_msg
m.close()
def test_sync_start_rsp(config_tsun_inv1, sync_start_rsp_msg):
@pytest.mark.asyncio
async def test_sync_start_rsp(my_loop, config_tsun_inv1, sync_start_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(sync_start_rsp_msg, (0,), False)
m.read() # read complete msg, and dispatch msg
@@ -1328,7 +1433,8 @@ def test_sync_start_rsp(config_tsun_inv1, sync_start_rsp_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_sync_end_ind(config_tsun_inv1, sync_end_ind_msg, sync_end_rsp_msg):
@pytest.mark.asyncio
async def test_sync_end_ind(my_loop, config_tsun_inv1, sync_end_ind_msg, sync_end_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(sync_end_ind_msg, (0,))
m.read() # read complete msg, and dispatch msg
@@ -1345,7 +1451,8 @@ def test_sync_end_ind(config_tsun_inv1, sync_end_ind_msg, sync_end_rsp_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_sync_end_rsp(config_tsun_inv1, sync_end_rsp_msg):
@pytest.mark.asyncio
async def test_sync_end_rsp(my_loop, config_tsun_inv1, sync_end_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(sync_end_rsp_msg, (0,), False)
m.read() # read complete msg, and dispatch msg
@@ -1363,7 +1470,8 @@ def test_sync_end_rsp(config_tsun_inv1, sync_end_rsp_msg):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_build_modell_600(config_tsun_allow_all, inverter_ind_msg):
@pytest.mark.asyncio
async def test_build_modell_600(my_loop, config_tsun_allow_all, inverter_ind_msg):
_ = config_tsun_allow_all
m = MemoryStream(inverter_ind_msg, (0,))
assert 0 == m.sensor_list
@@ -1373,6 +1481,7 @@ def test_build_modell_600(config_tsun_allow_all, inverter_ind_msg):
m.read() # read complete msg, and dispatch msg
assert 2000 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 600 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 4 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS2000(600)' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
assert '02b0' == m.db.get_db_value(Register.SENSOR_LIST, None)
assert 0 == m.sensor_list # must not be set by an inverter data ind
@@ -1382,7 +1491,8 @@ def test_build_modell_600(config_tsun_allow_all, inverter_ind_msg):
assert m.ifc.tx_fifo.get()==b''
m.close()
def test_build_modell_1600(config_tsun_allow_all, inverter_ind_msg1600):
@pytest.mark.asyncio
async def test_build_modell_1600(my_loop, config_tsun_allow_all, inverter_ind_msg1600):
_ = config_tsun_allow_all
m = MemoryStream(inverter_ind_msg1600, (0,))
assert 0 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
@@ -1391,10 +1501,12 @@ def test_build_modell_1600(config_tsun_allow_all, inverter_ind_msg1600):
m.read() # read complete msg, and dispatch msg
assert 1600 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 1600 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 4 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS1600' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
def test_build_modell_1800(config_tsun_allow_all, inverter_ind_msg1800):
@pytest.mark.asyncio
async def test_build_modell_1800(my_loop, config_tsun_allow_all, inverter_ind_msg1800):
_ = config_tsun_allow_all
m = MemoryStream(inverter_ind_msg1800, (0,))
assert 0 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
@@ -1403,10 +1515,12 @@ def test_build_modell_1800(config_tsun_allow_all, inverter_ind_msg1800):
m.read() # read complete msg, and dispatch msg
assert 1800 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 1800 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 4 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS1800' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
def test_build_modell_2000(config_tsun_allow_all, inverter_ind_msg2000):
@pytest.mark.asyncio
async def test_build_modell_2000(my_loop, config_tsun_allow_all, inverter_ind_msg2000):
_ = config_tsun_allow_all
m = MemoryStream(inverter_ind_msg2000, (0,))
assert 0 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
@@ -1415,10 +1529,12 @@ def test_build_modell_2000(config_tsun_allow_all, inverter_ind_msg2000):
m.read() # read complete msg, and dispatch msg
assert 2000 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 2000 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 4 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS2000' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
def test_build_modell_800(config_tsun_allow_all, inverter_ind_msg800):
@pytest.mark.asyncio
async def test_build_modell_800(my_loop, config_tsun_allow_all, inverter_ind_msg800):
_ = config_tsun_allow_all
m = MemoryStream(inverter_ind_msg800, (0,))
assert 0 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
@@ -1427,10 +1543,26 @@ def test_build_modell_800(config_tsun_allow_all, inverter_ind_msg800):
m.read() # read complete msg, and dispatch msg
assert 800 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 800 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 2 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MS800' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
@pytest.mark.asyncio
async def test_build_modell_900(my_loop, config_tsun_allow_all, inverter_ind_msg900):
_ = config_tsun_allow_all
m = MemoryStream(inverter_ind_msg900, (0,))
assert 0 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert None == m.db.get_db_value(Register.RATED_POWER, None)
assert None == m.db.get_db_value(Register.INVERTER_TEMP, None)
m.read() # read complete msg, and dispatch msg
assert 900 == m.db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
assert 900 == m.db.get_db_value(Register.RATED_POWER, 0)
assert 2 == m.db.get_db_value(Register.NO_INPUTS, 0)
assert 'TSOL-MSxx00' == m.db.get_db_value(Register.EQUIPMENT_MODEL, 0)
m.close()
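The NO_INPUTS assertions added to the model tests above imply a simple rule: the 800 W and 900 W units report two PV inputs, while units with a designed power of 1600/1800/2000 W report four. A hypothetical helper that captures only these asserted values (not project code) could look like this:
# Values taken solely from the assertions in the model tests above; hypothetical helper.
# Keyed on MAX_DESIGNED_POWER rather than RATED_POWER, since the 600 W variant of the
# MS2000 still reports four inputs.
EXPECTED_NO_INPUTS = {800: 2, 900: 2, 1600: 4, 1800: 4, 2000: 4}

def expected_no_inputs(max_designed_power: int) -> int:
    # Default to two inputs, matching the smaller value asserted for the 800/900 W units.
    return EXPECTED_NO_INPUTS.get(max_designed_power, 2)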
def test_build_logger_modell(config_tsun_allow_all, device_ind_msg):
@pytest.mark.asyncio
async def test_build_logger_modell(my_loop, config_tsun_allow_all, device_ind_msg):
_ = config_tsun_allow_all
m = MemoryStream(device_ind_msg, (0,))
assert 0 == m.db.get_db_value(Register.COLLECTOR_FW_VERSION, 0)
@@ -1441,7 +1573,8 @@ def test_build_logger_modell(config_tsun_allow_all, device_ind_msg):
assert 'V1.1.00.0B' == m.db.get_db_value(Register.COLLECTOR_FW_VERSION, 0).rstrip('\00')
m.close()
def test_msg_iterator():
@pytest.mark.asyncio
async def test_msg_iterator(my_loop, config_tsun_inv1):
Message._registry.clear()
m1 = SolarmanV5(None, ('test1.local', 1234), ifc=AsyncIfcImpl(), server_side=True, client_mode=False)
m2 = SolarmanV5(None, ('test2.local', 1234), ifc=AsyncIfcImpl(), server_side=True, client_mode=False)
@@ -1462,7 +1595,8 @@ def test_msg_iterator():
assert test1 == 1
assert test2 == 1
def test_proxy_counter():
@pytest.mark.asyncio
async def test_proxy_counter(my_loop, config_tsun_inv1):
m = SolarmanV5(None, ('test.local', 1234), ifc=AsyncIfcImpl(), server_side=True, client_mode=False)
assert m.new_data == {}
m.db.stat['proxy']['Unknown_Msg'] = 0
@@ -1481,7 +1615,7 @@ def test_proxy_counter():
m.close()
@pytest.mark.asyncio
async def test_msg_build_modbus_req(config_tsun_inv1, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg, msg_modbus_cmd):
async def test_msg_build_modbus_req(my_loop, config_tsun_inv1, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg, msg_modbus_cmd):
_ = config_tsun_inv1
m = MemoryStream(device_ind_msg, (0,), True)
m.read()
@@ -1516,7 +1650,7 @@ async def test_msg_build_modbus_req(config_tsun_inv1, device_ind_msg, device_rsp
m.close()
@pytest.mark.asyncio
async def test_at_cmd(config_tsun_allow_all, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg, at_command_ind_msg, at_command_rsp_msg):
async def test_at_cmd(my_loop, config_tsun_allow_all, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg, at_command_ind_msg, at_command_rsp_msg):
_ = config_tsun_allow_all
m = MemoryStream(device_ind_msg, (0,), True)
m.read() # read device ind
@@ -1576,7 +1710,7 @@ async def test_at_cmd(config_tsun_allow_all, device_ind_msg, device_rsp_msg, inv
m.close()
@pytest.mark.asyncio
async def test_at_cmd_blocked(config_tsun_allow_all, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg, at_command_ind_msg):
async def test_at_cmd_blocked(my_loop, config_tsun_allow_all, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg, at_command_ind_msg):
_ = config_tsun_allow_all
m = MemoryStream(device_ind_msg, (0,), True)
m.read()
@@ -1610,7 +1744,8 @@ async def test_at_cmd_blocked(config_tsun_allow_all, device_ind_msg, device_rsp_
assert Proxy.mqtt.data == "'AT+WEBU' is forbidden"
m.close()
def test_at_cmd_ind(config_tsun_inv1, at_command_ind_msg, at_command_rsp_msg):
@pytest.mark.asyncio
async def test_at_cmd_ind(my_loop, config_tsun_inv1, at_command_ind_msg, at_command_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(at_command_ind_msg, (0,), False)
m.db.stat['proxy']['Unknown_Ctrl'] = 0
@@ -1645,7 +1780,8 @@ def test_at_cmd_ind(config_tsun_inv1, at_command_ind_msg, at_command_rsp_msg):
m.close()
def test_at_cmd_ind_block(config_tsun_inv1, at_command_ind_msg_block):
@pytest.mark.asyncio
async def test_at_cmd_ind_block(my_loop, config_tsun_inv1, at_command_ind_msg_block):
_ = config_tsun_inv1
m = MemoryStream(at_command_ind_msg_block, (0,), False)
m.db.stat['proxy']['Unknown_Ctrl'] = 0
@@ -1673,7 +1809,8 @@ def test_at_cmd_ind_block(config_tsun_inv1, at_command_ind_msg_block):
assert Proxy.mqtt.data == ""
m.close()
def test_msg_at_command_rsp1(config_tsun_inv1, at_command_rsp_msg):
@pytest.mark.asyncio
async def test_msg_at_command_rsp1(my_loop, config_tsun_inv1, at_command_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(at_command_rsp_msg)
m.db.stat['proxy']['Unknown_Ctrl'] = 0
@@ -1692,7 +1829,8 @@ def test_msg_at_command_rsp1(config_tsun_inv1, at_command_rsp_msg):
assert m.db.stat['proxy']['Modbus_Command'] == 0
m.close()
def test_msg_at_command_rsp2(config_tsun_inv1, at_command_rsp_msg):
@pytest.mark.asyncio
async def test_msg_at_command_rsp2(my_loop, config_tsun_inv1, at_command_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(at_command_rsp_msg)
m.db.stat['proxy']['Unknown_Ctrl'] = 0
@@ -1713,7 +1851,8 @@ def test_msg_at_command_rsp2(config_tsun_inv1, at_command_rsp_msg):
assert Proxy.mqtt.data == "+ok"
m.close()
def test_msg_at_command_rsp3(config_tsun_inv1, at_command_interim_rsp_msg):
@pytest.mark.asyncio
async def test_msg_at_command_rsp3(my_loop, config_tsun_inv1, at_command_interim_rsp_msg):
_ = config_tsun_inv1
m = MemoryStream(at_command_interim_rsp_msg)
m.db.stat['proxy']['Unknown_Ctrl'] = 0
@@ -1738,7 +1877,8 @@ def test_msg_at_command_rsp3(config_tsun_inv1, at_command_interim_rsp_msg):
assert Proxy.mqtt.data == ""
m.close()
def test_msg_modbus_req(config_tsun_inv1, msg_modbus_cmd, msg_modbus_cmd_fwd):
@pytest.mark.asyncio
async def test_msg_modbus_req(my_loop, config_tsun_inv1, msg_modbus_cmd, msg_modbus_cmd_fwd):
_ = config_tsun_inv1
m = MemoryStream(b'')
m.snr = get_sn_int()
@@ -1766,7 +1906,8 @@ def test_msg_modbus_req(config_tsun_inv1, msg_modbus_cmd, msg_modbus_cmd_fwd):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_msg_modbus_req_seq(config_tsun_inv1, msg_modbus_cmd_seq):
@pytest.mark.asyncio
async def test_msg_modbus_req_seq(my_loop, config_tsun_inv1, msg_modbus_cmd_seq):
_ = config_tsun_inv1
m = MemoryStream(b'')
m.snr = get_sn_int()
@@ -1794,7 +1935,8 @@ def test_msg_modbus_req_seq(config_tsun_inv1, msg_modbus_cmd_seq):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_msg_modbus_req2(config_tsun_inv1, msg_modbus_cmd_crc_err):
@pytest.mark.asyncio
async def test_msg_modbus_req2(my_loop, config_tsun_inv1, msg_modbus_cmd_crc_err):
_ = config_tsun_inv1
m = MemoryStream(b'')
m.snr = get_sn_int()
@@ -1821,7 +1963,8 @@ def test_msg_modbus_req2(config_tsun_inv1, msg_modbus_cmd_crc_err):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 1
m.close()
def test_msg_unknown_cmd_req(config_tsun_inv1, msg_unknown_cmd):
@pytest.mark.asyncio
async def test_msg_unknown_cmd_req(my_loop, config_tsun_inv1, msg_unknown_cmd):
_ = config_tsun_inv1
m = MemoryStream(msg_unknown_cmd, (0,), False)
m.db.stat['proxy']['Unknown_Ctrl'] = 0
@@ -1843,7 +1986,8 @@ def test_msg_unknown_cmd_req(config_tsun_inv1, msg_unknown_cmd):
assert m.db.stat['proxy']['Invalid_Msg_Format'] == 0
m.close()
def test_msg_modbus_rsp1(config_tsun_inv1, msg_modbus_rsp):
@pytest.mark.asyncio
async def test_msg_modbus_rsp1(my_loop, config_tsun_inv1, msg_modbus_rsp):
'''Modbus response without a valid Modbus request must be dropped'''
_ = config_tsun_inv1
m = MemoryStream(msg_modbus_rsp)
@@ -1862,7 +2006,8 @@ def test_msg_modbus_rsp1(config_tsun_inv1, msg_modbus_rsp):
assert m.db.stat['proxy']['Modbus_Command'] == 0
m.close()
def test_msg_modbus_rsp2(config_tsun_inv1, msg_modbus_rsp):
@pytest.mark.asyncio
async def test_msg_modbus_rsp2(my_loop, config_tsun_inv1, msg_modbus_rsp):
'''Modbus response with a valid Modbus request must be forwarded'''
_ = config_tsun_inv1 # setup config structure
m = MemoryStream(msg_modbus_rsp)
@@ -1899,7 +2044,8 @@ def test_msg_modbus_rsp2(config_tsun_inv1, msg_modbus_rsp):
m.close()
def test_msg_modbus_rsp3(config_tsun_inv1, msg_modbus_rsp):
@pytest.mark.asyncio
async def test_msg_modbus_rsp3(my_loop, config_tsun_inv1, msg_modbus_rsp):
'''Modbus response with a valid Modbus request must be forwarded'''
_ = config_tsun_inv1
m = MemoryStream(msg_modbus_rsp)
@@ -1935,7 +2081,8 @@ def test_msg_modbus_rsp3(config_tsun_inv1, msg_modbus_rsp):
m.close()
def test_msg_unknown_rsp(config_tsun_inv1, msg_unknown_cmd_rsp):
@pytest.mark.asyncio
async def test_msg_unknown_rsp(my_loop, config_tsun_inv1, msg_unknown_cmd_rsp):
_ = config_tsun_inv1
m = MemoryStream(msg_unknown_cmd_rsp)
m.db.stat['proxy']['Unknown_Ctrl'] = 0
@@ -1953,7 +2100,8 @@ def test_msg_unknown_rsp(config_tsun_inv1, msg_unknown_cmd_rsp):
assert m.db.stat['proxy']['Modbus_Command'] == 0
m.close()
def test_msg_modbus_invalid(config_tsun_inv1, msg_modbus_invalid):
@pytest.mark.asyncio
async def test_msg_modbus_invalid(my_loop, config_tsun_inv1, msg_modbus_invalid):
_ = config_tsun_inv1
m = MemoryStream(msg_modbus_invalid, (0,), False)
m.db.stat['proxy']['Unknown_Ctrl'] = 0
@@ -1967,7 +2115,8 @@ def test_msg_modbus_invalid(config_tsun_inv1, msg_modbus_invalid):
assert m.db.stat['proxy']['Modbus_Command'] == 0
m.close()
def test_msg_modbus_fragment(config_tsun_inv1, msg_modbus_rsp):
@pytest.mark.asyncio
async def test_msg_modbus_fragment(my_loop, config_tsun_inv1, msg_modbus_rsp):
_ = config_tsun_inv1
# receive more bytes than expected (7 bytes from the next msg)
m = MemoryStream(msg_modbus_rsp+b'\x00\x00\x00\x45\x10\x52\x31', (0,))
@@ -1993,7 +2142,7 @@ def test_msg_modbus_fragment(config_tsun_inv1, msg_modbus_rsp):
m.close()
@pytest.mark.asyncio
async def test_modbus_polling(config_tsun_inv1, heartbeat_ind_msg, heartbeat_rsp_msg):
async def test_modbus_polling(my_loop, config_tsun_inv1, heartbeat_ind_msg, heartbeat_rsp_msg):
_ = config_tsun_inv1
assert asyncio.get_running_loop()
m = MemoryStream(heartbeat_ind_msg, (0,))
@@ -2106,7 +2255,62 @@ async def test_modbus_scaning(config_tsun_scan, heartbeat_ind_msg, heartbeat_rsp
m.close()
@pytest.mark.asyncio
async def test_start_client_mode(config_tsun_inv1, str_test_ip):
async def test_modbus_scaning_inv_rsp(config_tsun_scan, heartbeat_ind_msg, heartbeat_rsp_msg, msg_modbus_rsp_mb_4):
_ = config_tsun_scan
assert asyncio.get_running_loop()
m = MemoryStream(heartbeat_ind_msg, (0x15,0x56,0))
m.append_msg(msg_modbus_rsp_mb_4)
assert m.mb_scan == False
assert asyncio.get_running_loop() == m.mb_timer.loop
m.db.stat['proxy']['Unknown_Ctrl'] = 0
assert m.mb_timer.tim == None
m.read() # read complete msg, and dispatch msg
assert m.mb_scan == True
assert m.mb_start_reg == 0xff80
assert m.mb_step == 0x40
assert m.mb_bytes == 0x14
assert asyncio.get_running_loop() == m.mb_timer.loop
assert not m.header_valid # must be invalid, since msg was handled and buffer flushed
assert m.msg_count == 1
assert m.snr == 2070233889
assert m.control == 0x4710
assert m.msg_recvd[0]['control']==0x4710
assert m.msg_recvd[0]['seq']=='84:11'
assert m.msg_recvd[0]['data_len']==0x1
assert m.ifc.tx_fifo.get()==heartbeat_rsp_msg
assert m.ifc.fwd_fifo.get()==heartbeat_ind_msg
assert m.db.stat['proxy']['Unknown_Ctrl'] == 0
m.ifc.tx_clear() # clear send buffer for next test
assert isclose(m.mb_timeout, 0.5)
assert next(m.mb_timer.exp_count) == 0
await asyncio.sleep(0.5)
assert m.sent_pdu==b'\xa5\x17\x00\x10E\x12\x84!Ce{\x02\xb0\x02\x00\x00\x00\x00\x00\x00' \
b'\x00\x00\x00\x00\x00\x00\x01\x03\xff\xc0\x00\x14\x75\xed\x33\x15'
assert m.ifc.tx_fifo.get()==b''
m.read() # read complete msg, and dispatch msg
assert not m.header_valid # must be invalid, since msg was handled and buffer flushed
assert m.msg_count == 2
assert m.msg_recvd[1]['control']==0x1510
assert m.msg_recvd[1]['seq']=='03:03'
assert m.msg_recvd[1]['data_len']==0x3b
assert m.mb.last_addr == 1
assert m.mb.last_fcode == 3
assert m.mb.last_reg == 0xffc0 # mb_start_reg + mb_step
assert m.mb.last_len == 20
assert m.mb.err == 3
assert next(m.mb_timer.exp_count) == 2
m.close()
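The sent_pdu asserted above embeds a plain Modbus read request, and the m.mb.last_* assertions later in the test confirm how it decodes. A small self-contained illustration of that framing (byte values copied from the test, variable names hypothetical):
# Values copied from test_modbus_scaning_inv_rsp above; illustration only.
mb_pdu = b'\x01\x03\xff\xc0\x00\x14'  # addr 1, function 3, start 0xffc0, 20 registers
addr = mb_pdu[0]
fcode = mb_pdu[1]
start_reg = int.from_bytes(mb_pdu[2:4], 'big')
count = int.from_bytes(mb_pdu[4:6], 'big')
assert (addr, fcode, start_reg, count) == (1, 3, 0xffc0, 20)
assert start_reg == 0xff80 + 0x40  # mb_start_reg + mb_step, as the test itself notes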
@pytest.mark.asyncio
async def test_start_client_mode(my_loop, config_tsun_inv1, str_test_ip):
_ = config_tsun_inv1
assert asyncio.get_running_loop()
m = MemoryStream(b'')
@@ -2210,7 +2414,8 @@ async def test_start_client_mode_scan(config_tsun_scan_dcu, str_test_ip, dcu_mod
m.close()
def test_timeout(config_tsun_inv1):
@pytest.mark.asyncio
async def test_timeout(my_loop, config_tsun_inv1):
_ = config_tsun_inv1
m = MemoryStream(b'')
assert m.state == State.init
@@ -2223,7 +2428,8 @@ def test_timeout(config_tsun_inv1):
m.state = State.closed
m.close()
def test_fnc_dispatch():
@pytest.mark.asyncio
async def test_fnc_dispatch(my_loop, config_tsun_inv1):
def msg():
return
@@ -2244,7 +2450,8 @@ def test_fnc_dispatch():
assert _obj == m.msg_unknown
assert _str == "'msg_unknown'"
def test_timestamp():
@pytest.mark.asyncio
async def test_timestamp(my_loop, config_tsun_inv1):
m = MemoryStream(b'')
ts = m._timestamp()
ts_emu = m._emu_timestamp()
@@ -2271,7 +2478,7 @@ class InverterTest(InverterBase):
@pytest.mark.asyncio
async def test_proxy_at_cmd(config_tsun_inv1, patch_open_connection, at_command_ind_msg, at_command_rsp_msg):
async def test_proxy_at_cmd(my_loop, config_tsun_inv1, patch_open_connection, at_command_ind_msg, at_command_rsp_msg):
_ = config_tsun_inv1
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -2309,7 +2516,7 @@ async def test_proxy_at_cmd(config_tsun_inv1, patch_open_connection, at_command_
assert Proxy.mqtt.data == ""
@pytest.mark.asyncio
async def test_proxy_at_blocked(config_tsun_inv1, patch_open_connection, at_command_ind_msg_block, at_command_rsp_msg):
async def test_proxy_at_blocked(my_loop, config_tsun_inv1, patch_open_connection, at_command_ind_msg_block, at_command_rsp_msg):
_ = config_tsun_inv1
_ = patch_open_connection
assert asyncio.get_running_loop()
@@ -2345,3 +2552,124 @@ async def test_proxy_at_blocked(config_tsun_inv1, patch_open_connection, at_comm
assert Proxy.mqtt.key == 'tsun/inv1/at_resp'
assert Proxy.mqtt.data == "+ok"
@pytest.mark.asyncio
async def test_dcu_cmd(my_loop, config_tsun_allow_all, dcu_dev_ind_msg, dcu_dev_rsp_msg, dcu_data_ind_msg, dcu_data_rsp_msg, dcu_command_ind_msg, dcu_command_rsp_msg):
'''test dcu_power command for a DCU device with sensor 0x3026'''
_ = config_tsun_allow_all
m = MemoryStream(dcu_dev_ind_msg, (0,), True)
m.read() # read device ind
assert m.control == 0x4110
assert str(m.seq) == '01:92'
assert m.ifc.tx_fifo.get()==dcu_dev_rsp_msg
assert m.ifc.fwd_fifo.get()==dcu_dev_ind_msg
m.send_dcu_cmd(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
assert m.ifc.tx_fifo.get()==b''
assert m.ifc.fwd_fifo.get()==b''
assert m.sent_pdu == b''
assert str(m.seq) == '01:92'
assert Proxy.mqtt.key == ''
assert Proxy.mqtt.data == ""
m.append_msg(dcu_data_ind_msg)
m.read() # read inverter ind
assert m.control == 0x4210
assert str(m.seq) == '02:93'
assert m.ifc.tx_fifo.get()==dcu_data_rsp_msg
assert m.ifc.fwd_fifo.get()==dcu_data_ind_msg
m.send_dcu_cmd(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
assert m.ifc.fwd_fifo.get() == b''
assert m.ifc.tx_fifo.get()== b''
assert m.sent_pdu == dcu_command_ind_msg
m.sent_pdu = bytearray()
assert str(m.seq) == '02:94'
assert Proxy.mqtt.key == ''
assert Proxy.mqtt.data == ""
m.append_msg(dcu_command_rsp_msg)
m.read() # read at resp
assert m.control == 0x1510
assert str(m.seq) == '03:94'
assert m.ifc.rx_get()==b''
assert m.ifc.tx_fifo.get()==b''
assert m.ifc.fwd_fifo.get()==b''
assert Proxy.mqtt.key == 'tsun/dcu_resp'
assert Proxy.mqtt.data == "+ok"
Proxy.mqtt.clear() # clear last test result
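Both DCU tests reuse the same send_dcu_cmd payload. The diff only gives the raw bytes; the decoding below is an interpretation (the trailing word is 0x03e8 = 1000), not something the tests assert themselves.
# Interpretation only - the tests pass these bytes opaquely to send_dcu_cmd().
payload = b'\x01\x01\x06\x01\x00\x01\x03\xe8'
trailing_word = int.from_bytes(payload[-2:], 'big')
assert trailing_word == 1000  # 0x03e8; plausibly the requested dcu_power value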
@pytest.mark.asyncio
async def test_dcu_cmd_not_supported(my_loop, config_tsun_allow_all, device_ind_msg, device_rsp_msg, inverter_ind_msg, inverter_rsp_msg):
'''test that an inverter doesn't accept the dcu_power command'''
_ = config_tsun_allow_all
m = MemoryStream(device_ind_msg, (0,), True)
m.read() # read device ind
assert m.control == 0x4110
assert str(m.seq) == '01:01'
assert m.ifc.tx_fifo.get()==device_rsp_msg
assert m.ifc.fwd_fifo.get()==device_ind_msg
m.send_dcu_cmd(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
assert m.ifc.tx_fifo.get()==b''
assert m.ifc.fwd_fifo.get()==b''
assert m.sent_pdu == b''
assert str(m.seq) == '01:01'
assert Proxy.mqtt.key == ''
assert Proxy.mqtt.data == ""
m.append_msg(inverter_ind_msg)
m.read() # read inverter ind
assert m.control == 0x4210
assert str(m.seq) == '02:02'
assert m.ifc.tx_fifo.get()==inverter_rsp_msg
assert m.ifc.fwd_fifo.get()==inverter_ind_msg
m.send_dcu_cmd(b'\x01\x01\x06\x01\x00\x01\x03\xe8')
assert m.ifc.fwd_fifo.get() == b''
assert m.ifc.tx_fifo.get()== b''
assert m.sent_pdu == b''
Proxy.mqtt.clear() # clear last test result
@pytest.mark.asyncio
async def test_proxy_dcu_cmd(my_loop, config_tsun_dcu1, patch_open_connection, dcu_command_ind_msg, dcu_command_rsp_msg):
_ = config_tsun_dcu1
_ = patch_open_connection
assert asyncio.get_running_loop()
with InverterTest(FakeReader(), FakeWriter(), client_mode=False) as inverter:
await inverter.create_remote()
await asyncio.sleep(0)
r = inverter.remote.stream
l = inverter.local.stream
l.db.stat['proxy']['DCU_Command'] = 0
l.db.stat['proxy']['AT_Command'] = 0
l.db.stat['proxy']['Unknown_Ctrl'] = 0
l.db.stat['proxy']['AT_Command_Blocked'] = 0
l.db.stat['proxy']['Modbus_Command'] = 0
inverter.forward_dcu_cmd_resp = False
r.append_msg(dcu_command_ind_msg)
r.read() # read complete msg, and dispatch msg
assert inverter.forward_dcu_cmd_resp
inverter.forward(r,l)
assert l.ifc.tx_fifo.get()==dcu_command_ind_msg
assert l.db.stat['proxy']['Invalid_Msg_Format'] == 0
assert l.db.stat['proxy']['DCU_Command'] == 1
assert l.db.stat['proxy']['AT_Command'] == 0
assert l.db.stat['proxy']['AT_Command_Blocked'] == 0
assert l.db.stat['proxy']['Modbus_Command'] == 0
assert 2 == l.db.get_db_value(Register.NO_INPUTS, 0)
l.append_msg(dcu_command_rsp_msg)
l.read() # read at resp
assert l.ifc.fwd_fifo.peek()==dcu_command_rsp_msg
inverter.forward(l,r)
assert r.ifc.tx_fifo.get()==dcu_command_rsp_msg
assert Proxy.mqtt.key == ''
assert Proxy.mqtt.data == ""

View File

@@ -9,6 +9,9 @@ from infos import Infos, Register
from test_solarman import FakeIfc, FakeInverter, MemoryStream, get_sn_int, get_sn, correct_checksum, config_tsun_inv1, msg_modbus_rsp
from test_infos_g3p import str_test_ip, bytes_test_ip
pytest_plugins = ('pytest_asyncio',)
timestamp = 0x3224c8bc
class InvStream(MemoryStream):
@@ -125,17 +128,17 @@ def heartbeat_ind():
msg = b'\xa5\x01\x00\x10G\x00\x01\x00\x00\x00\x00\x00Y\x15'
return msg
def test_emu_init_close():
# received a message with a wrong start byte plus a valid message
# the complete receive buffer must be cleared to
# find the next valid message
@pytest.mark.asyncio
async def test_emu_init_close(my_loop, config_tsun_inv1):
_ = config_tsun_inv1
assert asyncio.get_running_loop()
inv = InvStream()
cld = CldStream(inv)
cld.close()
@pytest.mark.asyncio
async def test_emu_start(config_tsun_inv1, msg_modbus_rsp, str_test_ip, device_ind_msg):
async def test_emu_start(my_loop, config_tsun_inv1, msg_modbus_rsp, str_test_ip, device_ind_msg):
_ = config_tsun_inv1
assert asyncio.get_running_loop()
inv = InvStream(msg_modbus_rsp)
@@ -152,7 +155,8 @@ async def test_emu_start(config_tsun_inv1, msg_modbus_rsp, str_test_ip, device_i
assert inv.ifc.fwd_fifo.peek() == device_ind_msg
cld.close()
def test_snd_hb(config_tsun_inv1, heartbeat_ind):
@pytest.mark.asyncio
async def test_snd_hb(my_loop, config_tsun_inv1, heartbeat_ind):
_ = config_tsun_inv1
inv = InvStream()
cld = CldStream(inv)
@@ -163,7 +167,7 @@ def test_snd_hb(config_tsun_inv1, heartbeat_ind):
cld.close()
@pytest.mark.asyncio
async def test_snd_inv_data(config_tsun_inv1, inverter_ind_msg, inverter_rsp_msg):
async def test_snd_inv_data(my_loop, config_tsun_inv1, inverter_ind_msg, inverter_rsp_msg):
_ = config_tsun_inv1
inv = InvStream()
inv.db.set_db_def_value(Register.INVERTER_STATUS, 1)
@@ -205,7 +209,7 @@ async def test_snd_inv_data(config_tsun_inv1, inverter_ind_msg, inverter_rsp_msg
cld.close()
@pytest.mark.asyncio
async def test_rcv_invalid(config_tsun_inv1, inverter_ind_msg, inverter_rsp_msg):
async def test_rcv_invalid(my_loop, config_tsun_inv1, inverter_ind_msg, inverter_rsp_msg):
_ = config_tsun_inv1
inv = InvStream()
assert asyncio.get_running_loop() == inv.mb_timer.loop

View File

@@ -1048,7 +1048,8 @@ def msg_inverter_ms3000_ind(): # Data indication from the controller
msg += b'\x53\x00\x66' # | S.f'
return msg
def test_read_message(msg_contact_info):
@pytest.mark.asyncio
async def test_read_message(msg_contact_info):
Config.act_config = {'tsun':{'enabled': True}}
m = MemoryStream(msg_contact_info, (0,))
m.read() # read complete msg, and dispatch msg

View File

@@ -1,13 +1,24 @@
# test_with_pytest.py
import pytest
from server import app
from web import Web, web
from async_stream import AsyncStreamClient
from gen3plus.inverter_g3p import InverterG3P
from test_inverter_g3p import FakeReader, FakeWriter, config_conn
from cnf.config import Config
from mock import patch
from proxy import Proxy
import os, errno
from os import DirEntry, stat_result
import datetime
pytest_plugins = ('pytest_asyncio',)
@pytest.fixture(scope="session")
def client():
app.secret_key = 'super secret key'
return app.test_client()
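The session-scoped client fixture above replaces the per-test app.test_client() calls that the hunks below remove. A minimal, purely hypothetical route test consuming the fixture (pytest and the client fixture are already available in this file) would follow the same shape as the converted tests:
# Hypothetical example only - mirrors the converted route tests below.
@pytest.mark.asyncio
async def test_some_route(client):
    response = await client.get('/')  # any route registered on the Quart app
    assert response.status_code == 200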
@pytest.fixture
def create_inverter(config_conn):
_ = config_conn
@@ -36,65 +47,80 @@ def create_inverter_client(config_conn):
return inv
@pytest.mark.asyncio
async def test_home():
async def test_home(client):
"""Test the home route."""
client = app.test_client()
response = await client.get('/')
assert response.status_code == 200
assert response.mimetype == 'text/html'
@pytest.mark.asyncio
async def test_page():
"""Test the empty page route."""
client = app.test_client()
response = await client.get('/page')
async def test_page(client):
"""Test the mqtt page route."""
response = await client.get('/mqtt')
assert response.status_code == 200
assert response.mimetype == 'text/html'
@pytest.mark.asyncio
async def test_rel_page(client):
"""Test the mqtt route."""
web.build_relative_urls = True
response = await client.get('/mqtt')
assert response.status_code == 200
assert response.mimetype == 'text/html'
web.build_relative_urls = False
@pytest.mark.asyncio
async def test_favicon96():
async def test_notes(client):
"""Test the notes page route."""
response = await client.get('/notes')
assert response.status_code == 200
assert response.mimetype == 'text/html'
@pytest.mark.asyncio
async def test_logging(client):
"""Test the logging page route."""
response = await client.get('/logging')
assert response.status_code == 200
assert response.mimetype == 'text/html'
@pytest.mark.asyncio
async def test_favicon96(client):
"""Test the favicon-96x96.png route."""
client = app.test_client()
response = await client.get('/favicon-96x96.png')
assert response.status_code == 200
assert response.mimetype == 'image/png'
@pytest.mark.asyncio
async def test_favicon():
async def test_favicon(client):
"""Test the favicon.ico route."""
client = app.test_client()
response = await client.get('/favicon.ico')
assert response.status_code == 200
assert response.mimetype == 'image/x-icon'
@pytest.mark.asyncio
async def test_favicon_svg():
async def test_favicon_svg(client):
"""Test the favicon.svg route."""
client = app.test_client()
response = await client.get('/favicon.svg')
assert response.status_code == 200
assert response.mimetype == 'image/svg+xml'
@pytest.mark.asyncio
async def test_apple_touch_icon():
async def test_apple_touch_icon(client):
"""Test the apple-touch-icon.png route."""
client = app.test_client()
response = await client.get('/apple-touch-icon.png')
assert response.status_code == 200
assert response.mimetype == 'image/png'
@pytest.mark.asyncio
async def test_manifest():
async def test_manifest(client):
"""Test the site.webmanifest route."""
client = app.test_client()
response = await client.get('/site.webmanifest')
assert response.status_code == 200
assert response.mimetype == 'application/manifest+json'
@pytest.mark.asyncio
async def test_data_fetch(create_inverter):
"""Test the healthy route."""
"""Test the data-fetch route."""
_ = create_inverter
client = app.test_client()
response = await client.get('/data-fetch')
@@ -105,7 +131,7 @@ async def test_data_fetch(create_inverter):
@pytest.mark.asyncio
async def test_data_fetch1(create_inverter_server):
"""Test the healthy route."""
"""Test the data-fetch route with server connection."""
_ = create_inverter_server
client = app.test_client()
response = await client.get('/data-fetch')
@@ -116,7 +142,7 @@ async def test_data_fetch1(create_inverter_server):
@pytest.mark.asyncio
async def test_data_fetch2(create_inverter_client):
"""Test the healthy route."""
"""Test the data-fetch route with client connection."""
_ = create_inverter_client
client = app.test_client()
response = await client.get('/data-fetch')
@@ -124,3 +150,144 @@ async def test_data_fetch2(create_inverter_client):
response = await client.get('/data-fetch')
assert response.status_code == 200
@pytest.mark.asyncio
async def test_language_en(client):
"""Test the language/en route and cookie."""
response = await client.get('/language/en', headers={'referer': '/index'})
assert response.status_code == 302
assert response.content_language.pop() == 'en'
assert response.location == '/index'
assert response.mimetype == 'text/html'
client.set_cookie('test', key='language', value='de')
response = await client.get('/mqtt')
assert response.status_code == 200
assert response.mimetype == 'text/html'
@pytest.mark.asyncio
async def test_language_de(client):
"""Test the language/de route."""
response = await client.get('/language/de', headers={'referer': '/'})
assert response.status_code == 302
assert response.content_language.pop() == 'de'
assert response.location == '/'
assert response.mimetype == 'text/html'
@pytest.mark.asyncio
async def test_language_unknown(client):
"""Test the language/unknown route."""
response = await client.get('/language/unknown')
assert response.status_code == 404
assert response.mimetype == 'text/html'
@pytest.mark.asyncio
async def test_mqtt_fetch(client, create_inverter):
"""Test the mqtt-fetch route."""
_ = create_inverter
Proxy.class_init()
response = await client.get('/mqtt-fetch')
assert response.status_code == 200
@pytest.mark.asyncio
async def test_notes_fetch(client, config_conn):
"""Test the notes-fetch route."""
_ = config_conn
response = await client.get('/notes-fetch')
assert response.status_code == 200
@pytest.mark.asyncio
async def test_file_fetch(client, config_conn, monkeypatch):
"""Test the data-fetch route."""
_ = config_conn
assert Config.log_path == 'app/tests/log/'
def my_stat1(*arg):
stat = stat_result
stat.st_size = 20
stat.st_birthtime = datetime.datetime(2024, 1, 31, 10, 30, 15)
stat.st_mtime = datetime.datetime(2024, 1, 1, 1, 30, 15).timestamp()
return stat
monkeypatch.setattr(DirEntry, "stat", my_stat1)
response = await client.get('/file-fetch')
assert response.status_code == 200
def my_stat2(*arg):
stat = stat_result
stat.st_size = 20
stat.st_mtime = datetime.datetime(2024, 1, 1, 1, 30, 15).timestamp()
return stat
monkeypatch.setattr(DirEntry, "stat", my_stat2)
monkeypatch.delattr(stat_result, "st_birthtime")
response = await client.get('/file-fetch')
assert response.status_code == 200
@pytest.mark.asyncio
async def test_send_file(client, config_conn):
"""Test the send-file route."""
_ = config_conn
assert Config.log_path == 'app/tests/log/'
response = await client.get('/send-file/test.txt')
assert response.status_code == 200
@pytest.mark.asyncio
async def test_missing_send_file(client, config_conn):
"""Test the send-file route (file not found)."""
_ = config_conn
assert Config.log_path == 'app/tests/log/'
response = await client.get('/send-file/no_file.log')
assert response.status_code == 404
@pytest.mark.asyncio
async def test_invalid_send_file(client, config_conn):
"""Test the send-file route (invalid filename)."""
_ = config_conn
assert Config.log_path == 'app/tests/log/'
response = await client.get('/send-file/../test_web_route.py')
assert response.status_code == 404
@pytest.fixture
def patch_os_remove_err():
def new_remove(file_path: str):
raise OSError(errno.ENOENT, os.strerror(errno.ENOENT), file_path)
with patch.object(os, 'remove', new_remove) as wrapped_os:
yield wrapped_os
@pytest.fixture
def patch_os_remove_ok():
def new_remove(file_path: str):
return
with patch.object(os, 'remove', new_remove) as wrapped_os:
yield wrapped_os
@pytest.mark.asyncio
async def test_del_file_ok(client, config_conn, patch_os_remove_ok):
"""Test the del-file route with no error."""
_ = config_conn
_ = patch_os_remove_ok
assert Config.log_path == 'app/tests/log/'
response = await client.delete('/del-file/test.txt')
assert response.status_code == 204
@pytest.mark.asyncio
async def test_del_file_err(client, config_conn, patch_os_remove_err):
"""Test the send-file route with OSError."""
_ = config_conn
_ = patch_os_remove_err
assert Config.log_path == 'app/tests/log/'
response = await client.delete('/del-file/test.txt')
assert response.status_code == 404

View File

@@ -8,7 +8,7 @@ msgid ""
msgstr ""
"Project-Id-Version: tsun-gen3-proxy 0.14.0\n"
"Report-Msgid-Bugs-To: EMAIL@ADDRESS\n"
"POT-Creation-Date: 2025-04-20 21:21+0200\n"
"POT-Creation-Date: 2025-05-13 22:34+0200\n"
"PO-Revision-Date: 2025-04-18 16:24+0200\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: de\n"
@@ -19,47 +19,181 @@ msgstr ""
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.17.0\n"
#: src/web/templates/base.html.j2:27
msgid "Updated:"
msgstr "Aktualisiert:"
#: src/web/templates/base.html.j2:46
#: src/web/conn_table.py:53 src/web/templates/base.html.j2:58
msgid "Connections"
msgstr "Verbindungen"
#: src/web/templates/index.html.j2:5
#: src/web/conn_table.py:60
msgid "Device-IP:Port"
msgstr "Geräte-IP:Port"
#: src/web/conn_table.py:60
msgid "Device-IP"
msgstr "Geräte-IP"
#: src/web/conn_table.py:61 src/web/mqtt_table.py:34
msgid "Serial-No"
msgstr "Seriennummer"
#: src/web/conn_table.py:62
msgid "Cloud-IP:Port"
msgstr "Cloud-IP:Port"
#: src/web/conn_table.py:62
msgid "Cloud-IP"
msgstr "Cloud-IP"
#: src/web/log_files.py:48
msgid "n/a"
msgstr "keine Angabe"
#: src/web/mqtt_table.py:27
msgid "MQTT devices"
msgstr "MQTT Geräte"
#: src/web/mqtt_table.py:35
msgid "Node-ID"
msgstr ""
#: src/web/mqtt_table.py:36
msgid "HA-Area"
msgstr ""
#: src/web/templates/base.html.j2:37
msgid "Updated:"
msgstr "Aktualisiert:"
#: src/web/templates/base.html.j2:49
msgid "Version:"
msgstr ""
#: src/web/templates/base.html.j2:60 src/web/templates/page_notes.html.j2:5
msgid "Important Messages"
msgstr "Wichtige Hinweise"
#: src/web/templates/base.html.j2:61 src/web/templates/page_logging.html.j2:5
msgid "Log Files"
msgstr "Log Dateien"
#: src/web/templates/page_index.html.j2:3
msgid "TSUN Proxy - Connections"
msgstr "TSUN Proxy - Verbindungen"
#: src/web/templates/page_index.html.j2:5
msgid "Proxy Connection Overview"
msgstr "Proxy Verbindungen"
#: src/web/templates/index.html.j2:16
#: src/web/templates/page_index.html.j2:17
msgid "Server Mode"
msgstr "Server Modus"
#: src/web/templates/index.html.j2:17
#: src/web/templates/page_index.html.j2:18
msgid "Established from device to proxy"
msgstr "Vom Gerät zum Proxy aufgebaut"
#: src/web/templates/index.html.j2:27
#: src/web/templates/page_index.html.j2:30
msgid "Client Mode"
msgstr "Client Modus"
#: src/web/templates/index.html.j2:28
#: src/web/templates/page_index.html.j2:31
msgid "Established from proxy to device"
msgstr "Vom Proxy zum Gerät aufgebaut"
#: src/web/templates/index.html.j2:38
#: src/web/templates/page_index.html.j2:43
msgid "Proxy Mode"
msgstr "Proxy Modus"
#: src/web/templates/index.html.j2:39
#: src/web/templates/page_index.html.j2:44
msgid "Forwarding data to cloud"
msgstr "Weiterleitung in die Cloud"
#: src/web/templates/index.html.j2:49
#: src/web/templates/page_index.html.j2:56
msgid "Emu Mode"
msgstr "Emu Modus"
#: src/web/templates/index.html.j2:50
#: src/web/templates/page_index.html.j2:57
msgid "Emulation sends data to cloud"
msgstr "Emulation sendet in die Cloud"
#: src/web/templates/page_logging.html.j2:3
msgid "TSUN Proxy - Log Files"
msgstr "TSUN Proxy - Log Dateien"
#: src/web/templates/page_logging.html.j2:10
msgid "Do you really want to delete the log file: <br>%(file)s ?"
msgstr "Soll die Datei: <br>%(file)s<br>wirklich gelöscht werden?"
#: src/web/templates/page_logging.html.j2:12
msgid "Delete File"
msgstr "Datei löschen"
#: src/web/templates/page_logging.html.j2:13
msgid "Abort"
msgstr "Abbruch"
#: src/web/templates/page_mqtt.html.j2:3
msgid "TSUN Proxy - MQTT Status"
msgstr ""
#: src/web/templates/page_mqtt.html.j2:5
msgid "MQTT Overview"
msgstr "MQTT Überblick"
#: src/web/templates/page_mqtt.html.j2:16
msgid "Connection Time"
msgstr "Verbindungszeit"
#: src/web/templates/page_mqtt.html.j2:17
msgid "Time at which the connection was established"
msgstr "Zeitpunkt des Verbindungsaufbaus"
#: src/web/templates/page_mqtt.html.j2:29
msgid "Published Topics"
msgstr "Gesendete Topics"
#: src/web/templates/page_mqtt.html.j2:30
msgid "Number of published topics"
msgstr "Anzahl der veröffentlichten Topics"
#: src/web/templates/page_mqtt.html.j2:42
msgid "Received Topics"
msgstr "Empfangene Topics"
#: src/web/templates/page_mqtt.html.j2:43
msgid "Number of topics received"
msgstr "Anzahl der empfangenen Topics"
#: src/web/templates/page_notes.html.j2:3
msgid "TSUN Proxy - Important Messages"
msgstr "TSUN Proxy - Wichtige Hinweise"
#: src/web/templates/templ_log_files_list.html.j2:11
msgid "Created"
msgstr "Erzeugt"
#: src/web/templates/templ_log_files_list.html.j2:11
msgid "Modified"
msgstr "Modifiziert"
#: src/web/templates/templ_log_files_list.html.j2:11
msgid "Size"
msgstr "Größe"
#: src/web/templates/templ_log_files_list.html.j2:20
msgid "Download File"
msgstr "Datei Download"
#: src/web/templates/templ_notes_list.html.j2:3
msgid "Warnings and error messages"
msgstr "Warnungen und Fehlermeldungen"
#: src/web/templates/templ_notes_list.html.j2:18
msgid "Well done!"
msgstr "Gut gemacht!"
#: src/web/templates/templ_notes_list.html.j2:19
msgid "No warnings or errors have been logged since the last proxy start."
msgstr ""
"Seit dem letzten Proxystart wurden keine Warnungen oder Fehler "
"protokolliert."

View File

@@ -87,6 +87,7 @@ SRC_FILES := $(wildcard $(SRC_PROXY)/*.py)\
$(wildcard $(SRC_PROXY)/cnf/*.toml)\
$(wildcard $(SRC_PROXY)/gen3/*.py)\
$(wildcard $(SRC_PROXY)/gen3plus/*.py)\
$(wildcard $(SRC_PROXY)/utils/*.py)\
$(wildcard $(SRC_PROXY)/web/*.py)\
$(wildcard $(SRC_PROXY)/web/templates/*.html.j2)\
$(wildcard $(SRC_PROXY)/web/static/css/*.css)\
@@ -191,7 +192,7 @@ $(repro_all_subdirs) :
mkdir -p $@
$(repro_all_templates) : $(INST_BASE)/ha_addon_%/config.yaml: $(TEMPL)/config.jinja $(TEMPL)/%_data.json $(SRC)/.version FORCE
$(JINJA) --strict -D AppVersion=$(VERSION)-$* -D BuildID=$(BUILD_ID) $< $(filter %.json,$^) -o $@
$(JINJA) --strict -D AppVersion=$(VERSION)-$*$(RC) -D BuildID=$(BUILD_ID) $< $(filter %.json,$^) -o $@
$(repro_all_apparmor) : $(INST_BASE)/ha_addon_%/apparmor.txt: $(TEMPL)/apparmor.jinja $(TEMPL)/%_data.json
$(JINJA) --strict $< $(filter %.json,$^) -o $@

View File

@@ -13,12 +13,12 @@
# 1 Build Base Image #
######################
ARG BUILD_FROM="ghcr.io/hassio-addons/base:17.2.4"
ARG BUILD_FROM="ghcr.io/hassio-addons/base:17.2.5"
# hadolint ignore=DL3006
FROM $BUILD_FROM AS base
# Install Python, pip and virtual environment tools
RUN apk add --no-cache python3=3.12.10-r0 py3-pip=24.3.1-r0 && \
RUN apk add --no-cache python3=3.12.10-r1 py3-pip=24.3.1-r0 && \
python -m venv /opt/venv && \
. /opt/venv/bin/activate

View File

@@ -30,4 +30,4 @@ cd /home/proxy || exit
export VERSION=$(cat /proxy-version.txt)
echo "Start Proxyserver..."
python3 server.py --json_config=/data/options.json --log_path=/homeassistant/tsun-proxy/logs/ --config_path=/homeassistant/tsun-proxy/ --log_backups=2
python3 server.py --rel_urls --json_config=/data/options.json --log_path=/homeassistant/tsun-proxy/logs/ --config_path=/homeassistant/tsun-proxy/ --log_backups=2

View File

@@ -2,7 +2,6 @@
{
"name": "TSUN-Proxy (Release Candidate)",
"description": "MQTT Proxy for TSUN Photovoltaic Inverters",
"version": "rc",
"image": "ghcr.io/s-allius/tsun-gen3-addon",
"slug": "tsun-proxy-rc",
"advanced": true,