Compare commits

..

192 Commits

Author SHA1 Message Date
Stefan Allius
00657c31f3 Fix rel build (#372)
* build rel without BUILD_ID

* update changelog
2025-04-13 21:22:45 +02:00
Stefan Allius
22ebad2edb Update rel 0.13.0 (#371)
* update compose help link

(cherry picked from commit 6d4ff0d508)

* fix link

(cherry picked from commit 3d422f9249)

* retrigger sonar qube test run

* fix rel build run

* bump version
2025-04-13 20:57:31 +02:00
Stefan Allius
e3c2672ea9 Fix rel build (#369)
* disable cache for rc build

* bump python version to 3.12.10-r0
2025-04-13 20:37:34 +02:00
Stefan Allius
86d9fc8c8f Update rel 0.13.0 (#366)
* update compose help link

(cherry picked from commit 6d4ff0d508)

* fix link

(cherry picked from commit 3d422f9249)

* fix rel build run
2025-04-13 20:02:10 +02:00
Stefan Allius
9031b5c793 Update rel 0.13.0 (#365)
* update compose help link

(cherry picked from commit 6d4ff0d508)

* fix link

(cherry picked from commit 3d422f9249)
2025-04-13 19:20:08 +02:00
Stefan Allius
9f27c5a582 fix link
(cherry picked from commit 3d422f9249)
2025-04-13 18:55:16 +02:00
Stefan Allius
1445268b70 Merge pull request #357 from s-allius/main
define the value 2 for the out status (#356)
2025-04-08 00:11:18 +02:00
Stefan Allius
8ca91c2fdd define the value 2 for the out status (#356) 2025-04-07 23:47:29 +02:00
Stefan Allius
ea749dcce6 enforce numbered release candidates (#353) 2025-04-06 22:28:32 +02:00
Stefan Allius
af5604d029 add alarm bitfields (#352)
- fix bitfield of the inverter alarms
- add battery alarms
2025-04-06 20:07:17 +02:00
renovate[bot]
015b6b8db0 Update dependency pytest-cov to v6.1.1 (#346)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-04-06 01:28:27 +02:00
Stefan Allius
7782a3cb57 Add two states built from the measurements (#351)
* Add two states built from the measurements
- Battery status calculated from the battery current
- Power supply state calculated from the output power

* improve test coverage
2025-04-06 01:21:41 +02:00
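For context, a minimal sketch of how such derived states could be computed from the measurements (illustrative only; the thresholds and names are assumptions, not the proxy's code):

```python
def battery_status(batt_current_a: float, idle_band: float = 0.1) -> str:
    """Derive a coarse battery status from the signed battery current.

    Assumption: positive current means charging, negative means discharging,
    and values inside the idle band count as standby.
    """
    if batt_current_a > idle_band:
        return "charging"
    if batt_current_a < -idle_band:
        return "discharging"
    return "standby"


def power_supply_state(out_power_w: float) -> str:
    """Derive the power-supply state from the output power."""
    return "supplying" if out_power_w > 0 else "off"
```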
Stefan Allius
3d073acc58 Cleanup MQTT json format for DCU battery (#349)
* Cleanup MQTT json format for DCU battery
- add hw and sw version
- rename total generation to total charging energy
- rename cell temperature sensors
- restructure json format
- adapt unit tests

* revert changed test packages
2025-04-05 22:30:57 +02:00
Stefan Allius
6974672ba0 S allius/issue334 (#335)
* move forward_at_cmd_resp into InfosG3P class

- the variable is shared between the two connections
of an inverter. One is for the TSUN cloud and the
other for the device.

* use inverter class to share values between
the two protocol instances of a proxy
- move forward_at_cmd_resp into class InverterG3P
- store inverter ptr in Solarman_V5 instances
- add inverter ptr to all constructors of protocols
- adapt docs and unit tests
- add integration tests for AT+ commands which
  check the forwarding from and to the TSUN cloud

* adapt and improve the unit tests
- fix node_id declaration, which always has a / at
  the end. See config grammar for this rule
- set global var test to default after test run
2025-04-05 14:37:52 +02:00
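A schematic sketch of the sharing pattern described in #335: both protocol instances hold a pointer to one inverter object and exchange shared flags through it (class layout simplified and assumed, not the actual implementation):

```python
class InverterG3P:
    """Shared state for the two connections of one inverter
    (TSUN cloud side and device side)."""
    def __init__(self):
        self.forward_at_cmd_resp = False  # shared between both protocol instances


class SolarmanV5:
    """One protocol instance per connection; both reference the same inverter."""
    def __init__(self, inverter: InverterG3P):
        self.inverter = inverter

    def on_at_cmd(self):
        # setting the flag via one connection is visible to the other
        self.inverter.forward_at_cmd_resp = True


inv = InverterG3P()
cloud_side = SolarmanV5(inv)
device_side = SolarmanV5(inv)
```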
Stefan Allius
4988a29a34 S allius/issue340 (#345)
* build the README.md files for the HA Add-ons
2025-04-04 20:11:43 +02:00
Stefan Allius
970b611d47 fix systemtest (#344) 2025-04-04 18:38:17 +02:00
renovate[bot]
1ec97a3e9c Update ghcr.io/hassio-addons/base Docker tag to v17.2.3 (#342)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-04-04 14:51:13 +02:00
renovate[bot]
2707582a45 Update dependency flake8 to v7.2.0 (#330)
* Update dependency flake8 to v7.2.0

* Flake8: ignore F821 errors due to false positives

# cleanup some unit tests

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-04-04 14:29:00 +02:00
renovate[bot]
bcec8dd843 Update dependency aiohttp to v3.11.16 (#341)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-04-02 21:00:43 +02:00
renovate[bot]
1b5af7fa97 Update dependency pytest-cov to v6.1.0 (#339)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-04-01 23:17:42 +02:00
renovate[bot]
2731c68675 Update dependency aiohttp to v3.11.15 (#338)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-04-01 23:12:17 +02:00
renovate[bot]
a8f8eca06c Update dependency aiomqtt to v2.3.1 (#337)
* Update dependency aiomqtt to v2.3.1

* update aiomqtt badge

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2025-04-01 23:08:42 +02:00
renovate[bot]
f9eb4ad8d7 Update dependency coverage to v7.8.0 (#336)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-04-01 22:54:40 +02:00
Stefan Allius
0e65e90c25 add DDZY422-D2 as not supported (#333)
* add DDZY422-D2 as not supported

* describe unsupported devices more clearly
2025-03-30 16:40:02 +02:00
renovate[bot]
18b2a2bfb2 Update dependency python-dotenv to v1.1.0 (#332)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-30 01:13:17 +01:00
renovate[bot]
d1da8a85d3 Update dependency pytest-asyncio to v0.26.0 (#331)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-30 01:05:22 +01:00
Stefan Allius
433faecbb5 update uml diagrams (#329)
* update uml diagrams

* pin versions to make test runs reproducible

* add install target for easier dev env setup
2025-03-30 00:44:27 +01:00
Stefan Allius
632498c384 S allius/issue327 (#328)
* fix typo

* add DCU-1000 storage systems/batteries

* fix compatibility table

* concern ms3000 support
2025-03-26 23:24:56 +01:00
Stefan Allius
d9384a6118 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy 2025-03-26 19:43:34 +01:00
Stefan Allius
9ec111759a Merge pull request #326 from s-allius/dev-0.13
Dev 0.13
2025-03-26 19:40:42 +01:00
Stefan Allius
8d2dcb7212 S allius/issue320 (#324)
* add unit test for 0x4510 msg with frame type 5
2025-03-26 18:56:01 +01:00
Stefan Allius
32d7711ab7 S allius/issue321 (#325)
* support frame type no 8 for AT+ responses
2025-03-26 18:47:09 +01:00
Stefan Allius
dff8934b82 Dcu1000 (#312)
* set equipment model for DCU1000 devices

* DCU1000: add temp sensor and mppt states

* DCU1000: add total generation

* add more DCU1000 registers for MODBUS polling

* improve names of battery measurements

* add more diagnostic registers

* adapt unit tests

* move uml files into subfolder

* add sensors for battery cell voltages

* swap On and Off for MPPT status
2025-03-25 20:10:10 +01:00
Stefan Allius
3eb6a24dcb Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-03-23 23:53:22 +01:00
Stefan Allius
da383c7794 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy 2025-03-23 23:50:33 +01:00
renovate[bot]
f9be171865 Update ghcr.io/hassio-addons/base Docker tag to v17.2.2 (#315)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-23 23:43:25 +01:00
Stefan Allius
45abc69ffb fix Add-on build errors
- bump python to version 3.12.9-r0
- fix workspace path for VSCode
2025-03-23 23:38:12 +01:00
Stefan Allius
96c35ed263 bump python to version 3.12.9-r0 2025-03-23 23:31:46 +01:00
Stefan Allius
795a52e172 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-03-17 22:34:31 +01:00
renovate[bot]
5d1ee60baf Update dependency aiohttp to v3.11.14 (#311)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-17 22:29:14 +01:00
Stefan Allius
7cf9e98c7f Merge branch 'renovate/python-3.x' of https://github.com/s-allius/tsun-gen3-proxy into renovate/python-3.x 2025-03-16 19:34:34 +01:00
Stefan Allius
e0777dca8e Add support for MS-3000 inverter (#299)
* split register map into multiple maps

* add base support reg mapping 0x01900000

* fix shadowed builtin

* detect reg mapping for sensor automatically

* add device and test regs for MS-3000

* add more register mappings

* fix unit tests

* add more MS-3000 registers

* build model string for TSUN MS-3000

* add MS3000 unit test

* remove obsolete method __set_config_parms

* fix start addr of modbus scans

- in server mode the start addr must be reduced
  by mb_step

* add tests for sensor_list of ms-3000 inverters

* MS-3000: add integer test register

* DCU-1000: add Out Status register

* add integer test and battery out register

* fix Sonar Qube finding

* DCU-1000: add temp sensors
2025-03-16 18:49:01 +01:00
Stefan Allius
955657fd87 add first custom apparmor definition (#296)
* add first custom apparmor definition

* add initial apparmor support
2025-03-16 13:11:03 +01:00
Stefan Allius
ecd21e46fb add modbus scanner config for HA Add-ons 2025-03-15 17:16:54 +01:00
Stefan Allius
3489e8997d fix MQTT packet transmitting (#309) 2025-03-15 13:52:49 +01:00
Stefan Allius
88cb01f613 add Modbus polling mode for DCU1000 (#305)
* add Modbus scanning mode

* fix modbus polling for DCU 1000

* add modbus register for DCU 1000

* calculate meta values from modbus regs

* update changelog

* reduce code duplication

* refactor modbus_scan

* add additional unit tests
2025-03-11 19:47:37 +01:00
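As a rough illustration of the polling mode added in #305, a periodic Modbus read loop could look like the following; the register blocks and the `read_regs` callable are placeholders, not the proxy's API:

```python
import asyncio

# Placeholder register blocks to poll; the real DCU 1000 register map differs.
POLL_BLOCKS = [(0x3000, 16), (0x3010, 16)]

async def poll_modbus(read_regs, interval: float = 30.0):
    """Periodically read register blocks; meta values (e.g. power)
    would be calculated from the returned registers."""
    while True:
        for start, count in POLL_BLOCKS:
            regs = await read_regs(start, count)
            # derive meta values from regs here
        await asyncio.sleep(interval)
```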
Stefan Allius
be60f9ea1e calculate power values for DCU (#303)
* calculate power values for DCU

* refactor code
2025-03-02 21:09:03 +01:00
Stefan Allius
10b4a84701 allow R47serial numbers for GEN3 inverters (#302) 2025-03-02 19:05:36 +01:00
Stefan Allius
06ceb02f0d ignore apparmor.txt 2025-02-27 22:50:24 +01:00
Stefan Allius
8a2ca3ab9a fix the build target 2025-02-27 22:43:07 +01:00
Stefan Allius
3f3ed1b14f add watchdog for Add-ons (#291) 2025-02-27 16:11:32 +01:00
Stefan Allius
036dd6d1dc S allius/issue281 (#282)
* accept DCU serial number starting with '410'

* determine sensor-list by serial number

* adapt unit test for DCU support

* send first battery measurements to home assistant

* add test case for sensor-list==3036

* add more registers for batteries

* improve error logging (Monitoring SN)

* update the add-on repo only for one stage

* add configuration for energy storages

* add License and Readme file to the add-on

* addon: add date and time to dev and debug docker container tag

* disable duplicate code check for config.py

* cleanup unit test, remove trailing whitespaces

* update changelog

* fix example config for batteries

* cleanup config.jinja template

* fix comments

* improve help texts
2025-02-24 22:39:34 +01:00
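A minimal sketch of the serial-number based sensor-list selection described in #282 (only the '410' DCU prefix and the 3036 list come from the commit message; the other entry is an assumption for illustration):

```python
# Hypothetical prefix-to-sensor-list mapping.
SENSOR_LIST_BY_PREFIX = {
    "410": 3036,   # DCU / battery storage (from the commit message above)
    "R17": 688,    # GEN3 inverter (assumed value, for illustration only)
}

def sensor_list_for(serial_no: str) -> int | None:
    """Determine the sensor list by the serial-number prefix."""
    for prefix, sensor_list in SENSOR_LIST_BY_PREFIX.items():
        if serial_no.startswith(prefix):
            return sensor_list
    return None
```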
Stefan Allius
1f0ac97368 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-02-24 21:38:18 +01:00
renovate[bot]
5faf242d6c Update dependency aiohttp to v3.11.13 (#290)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-02-24 21:36:42 +01:00
Stefan Allius
ec3af69e62 S allius/issue288 (#289)
* remove apostrophes from fmt strings

- thanks to @onkelenno for the suggestion

* improve the logger initializing

- don't overwrite the logging.ini settings if the
env variable LOG_LVL isn't well defined
- Thanks to @onkelenno for the idea to improve

* set default argument for LOG_LVL to INFO in docker files

* adapt unit test
2025-02-23 14:17:57 +01:00
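The logger change in #289 only overrides the logging.ini level when LOG_LVL holds a valid level name; a minimal sketch of that guard (illustrative, not the exact code):

```python
import logging
import os

def apply_log_lvl():
    """Override the configured log level only if LOG_LVL is well defined."""
    lvl = os.getenv('LOG_LVL', 'INFO').upper()
    if lvl in ('DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL'):
        logging.getLogger().setLevel(lvl)
    # otherwise keep the settings from logging.ini untouched
```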
Stefan Allius
113a41ebfe Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-02-23 11:39:10 +01:00
renovate[bot]
13e6adc5c0 Update ghcr.io/hassio-addons/base Docker tag to v17.2.1 (#286)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-02-23 11:22:36 +01:00
Stefan Allius
f9256099c7 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-02-19 23:33:11 +01:00
renovate[bot]
204bc76153 Update SonarSource/sonarqube-scan-action action to v5 (#287)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-02-19 23:12:48 +01:00
Stefan Allius
58c7f51266 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-02-15 00:22:33 +01:00
renovate[bot]
1eaabb97a2 Update dependency aiocron to v2 (#284)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-02-15 00:21:43 +01:00
Stefan Allius
7a6e6f73a5 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-02-14 21:55:57 +01:00
renovate[bot]
39495d3e9e Update ghcr.io/hassio-addons/base Docker tag to v17.1.5 (#283)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-02-14 21:48:04 +01:00
Stefan Allius
a257f09d4c add ghcr logout for the clean target 2025-02-11 20:45:54 +01:00
Stefan Allius
5f0a35d55b Update AddOn base docker image to version 17.1.3 and python3 to 3.12.9-r0 2025-02-11 20:45:01 +01:00
Stefan Allius
4df36e2672 revert AddOn base docker image to version 17.1.0 2025-02-11 20:20:48 +01:00
Stefan Allius
48a9696df2 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-02-11 20:15:45 +01:00
renovate[bot]
24567eaf5f Update ghcr.io/hassio-addons/base Docker tag to v17.1.3 (#279)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-02-11 20:12:49 +01:00
Stefan Allius
42fe33bacf add initial DCU support 2025-02-11 00:08:57 +01:00
Stefan Allius
cfdd65606d Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-02-10 20:30:42 +01:00
renovate[bot]
2e3ed8f162 Update python Docker tag to v3.13.2 (#277)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-02-10 20:28:03 +01:00
renovate[bot]
66a875c291 Update dependency aiohttp to v3.11.12 (#276)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-02-10 20:25:07 +01:00
renovate[bot]
46043e7754 Update ghcr.io/hassio-addons/base Docker tag to v17.1.2 (#278)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-02-10 20:21:37 +01:00
Stefan Allius
01ad8eff6b Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2025-01-16 19:37:22 +01:00
Stefan Allius
53c76e72a2 Dev 0.12 (#275)
* bump version to 0.12.1

* add initial version for release candidates

* add rc version

* version 0.12.1

* addon: bump base image version to v17.1.0

* 270 ha addon add syntax check to config parameters (#274)

* fixed requirement status of client mode host

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

---------

Co-authored-by: metzi <147942647+mime24@users.noreply.github.com>
Co-authored-by: Michael Metz <michael.metz@siemens.com>
2025-01-16 19:31:30 +01:00
renovate[bot]
24b092b69e Update ghcr.io/hassio-addons/base Docker tag to v17.1.0 (#273)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-01-14 17:46:05 +01:00
metzi
cf1563dd55 270 ha addon add syntax check to config parameters (#271)
* quotation marks removed from monitor_sn

* validation for serial, ports and client_mode_host

* removed TODO:

* allow only serials with 16 digits starting with R17, Y17, Y47

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>
2025-01-13 19:49:12 +01:00
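An illustrative validation sketch for the checks mentioned in #271; the exact rules beyond "16 characters, prefix R17/Y17/Y47" are assumptions:

```python
import re

# Assumed interpretation: 16 characters total, a known prefix followed by digits.
SERIAL_RE = re.compile(r"^(R17|Y17|Y47)\d{13}$")

def valid_serial(serial_no: str) -> bool:
    """Check an inverter serial number against the assumed format."""
    return bool(SERIAL_RE.fullmatch(serial_no))

def valid_port(port) -> bool:
    """Accept only integer TCP port numbers in the valid range."""
    return isinstance(port, int) and 0 < port < 65536
```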
renovate[bot]
962f6ee5fb Update ghcr.io/hassio-addons/base Docker tag to v17.0.2 (#268)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-01-04 21:12:39 +01:00
Stefan Allius
9e60ad4bcd Dev 0.12 (#266)
* add ha_addons repository to vscode workspace

* Issue220 ha addon documentation update (#232)

* initial DOCS.md for Addon

* links to Mosquitto and Adguard

* replaced _ by . for PV-Strings

* mentioned add-on installation method in README.md

* fix most of the markdown linter warnings

* add missing alt texts

* added nice add repository to my Home Assistant badges

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>

* S allius/issue216 (#235)


* improve docker run

- establish multistage Dockerfile
- build a python wheel for all needed packages
- remove unneeded tools like apk for runtime

* pin versions, fix hadolint warnings

* merge from dev-0.12

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* Issue220 ha addon documentation update (#245)

* revised config disclaimer

* add newline at end of file to fix linter warning

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* 238 ha addon repository check (#244)

* move Makefile and bake file into parent folder

* build config.yaml from template

* use Makefile instead of build shell script

* ignore temporary or created files

* add rules for building the add-on repository

* add rel version of add-on

* add jinja2-cli

* ignore inverter replays which are older than 1 day (#246)

* S allius/issue7 (#248)

* report alarm and fault bitfield to ha

* define the alarm and fault names

* configure log path and max number of daily log files (#243)

* configure log path and max number of daily log files

* don't use a subfolder for configs

* use make instead of a build script

* mount /homeassistant/tsun-proxy

* Add venv to base image

* give write access to mounted folder

* initial check-in, ignore SC1091

* set advanced and stage value in config.yaml

* fix typo

* added watchdog and removed Port 8127 from mapping

* fixed typo and use new add-on repo

- change the install button to install from
 https://github.com/s-allius/ha-addons

* add addon-rel target

* disable watchdog due to exceptions in the ha supervisor

* update changelog

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* Update README.md (#251)

install `https://github.com/s-allius/ha-addons` as repo for our add-on

* add german language file (#253)

* fix return type get_extra_info in FakeWriter

* move global startup code into main method

* pin version of base image

* avoid forwarding to a private (local) IP addr (#256)

* avoid forwarding to a private (local) IP addr

* test DNS resolver issues

* increase test coverage

* update changelog

* fix client_mode configuration block (#252)

* fix client_mode block

* add client mode

* fix tests with client_mode values

* log client_mode configuration

* add forward flag for client_mode

* improve startup logging

* added client_mode example

* adjusted translation files

* AT commands added

* typo

* missing "PLUS"

* link to config details

* improve log msg for config problems

* improve log msg on config errors

* improve log msg for config problems

* copy CHANGELOG.md into add-on repo

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* rename "ConfigErr" to match naming convention

* disable test coverage for __main__

* update changelog version 0.12

* Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy

* copy the run.sh scripts into the add-on repos

* set image path using jinja template

* fix wiki paths

---------

Co-authored-by: metzi <147942647+mime24@users.noreply.github.com>
Co-authored-by: Michael Metz <michael.metz@siemens.com>
2024-12-24 14:20:12 +01:00
Stefan Allius
f5d760e2f0 Change wiki paths 2024-12-24 14:14:56 +01:00
Stefan Allius
3234e87b55 S allius/issue180 (#265)
* move default_config.toml into src/cnf/.

* improve file handling

* remove obsolete rules
2024-12-24 00:13:32 +01:00
Stefan Allius
412013f626 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2024-12-23 19:17:40 +01:00
renovate[bot]
1781dba065 Update dependency aiohttp to v3.11.11 (#264)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-12-23 19:17:03 +01:00
Stefan Allius
1b3833989e Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.13 2024-12-23 14:03:30 +01:00
Stefan Allius
26ca006853 Dev 0.12 (#262) 2024-12-23 14:01:27 +01:00
Stefan Allius
1e160f3b0f set version 0.13 2024-12-23 00:10:57 +01:00
Stefan Allius
338b86964d Dev 0.12 (#260)
- fix build add-on version for releases
2024-12-23 00:02:40 +01:00
Stefan Allius
35952654db Dev 0.12 (#259)
* add ha_addons repository to vscode workspace

* Issue220 ha addon documentation update (#232)

* initial DOCS.md for Addon

* links to Mosquitto and Adguard

* replaced _ by . for PV-Strings

* mentioned add-on installation method in README.md

* fix most of the markdown linter warnings

* add missing alt texts

* added nice add repository to my Home Assistant badges

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>

* S allius/issue216 (#235)


* improve docker run

- establish multistage Dockerfile
- build a python wheel for all needed packages
- remove unneeded tools like apk for runtime

* pin versions, fix hadolint warnings

* merge from dev-0.12

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* Issue220 ha addon documentation update (#245)

* revised config disclaimer

* add newline at end of file to fix linter warning

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* 238 ha addon repository check (#244)

* move Makefile and bake file into parent folder

* build config.yaml from template

* use Makefile instead of build shell script

* ignore temporary or created files

* add rules for building the add-on repository

* add rel version of add-on

* add jinja2-cli

* ignore inverter replays which are older than 1 day (#246)

* S allius/issue7 (#248)

* report alarm and fault bitfield to ha

* define the alarm and fault names

* configure log path and max number of daily log files (#243)

* configure log path and max number of daily log files

* don't use a subfolder for configs

* use make instead of a build script

* mount /homeassistant/tsun-proxy

* Add venv to base image

* give write access to mounted folder

* initial check-in, ignore SC1091

* set advanced and stage value in config.yaml

* fix typo

* added watchdog and removed Port 8127 from mapping

* fixed typo and use new add-on repo

- change the install button to install from
 https://github.com/s-allius/ha-addons

* add addon-rel target

* disable watchdog due to exceptions in the ha supervisor

* update changelog

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* Update README.md (#251)

install `https://github.com/s-allius/ha-addons` as repo for our add-on

* add german language file (#253)

* fix return type get_extra_info in FakeWriter

* move global startup code into main method

* pin version of base image

* avoid forwarding to a private (local) IP addr (#256)

* avoid forwarding to a private (local) IP addr

* test DNS resolver issues

* increase test coverage

* update changelog

* fix client_mode configuration block (#252)

* fix client_mode block

* add client mode

* fix tests with client_mode values

* log client_mode configuration

* add forward flag for client_mode

* improve startup logging

* added client_mode example

* adjusted translation files

* AT commands added

* typo

* missing "PLUS"

* link to config details

* improve log msg for config problems

* improve log msg on config errors

* improve log msg for config problems

* copy CHANGELOG.md into add-on repo

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* rename "ConfigErr" to match naming convention

* disable test coverage for __main__

* update changelog version 0.12

---------

Co-authored-by: metzi <147942647+mime24@users.noreply.github.com>
Co-authored-by: Michael Metz <michael.metz@siemens.com>
2024-12-22 22:46:37 +01:00
Stefan Allius
55c403a754 Dev 0.12 (#258)
* add ha_addons repository to vscode workspace

* Issue220 ha addon documentation update (#232)

* initial DOCS.md for Addon

* links to Mosquitto and Adguard

* replaced _ by . for PV-Strings

* mentioned add-on installation method in README.md

* fix most of the markdown linter warnings

* add missing alt texts

* added nice add repository to my Home Assistant badges

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>

* S allius/issue216 (#235)


* improve docker run

- establish multistage Dockerfile
- build a python wheel for all needed packages
- remove unneeded tools like apk for runtime

* pin versions, fix hadolint warnings

* merge from dev-0.12

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* Issue220 ha addon documentation update (#245)

* revised config disclaimer

* add newline at end of file to fix linter warning

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* 238 ha addon repository check (#244)

* move Makefile and bake file into parent folder

* build config.yaml from template

* use Makefile instead of build shell script

* ignore temporary or created files

* add rules for building the add-on repository

* add rel version of add-on

* add jinja2-cli

* ignore inverter replays which are older than 1 day (#246)

* S allius/issue7 (#248)

* report alarm and fault bitfield to ha

* define the alarm and fault names

* configure log path and max number of daily log files (#243)

* configure log path and max number of daily log files

* don't use a subfolder for configs

* use make instead of a build script

* mount /homeassistant/tsun-proxy

* Add venv to base image

* give write access to mounted folder

* initial check-in, ignore SC1091

* set advanced and stage value in config.yaml

* fix typo

* added watchdog and removed Port 8127 from mapping

* fixed typo and use new add-on repo

- change the install button to install from
 https://github.com/s-allius/ha-addons

* add addon-rel target

* disable watchdog due to exceptions in the ha supervisor

* update changelog

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* Update README.md (#251)

install `https://github.com/s-allius/ha-addons` as repo for our add-on

* add german language file (#253)

* fix return type get_extra_info in FakeWriter

* move global startup code into main method

* pin version of base image

* avoid forwarding to a private (local) IP addr (#256)

* avoid forwarding to a private (local) IP addr

* test DNS resolver issues

* increase test coverage

* update changelog

* fix client_mode configuration block (#252)

* fix client_mode block

* add client mode

* fix tests with client_mode values

* log client_mode configuration

* add forward flag for client_mode

* improve startup logging

* added client_mode example

* adjusted translation files

* AT commands added

* typo

* missing "PLUS"

* link to config details

* improve log msg for config problems

* improve log msg on config errors

* improve log msg for config problems

* copy CHANGELOG.md into add-on repo

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* rename "ConfigErr" to match naming convention

* disable test coverage for __main__

* update changelog version 0.12

---------

Co-authored-by: metzi <147942647+mime24@users.noreply.github.com>
Co-authored-by: Michael Metz <michael.metz@siemens.com>
2024-12-22 22:35:12 +01:00
Stefan Allius
3bf245300d Dev 0.12 (#257)
* add ha_addons repository to vscode workspace

* Issue220 ha addon documentation update (#232)

* initial DOCS.md for Addon

* links to Mosquitto and Adguard

* replaced _ by . for PV-Strings

* mentioned add-on installation method in README.md

* fix most of the markdown linter warnings

* add missing alt texts

* added nice add repository to my Home Assistant badges

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>

* S allius/issue216 (#235)


* improve docker run

- establish multistage Dockerfile
- build a python wheel for all needed packages
- remove unneeded tools like apk for runtime

* pin versions, fix hadolint warnings

* merge from dev-0.12

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* Issue220 ha addon documentation update (#245)

* revised config disclaimer

* add newline at end of file to fix linter warning

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* 238 ha addon repository check (#244)

* move Makefile and bake file into parent folder

* build config.yaml from template

* use Makefile instead of build shell script

* ignore temporary or created files

* add rules for building the add-on repository

* add rel version of add-on

* add jinja2-cli

* ignore inverter replays which are older than 1 day (#246)

* S allius/issue7 (#248)

* report alarm and fault bitfield to ha

* define the alarm and fault names

* configure log path and max number of daily log files (#243)

* configure log path and max number of daily log files

* don't use a subfolder for configs

* use make instead of a build script

* mount /homeassistant/tsun-proxy

* Add venv to base image

* give write access to mounted folder

* initial check-in, ignore SC1091

* set advanced and stage value in config.yaml

* fix typo

* added watchdog and removed Port 8127 from mapping

* fixed typo and use new add-on repo

- change the install button to install from
 https://github.com/s-allius/ha-addons

* add addon-rel target

* disable watchdog due to exceptions in the ha supervisor

* update changelog

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* Update README.md (#251)

install `https://github.com/s-allius/ha-addons` as repo for our add-on

* add german language file (#253)

* fix return type get_extra_info in FakeWriter

* move global startup code into main method

* pin version of base image

* avoid forwarding to a private (local) IP addr (#256)

* avoid forwarding to a private (local) IP addr

* test DNS resolver issues

* increase test coverage

* update changelog

* fix client_mode configuration block (#252)

* fix client_mode block

* add client mode

* fix tests with client_mode values

* log client_mode configuration

* add forward flag for client_mode

* improve startup logging

* added client_mode example

* adjusted translation files

* AT commands added

* typo

* missing "PLUS"

* link to config details

* improve log msg for config problems

* improve log msg on config errors

* improve log msg for config problems

* copy CHANGELOG.md into add-on repo

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>

* rename "ConfigErr" to match naming convention

* disable test coverage for __main__

* update changelog version 0.12

---------

Co-authored-by: metzi <147942647+mime24@users.noreply.github.com>
Co-authored-by: Michael Metz <michael.metz@siemens.com>
2024-12-22 22:25:50 +01:00
Stefan Allius
badc065b7a Merge pull request #242 from s-allius/ha-repro
move ha repo file into root dir
2024-12-10 19:09:33 +01:00
Stefan Allius
aea6cc9763 move file into root dir 2024-12-10 19:06:29 +01:00
Stefan Allius
92d1e648ae Merge pull request #241 from s-allius/renovate/python-3.x
Update python Docker tag
2024-12-09 21:58:41 +01:00
renovate[bot]
879b6608b3 Update python Docker tag 2024-12-09 20:56:38 +00:00
Stefan Allius
b69e7e2242 Merge pull request #240 from s-allius/renovate/aiohttp-3.x
Update dependency aiohttp to v3.11.10
2024-12-09 21:54:28 +01:00
renovate[bot]
0913fde126 Update dependency aiohttp to v3.11.10 2024-12-09 20:50:01 +00:00
Stefan Allius
bedbe08eeb Merge pull request #237 from s-allius/dev-0.12
Dev 0.12
2024-12-08 18:59:50 +01:00
Stefan Allius
3c81d446dd update changelog 2024-12-08 18:57:40 +01:00
Stefan Allius
b335881500 S allius/issue217 2 (#230)
* add some reader classes to get the configuration

* adapt unittests

* get config from json or toml file

* loop over all config readers to get the configuration

* rename config test files

* use relative paths for coverage test in vscode

* do not throw an error for missing config files

* remove obsolete tests

* use dotted key notation for pv sub dictionary

* log config reading progress

* remove create_config_toml.py

* remove obsolete tests for the ha_addon

* disable mosquitto tests if the server is down

* ignore main method for test coverage

* increase test coverage

* pytest-cov: use relative_files only on github, so coverage will work with vscode locally

* remove unneeded imports

* add missing test cases

* disable branch coverage because it's not reachable
2024-12-08 13:25:04 +01:00
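A rough sketch of the reader loop described in #230: each reader may return a dict or None, missing files are not an error, and TOML dotted keys such as `pv1.type` map to nested sub-dictionaries (file and function names are assumptions, not the proxy's code):

```python
import json
import tomllib  # Python 3.11+

def read_toml(path="config.toml"):
    try:
        with open(path, "rb") as f:
            return tomllib.load(f)   # dotted keys become nested dicts
    except FileNotFoundError:
        return None                  # missing config files are not an error

def read_json(path="options.json"):
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return None

def load_config():
    """Loop over all config readers and merge their results."""
    config = {}
    for reader in (read_toml, read_json):
        data = reader()
        if data:
            config.update(data)
    return config
```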
Stefan Allius
ac7b02bde9 init act_config, def_config even without init() call 2024-12-03 22:49:38 +01:00
Stefan Allius
47a89c269f fix some flake8 warnings 2024-12-03 22:48:52 +01:00
Stefan Allius
be3b4d6df0 S allius/issue206 (#213)
* update changelog

* add addon-dev target

* initial version

* use prebuilt docker image

* initial version for multi arch images

* fix missing label latest

* create log and config folder first.

* clean up and translate to english

* set labels with docker bake

* add addon-debug and addon-dev targets

* pass version number to proxy at runtime

* add two more callbacks

* get addon version from app

* deploy rc addon container to ghcr

* move ha_addon test into subdir

* fix crash on container restart

- mkdir -p returns no error even if the directory
  exists

* preparation for unit testing

- move script into a method

* added further config to schema

* typo fixed

* added monitor_sn + PV-strings 3-6 to create toml

* added options.json for testing

* prepare pytest and coverage for addons

* fix missing values in resulting config.toml
- define mqtt default values
- convert filter configuration

* first running unittest for addons

* add ha_addons

* increase test coverage

* test empty options.json file for HA AddOn

* fix pytest call in terminal

* improve test coverage

* remove unneeded options.json

* move config.py into subdir cnf

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>
2024-12-03 22:22:50 +01:00
Stefan Allius
a5b2b4b7c2 S allius/issue217 (#229)
* move config.py into a sub directory cnf

* adapt unit test

* split config class

- use dependency injection to get config

* increase test coverage
2024-12-03 22:02:23 +01:00
Stefan Allius
668c631018 S allius/issue222 (#223)
* github action: use ubuntu 24.04 and sonar-scanner-action 4
2024-12-02 23:41:58 +01:00
Stefan Allius
07c989a305 increase mqtt timeout to 10s 2024-12-02 23:11:30 +01:00
Stefan Allius
28cf875533 migrate paho.mqtt CallbackAPIVersion to VERSION2 (#225) 2024-12-02 22:49:56 +01:00
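For context, the VERSION2 migration in paho-mqtt 2.x changes the client constructor and the callback signatures; a minimal illustration (not the proxy's code; topic name assumed):

```python
import paho.mqtt.client as mqtt

# paho-mqtt 2.x requires an explicit callback API version.
client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)

# VERSION2 callbacks receive a ReasonCode and a Properties object.
def on_connect(client, userdata, flags, reason_code, properties):
    if not reason_code.is_failure:
        client.subscribe("tsun/#")  # assumed topic, for illustration

client.on_connect = on_connect
```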
Stefan Allius
9bae905c08 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.12 2024-11-29 21:38:23 +01:00
metzi
45b57109a8 Add on (#212)
* added service to transfer Add-on config from options.json to config.toml

* added feature to get MQTT config from Homeassistant

The current version is an MVP: it can run as a Home Assistant Add-on, and config.toml is created automatically from the option parameters in the add-on configuration tab.

* fix pylance and flake8 warnings

* prepare building a ha addon

- move build script into root dir
- cp source files in addon build-tree

* ignore proxy source files in addon build tree

* move proxy source files in own directory

* remove duplicated source files from repo

* check for a valid SONAR_TOKEN

* rename add_on path

* prepare for unittests and coverage measurement

* move file cause of the changes pathname

* move the proxy dir to /home/proxy

* build addon with make now

* remove duplicated requirements.txt file from repo

* undo changes

---------

Co-authored-by: Michael Metz <michael.metz@siemens.com>
Co-authored-by: Stefan Allius <stefan.allius@t-online.de>
2024-11-29 21:02:19 +01:00
Stefan Allius
2c69044bf8 initial test version 2024-11-24 22:26:55 +01:00
Stefan Allius
3bada76516 S allius/pytest (#211)
* - fix pytest setup so that it can be started from the root dir
  - support python venv environment
  - add pytest.ini
  - move common settings from .vscode/settings.json into pytest.ini
  - add missing requirements
  - fix import paths for pytests

* - support python venv environment

* initial version

* - add missing requirements python-dotenv

* fix import paths for pytests

* fix pytest warnings

* initial version

* report 5 slowest test durations

* add more vscode settings for python
2024-11-24 22:07:43 +01:00
Stefan Allius
84231c034c specify more offsets of the 0x4110 message 2024-11-23 16:31:44 +01:00
Stefan Allius
d4fd396dcf Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.12 2024-11-20 22:09:53 +01:00
dependabot[bot]
976eaed9ea Bump aiohttp from 3.10.5 to 3.10.11 in /app (#209)
* Bump aiohttp from 3.10.5 to 3.10.11 in /app

Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.10.5 to 3.10.11.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.10.5...v3.10.11)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

* bump sonarcloud-github-action to v3.1.0

* prepare version 0.11.1

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Stefan Allius
2024-11-20 21:08:47 +01:00
Stefan Allius
211a958080 add PROD_COMPL_TYPE to trace 2024-11-20 20:08:20 +01:00
Stefan Allius
5ced5ff06a S allius/issue205 (#207)
* Add SolarmanEmu class

* Forward a device ind to establish the EMU connection

* Move SolarmanEmu class into a dedicated file

* Add cloud connection counter

* Send inverter data in emulator mode

* Improve emulator mode

- parse more values from MQTT register
- differentiate between inverter and logger serial no

* Add some unit tests for SolarmanEmu class

* Send seconds since last sync in data packets

* Increase test coverage
2024-11-13 22:03:28 +01:00
Stefan Allius
78a35b5513 report alarm and fault bitfield to ha (#204)
* report alarm and fault bitfield to home assistant

* initial version of message builder for SolarmanV5

- for SolarmanV5 we build the param field for the
  device and inverter message from the internal
  database
- added param description to the info table
  for constant values, which are not parsed and
  stored in internal database

* define constants for often used format strings

* update changelog
2024-11-02 15:09:10 +01:00
Stefan Allius
9b22fe354c clear remote ptr on disconnect only for client ifcs 2024-10-26 17:30:00 +02:00
Stefan Allius
a6ad3d4f0d fix linter warnings 2024-10-25 23:49:35 +02:00
Stefan Allius
4993676614 remove all eval() calls 2024-10-25 23:41:25 +02:00
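Removing eval() typically means resolving the previously evaluated string explicitly; a generic sketch of one common replacement pattern (not the proxy's actual code):

```python
class MsgHandler:
    def msg_unknown(self, data: bytes):
        """Fallback for message types without a dedicated handler."""

    def dispatch(self, msg_type: int, data: bytes):
        # before (unsafe): handler = eval(f"self.msg_{msg_type:04x}")
        # after: resolve the handler name explicitly, with a safe fallback
        handler = getattr(self, f"msg_{msg_type:04x}", self.msg_unknown)
        return handler(data)
```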
Stefan Allius
10a18237c7 replace some eval calls 2024-10-25 21:38:36 +02:00
Stefan Allius
8d67f1745d update SonarSource/sonarcloud-github-action 2024-10-25 20:36:53 +02:00
Stefan Allius
9eb7c7fbe0 increase test coverage 2024-10-19 01:23:16 +02:00
Stefan Allius
6c6109d421 update class diagrams 2024-10-18 23:49:23 +02:00
Stefan Allius
7d0ea41728 reduce code duplications 2024-10-17 23:20:13 +02:00
Stefan Allius
ce5bd6eb0a reduce code duplications 2024-10-17 21:51:26 +02:00
Stefan Allius
6122f40718 fix recv_resp method call 2024-10-16 23:25:18 +02:00
Stefan Allius
c5f184a730 improve setting the node_id in the modbus 2024-10-16 23:20:23 +02:00
Stefan Allius
6da5d2cef6 define __slots__ for class ByteFifo (#202)
* define __slots__ for class ByteFifo

* disable set-timezone action

* set set-timezone to UTC

* try MathRobin/timezone-action@v1.1

* set TZ to "Europe/Berlin"

* define __slots__
2024-10-15 22:16:22 +02:00
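For illustration, defining `__slots__` removes the per-instance `__dict__` and reduces the memory footprint of small, frequently created objects; a sketch of the idea (the real ByteFifo implementation may differ):

```python
class ByteFifo:
    """Simple byte FIFO; __slots__ avoids a per-instance __dict__."""
    __slots__ = ('_buf',)

    def __init__(self):
        self._buf = bytearray()

    def __iadd__(self, data: bytes):
        self._buf += data
        return self

    def get(self, size: int) -> bytes:
        data = bytes(self._buf[:size])
        del self._buf[:size]
        return data
```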
Stefan Allius
db06d8c8e6 define __slots__ 2024-10-15 22:11:19 +02:00
Stefan Allius
3863454a84 set TZ to "Europe/Berlin" 2024-10-15 21:59:32 +02:00
Stefan Allius
5775cb1ce3 try MathRobin/timezone-action@v1.1 2024-10-15 21:53:11 +02:00
Stefan Allius
5d61a261b1 set set-timezone to UTC 2024-10-15 21:37:01 +02:00
Stefan Allius
bbda66e455 disable set-timezone action 2024-10-15 21:28:57 +02:00
Stefan Allius
0c7bf7956d define __slots__ for class ByteFifo 2024-10-15 21:25:09 +02:00
Stefan Allius
6b9c13ddfe Merge branch 'dev-0.11' of https://github.com/s-allius/tsun-gen3-proxy into main 2024-10-15 20:30:04 +02:00
Stefan Allius
a6ffcc0949 update version 0.11 2024-10-13 18:24:00 +02:00
Stefan Allius
c956c13d13 Dev 0.11 (#200)
* Code Cleanup (#158)


* print coverage report

* create sonar-project property file

* install all py dependencies in one step

* code cleanup

* reduce cognitive complexity

* do not build on *.yml changes

* optimise versionstring handling (#159)

- Reading the version string from the image updates
  it even if the image is re-pulled without re-deployment

* fix linter warning

* exclude *.pyi files

* ignore some rules for tests

* cleanup (#160)

* Sonar qube 3 (#163)

fix SonarQube warnings in modbus.py

* Sonar qube 3 (#164)


* fix SonarQube warnings

* Sonar qube 3 (#165)

* cleanup

* Add support for TSUN Titan inverter
Fixes #161


* fix SonarQube warnings

* fix error

* rename field "config"

* SonarQube reads flake8 output

* don't stop on flake8 errors

* flake8 scan only app/src for SonarQube

* update flake8 run

* ignore flake8 C901

* cleanup

* fix linter warnings

* ignore changed *.yml files

* read sensor list solarman data packets

* catch 'No route to' error and log only in debug mode

* fix unit tests

* add sensor_list configuration

* adapt unit tests

* fix SonarQube warnings

* Sonar qube 3 (#166)

* add unittests for mqtt.py

* add mock

* move test requirements into a file

* fix unit tests

* fix formatting

* initial version

* fix SonarQube warning

* Sonar qube 4 (#169)

* add unit test for inverter.py

* fix SonarQube warning

* Sonar qube 5 (#170)

* fix SonarLint warnings

* use random IP addresses for unit tests

* Docker: The description is missing (#171)

Fixes #167

* S allius/issue167 (#172)

* cleanup

* Sonar qube 6 (#174)

* test class ModbusConn

* Sonar qube 3 (#178)

* add more unit tests

* GEN3: don't crash on overwritten msg in the receive buffer

* improve test coverage and reduce test delays

* reduce cognitive complexity

* fix merge

* fix merge conflict

* fix merge conflict

* S allius/issue182 (#183)

* GEN3: After inverter firmware update the 'Unknown Msg Type' increases continuously
Fixes #182

* add support for Controller serial no and MAC

* test hardening

* GEN3: add support for new messages of version 3 firmwares

* bump libraries to latest versions

- bump aiomqtt to version 2.3.0
- bump aiohttp to version 3.10.5

* improve test coverage

* reduce cognitive complexity

* fix target preview

* remove duplicated fixtures

* increase test coverage

* Update README.md (#185)

update badges

* S allius/issue186 (#187)

* Parse more values in Server Mode
Fixes #186

* read OUTPUT_COEFFICIENT and MAC_ADDR in SrvMode

* fix unit test

* increase test coverage

* S allius/issue186 (#188)

* increase test coverage

* update changelog

* add documentation

* change default config

* Update README.md (#189)

Config file is now foldable

* GEN3: Invalid Contact Info Msg (#192)

Fixes #191

* Refactoring async stream (#194)

* GEN3: Invalid Contact Info Msg
Fixes #191

* introduce ifc with FIFOs

* add object factory

* use AsyncIfc class with FIFO

* declare more methods as classmethods

* - refactoring

- remove _forward_buffer
- make async_write private

* remove _forward_buffer

* refactoring

* avoid mqtt handling for invalid serial numbers

* add two more callbacks

* FIX update_header_cb handling

* split AsyncStream in two classes

* split ConnectionG3(P) in server and client class

* update class diagram

* refactor server creation

* remove duplicated imports

* reduce code duplication

* move StreamPtr instances into Inverter class

* resolution of connection classes

- remove ConnectionG3Client
- remove ConnectionG3Server
- remove ConnectionG3PClient
- remove ConnectionG3PServer

* fix server connections

* fix client loop closing

* don't overwrite self.remote in constructor

* update class diagram

* fixes

- fixes null pointer accesses
- initialize AsyncStreamClient with proper
  StreamPtr instance

* add close callback

* refactor close handling

* remove connection classes

* move more code into InverterBase class

* remove test_inverter_base.py

* add abstract inverter interface class

* initial commit

* fix sonar qube warnings

* rename class Inverter into Proxy

* fix typo

* move class InverterIfc into a separate file

* add more testcases

* use ProtocolIfc class

* add unit tests for AsyncStream class

* increase test coverage

* reduce cognitive complexity

* increase test coverage

* increase test coverage

* simplify heartbeat handler

* remove obsolete tx_get method

* add more unittests

* update changelog

* remove __del__ method for proper gc runs

* check releasing of ModbusConn instances

* call garbage collector to release unreachable objs

* decrease ref counter after the with block

* S allius/issue196 (#198)

* fix healthcheck

- on infrastructure with IPv6 support localhost
  might be resolved to an IPv6 address. Since the
  proxy only supports IPv4 for now, we replace
  localhost with 127.0.0.1 to fix this

* merge from main
2024-10-13 18:12:10 +02:00
Stefan Allius
85fe7261d5 Merge branch 'main' into dev-0.11 2024-10-13 18:07:38 +02:00
Stefan Allius
d4b618742c merge from main 2024-10-13 17:31:55 +02:00
Stefan Allius
719c6f703a S allius/issue196 (#198)
* fix healthcheck

- on infrastructure with IPv6 support localhost
  might be resolved to an IPv6 address. Since the
  proxy only supports IPv4 for now, we replace
  localhost with 127.0.0.1 to fix this
2024-10-13 17:13:07 +02:00
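The fix pins the healthcheck to an IPv4 literal so the name cannot resolve to ::1; a hedged sketch of the idea (the actual healthcheck command and port in the image may differ):

```python
# Illustrative healthcheck: probe the proxy via 127.0.0.1 instead of
# "localhost", which IPv6-enabled resolvers may map to ::1.
import sys
import urllib.request

def main() -> int:
    try:
        with urllib.request.urlopen("http://127.0.0.1:8080/", timeout=5):
            return 0
    except OSError:
        return 1

if __name__ == "__main__":
    sys.exit(main())
```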
Stefan Allius
62ea2a9e6f Refactoring async stream (#194)
* GEN3: Invalid Contact Info Msg
Fixes #191

* introduce ifc with FIFOs

* add object factory

* use AsyncIfc class with FIFO

* declare more methods as classmethods

* - refactoring

- remove _forward_buffer
- make async_write private

* remove _forward_buffer

* refactoring

* avoid mqtt handling for invalid serial numbers

* add two more callbacks

* FIX update_header_cb handling

* split AsyncStream in two classes

* split ConnectionG3(P) in server and client class

* update class diagram

* refactor server creation

* remove duplicated imports

* reduce code duplication

* move StreamPtr instances into Inverter class

* resolution of connection classes

- remove ConnectionG3Client
- remove ConnectionG3Server
- remove ConnectionG3PClient
- remove ConnectionG3PServer

* fix server connections

* fix client loop closing

* don't overwrite self.remote in constructor

* update class diagram

* fixes

- fixes null pointer accesses
- initialize AsyncStreamClient with proper
  StreamPtr instance

* add close callback

* refactor close handling

* remove connection classes

* move more code into InverterBase class

* remove test_inverter_base.py

* add abstract inverter interface class

* initial commit

* fix sonar qube warnings

* rename class Inverter into Proxy

* fix typo

* move class InverterIfc into a separate file

* add more testcases

* use ProtocolIfc class

* add unit tests for AsyncStream class

* increase test coverage

* reduce cognitive complexity

* increase test coverage

* increase test coverage

* simplify heartbeat handler

* remove obsolete tx_get method

* add more unittests

* update changelog

* remove __del__ method for proper gc runs

* check releasing of ModbusConn instances

* call garbage collector to release unreachable objs

* decrease ref counter after the with block
2024-10-13 16:07:01 +02:00
Stefan Allius
166a856705 GEN3: Invalid Contact Info Msg (#192)
Fixes #191
2024-09-19 19:17:22 +02:00
Stefan Allius
bfea38d9da Dev 0.11 (#190)
* Code Cleanup (#158)


* print coverage report

* create sonar-project property file

* install all py dependencies in one step

* code cleanup

* reduce cognitive complexity

* do not build on *.yml changes

* optimise versionstring handling (#159)

- Reading the version string from the image updates
  it even if the image is re-pulled without re-deployment

* fix linter warning

* exclude *.pyi files

* ignore some rules for tests

* cleanup (#160)

* Sonar qube 3 (#163)

fix SonarQube warnings in modbus.py

* Sonar qube 3 (#164)


* fix SonarQube warnings

* Sonar qube 3 (#165)

* cleanup

* Add support for TSUN Titan inverter
Fixes #161


* fix SonarQube warnings

* fix error

* rename field "config"

* SonarQube reads flake8 output

* don't stop on flake8 errors

* flake8 scan only app/src for SonarQube

* update flake8 run

* ignore flake8 C901

* cleanup

* fix linter warnings

* ignore changed *.yml files

* read sensor list solarman data packets

* catch 'No route to' error and log only in debug mode

* fix unit tests

* add sensor_list configuration

* adapt unit tests

* fix SonarQube warnings

* Sonar qube 3 (#166)

* add unittests for mqtt.py

* add mock

* move test requirements into a file

* fix unit tests

* fix formatting

* initial version

* fix SonarQube warning

* Sonar qube 4 (#169)

* add unit test for inverter.py

* fix SonarQube warning

* Sonar qube 5 (#170)

* fix SonarLint warnings

* use random IP addresses for unit tests

* Docker: The description is missing (#171)

Fixes #167

* S allius/issue167 (#172)

* cleanup

* Sonar qube 6 (#174)

* test class ModbusConn

* Sonar qube 3 (#178)

* add more unit tests

* GEN3: don't crash on overwritten msg in the receive buffer

* improve test coverage and reduce test delays

* reduce cognitive complexity

* fix merge

* fix merge conflict

* fix merge conflict

* S allius/issue182 (#183)

* GEN3: After inverter firmware update the 'Unknown Msg Type' increases continuously
Fixes #182

* add support for Controller serial no and MAC

* test hardening

* GEN3: add support for new messages of version 3 firmwares

* bump libraries to latest versions

- bump aiomqtt to version 2.3.0
- bump aiohttp to version 3.10.5

* improve test coverage

* reduce cognitive complexity

* fix target preview

* remove duplicated fixtures

* increase test coverage

* Update README.md (#185)

update badges

* S allius/issue186 (#187)

* Parse more values in Server Mode
Fixes #186

* read OUTPUT_COEFFICIENT and MAC_ADDR in SrvMode

* fix unit test

* increase test coverage

* S allius/issue186 (#188)

* increase test coverage

* update changelog

* add documentation

* change default config

* Update README.md (#189)

Config file is now foldable
2024-09-16 00:45:36 +02:00
Stefan Allius
d5ec47fd1e Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.11 2024-09-16 00:37:39 +02:00
Stefan Allius
828f26cf24 Update README.md (#189)
Config file is now foldable
2024-09-16 00:17:43 +02:00
Stefan Allius
0b3d84ff36 change default config 2024-09-16 00:12:30 +02:00
Stefan Allius
5642c912a8 add documentation 2024-09-15 15:17:45 +02:00
Stefan Allius
614acbf32d update changelog 2024-09-15 01:18:36 +02:00
Stefan Allius
57525ca519 S allius/issue186 (#188)
* increase test coverage
2024-09-15 01:02:49 +02:00
Stefan Allius
5ef68280b1 S allius/issue186 (#187)
* Parse more values in Server Mode
Fixes #186

* read OUTPUT_COEFFICIENT and MAC_ADDR in SrvMode

* fix unit test

* increase test coverage
2024-09-14 19:49:29 +02:00
Stefan Allius
e12c78212f Update README.md (#185)
update badges
2024-09-14 08:40:53 +02:00
Stefan Allius
2ab35a8257 increase test coverage 2024-09-07 18:04:28 +02:00
Stefan Allius
865216b8d9 remove duplicated fixtures 2024-09-07 18:03:50 +02:00
Stefan Allius
5d5d7c218f fix target preview 2024-09-07 13:49:45 +02:00
Stefan Allius
be4c6ac77f S allius/issue182 (#183)
* GEN3: After inverter firmware update the 'Unknown Msg Type' increases continuously
Fixes #182

* add support for Controller serial no and MAC

* test hardening

* GEN3: add support for new messages of version 3 firmwares

* bump libraries to latest versions

- bump aiomqtt to version 2.3.0
- bump aiohttp to version 3.10.5

* improve test coverage

* reduce cognitive complexity
2024-09-07 11:45:16 +02:00
Stefan Allius
a9dc7e6847 Dev 0.11 (#181)
* Sonar qube 6 (#174)

* test class ModbusConn

* Sonar qube 3 (#178)

* add more unit tests

* GEN3: don't crash on overwritten msg in the receive buffer

* improve test coverage and reduce test delays

* reduce cognitive complexity
2024-09-03 18:58:24 +02:00
Stefan Allius
270732f1d0 fix merge conflict 2024-09-03 18:54:49 +02:00
Stefan Allius
7b4fabdc25 fix merge conflict 2024-09-03 18:48:21 +02:00
Stefan Allius
2351ec314a fix merge 2024-09-03 18:42:48 +02:00
Stefan Allius
604d30c711 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.11 2024-09-03 18:39:27 +02:00
Stefan Allius
ab5256659b reduce cognitive complexity 2024-09-03 18:32:44 +02:00
Stefan Allius
a76c0ac440 improve test coverage and reduce test delays 2024-09-03 17:23:09 +02:00
Stefan Allius
215dcd98e6 GEN3: don't crash on overwritten msg in the receive buffer 2024-09-03 17:22:34 +02:00
Stefan Allius
627ca97360 Test modbus_tcp (#179)
* add more unit tests
2024-08-30 20:40:53 +02:00
Stefan Allius
d2b88ab838 Sonar qube 3 (#178)
* add more unit tests
2024-08-29 23:47:30 +02:00
Stefan Allius
6d9addc7d5 Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.11 2024-08-27 21:41:11 +02:00
Stefan Allius
1bb08fb211 Update README.md (#177) 2024-08-27 15:03:57 +02:00
Stefan Allius
193eea65af Update README.md (#176)
add SonarCloude shields
2024-08-27 00:24:11 +02:00
Stefan Allius
2b8dacb0de Dev 0.11 (#175)
* use random IP addresses for unit tests

* Docker: The description is missing (#171)

Fixes #167

* S allius/issue167 (#172)

* cleanup

* Sonar qube 6 (#174)

* test class ModbusConn
2024-08-26 23:49:23 +02:00
Stefan Allius
cb0c69944f Merge branch 'main' of https://github.com/s-allius/tsun-gen3-proxy into dev-0.11 2024-08-26 23:45:48 +02:00
Stefan Allius
7f41365815 Sonar qube 6 (#174)
* test class ModbusConn
2024-08-26 23:37:24 +02:00
Stefan Allius
5db3fbf495 Update README.md (#173) 2024-08-26 21:28:44 +02:00
Stefan Allius
d44726c0f3 S allius/issue167 (#172)
* cleanup
2024-08-25 23:28:35 +02:00
Stefan Allius
1985557bce Docker: The description is missing (#171)
Fixes #167
2024-08-25 23:05:25 +02:00
Stefan Allius
7dc2595d71 use random IP addresses for unit tests 2024-08-25 12:02:27 +02:00
Stefan Allius
6d9a446bfe Sonar qube 5 (#170)
* fix SonarLint warnings
2024-08-24 23:03:02 +02:00
Stefan Allius
f9c1b83ccd Sonar qube 4 (#169)
* add unit test for inverter.py

* fix SonarQube warning
2024-08-24 22:21:55 +02:00
Stefan Allius
58b42f7d7c SonarCloud setup (#168)
* Code Cleanup (#158)

* print coverage report

* create sonar-project property file

* install all py dependencies in one step

* code cleanup

* reduce cognitive complexity

* do not build on *.yml changes

* optimise versionstring handling (#159)

- Reading the version string from the image updates
  it even if the image is re-pulled without re-deployment

* fix linter warning

* exclude *.pyi files

* ignore some rules for tests

* cleanup (#160)

* Sonar qube 3 (#163)

fix SonarQube warnings in modbus.py

* Sonar qube 3 (#164)


* fix SonarQube warnings

* Sonar qube 3 (#165)

* cleanup

* Add support for TSUN Titan inverter
Fixes #161


* fix SonarQube warnings

* fix error

* rename field "config"

* SonarQube reads flake8 output

* don't stop on flake8 errors

* flake8 scan only app/src for SonarQube

* update flake8 run

* ignore flake8 C901

* cleanup

* fix linter warnings

* ignore changed *.yml files

* read sensor list solarman data packets

* catch 'No route to' error and log only in debug mode

* fix unit tests

* add sensor_list configuration

* adapt unit tests

* fix SonarQube warnings

* Sonar qube 3 (#166)

* add unittests for mqtt.py

* add mock

* move test requirements into a file

* fix unit tests

* fix formatting

* initial version

* fix SonarQube warning
2024-08-23 21:24:01 +02:00
Stefan Allius
27045cac6e Sonar qube 3 (#166)
* add unittests for mqtt.py

* add mock

* move test requirements into a file

* fix unit tests

* fix formatting

* initial version

* fix SonarQube warning
2024-08-23 00:26:01 +02:00
Stefan Allius
54de2aecfe Sonar qube 3 (#165)
* cleanup

* Add support for TSUN Titan inverter
Fixes #161


* fix SonarQube warnings

* fix error

* rename field "config"

* SonarQube reads flake8 output

* don't stop on flake8 errors

* flake8 scan only app/src for SonarQube

* update flake8 run

* ignore flake8 C901

* cleanup

* fix linter warnings

* ignore changed *.yml files

* read sensor list solarman data packets

* catch 'No route to' error and log only in debug mode

* fix unit tests

* add sensor_list configuration

* adapt unit tests

* fix SonarQube warnings
2024-08-16 21:07:08 +02:00
Stefan Allius
5a39370cc3 Sonar qube 3 (#164)
* fix SonarQube warnings
2024-08-13 22:22:45 +02:00
Stefan Allius
7a9b23d068 Sonar qube 3 (#163)
fix SonarQube warnings in modbus.py
2024-08-13 21:11:56 +02:00
Stefan Allius
e34afcb523 cleanup (#160) 2024-08-11 23:22:07 +02:00
Stefan Allius
22df381da5 ignore some rules for tests 2024-08-11 00:48:19 +02:00
Stefan Allius
117e6a7570 exclude *.pyi files 2024-08-10 23:55:19 +02:00
Stefan Allius
65de946992 fix linter warning 2024-08-10 23:53:35 +02:00
Stefan Allius
33d385db10 optimise versionstring handling (#159)
- Reading the version string from the image updates
  it even if the image is re-pulled without re-deployment
2024-08-10 22:53:25 +02:00
Stefan Allius
1e610af1df Code Cleanup (#158)
* print coverage report

* create sonar-project property file

* install all py dependencies in one step

* code cleanup

* reduce cognitive complexity

* do not build on *.yml changes
2024-08-10 20:41:31 +02:00
Stefan Allius
db1169f61f Update README.md (#156)
add modbus_polling to example config
2024-08-10 16:49:18 +02:00
Stefan Allius
383be10e87 Hotfix v0.10.1: fix displaying the version string at startup and in HA (#155)
* Version 0.10.0 no longer displays the version string (#154)

Fixes #153
2024-08-10 14:18:25 +02:00
Stefan Allius
b364fb3f8e Dev 0.10 (#151)
* S allius/issue117 (#118)

* add shutdown flag

* add more register definitions

* add start command for client-side connections

* add first support for port 8899

* fix shutdown

* add client_mode configuration

* read client_mode config to setup inverter connections

* add client_mode connections over port 8899

* add preview build

* Update README.md

describe the new client-mode over port 8899 for GEN3PLUS

* MODBUS: the last digit of the inverter version is a hexadecimal number (#121)

* S allius/issue117 (#122)

* add shutdown flag

* add more register definitions

* add start command for client-side connections

* add first support for port 8899

* fix shutdown

* add client_mode configuration

* read client_mode config to setup inverter connections

* add client_mode connections over port 8899

* add preview build

* add documentation for client_mode

* catch OS errors and log them with DEBUG level

* update changelog

* make the maximum output coefficient configurable (#124)

* S allius/issue120 (#126)

* add config option to disable the modbus polling

* read more modbus regs in polling mode

* extend connection timeouts if polling mode is disabled

* update changelog

* S allius/issue125 (#127)

* fix linter warning

* move sequence diagram to wiki

* catch asyncio.CancelledError

* S allius/issue128 (#130)

* set Register.NO_INPUTS fix to 4 for GEN3PLUS

* don't set Register.NO_INPUTS per MODBUS

* fix unit tests

* register OUTPUT_COEFFICIENT at HA

* update changelog

* - Home Assistant: improve inverter status value texts

* - GEN3: add inverter status

* on closing send outstanding MQTT data to the broker

* force MQTT publish on every conn open and close

* reset inverter state on close

- workaround which reset the inverter status to
  offline when the inverter has a very low
  output power on connection close

* improve client mode
- reduce the polling cadence to 30s
- set controller statistics for HA

* client mode set controller IP for HA

* S allius/issue131 (#132)

* Make __publish_outstanding_mqtt public

* update proxy counter

- on client mode connection establishment or
disconnecting, update the connection counter

* Update README.md (#133)

* reset inverter state on close

- workaround which reset the inverter status to
  offline when the inverter has a very low
  output power on connection close

* S allius/issue134 (#135)

* add polling interval and method ha_remove()

* add client_mode arg to constructors

- add PollingInvervall

* hide some topics in client mode

- we hide topics in HA by sending an empty register
MQTT topic during HA auto configuration (see the discovery sketch after this commit entry)

* add client_mode value

* update class diagram

* fix modbus close handler

- fix empty call and cleanup que
- add unit test

* don't send an initial 1710 msg in client mode

* change HA icon for inverter status

* increase test coverage

* accelerate timer tests

* bump aiomqtt and schema to latest release (#137)

* MQTT timestamps and protocol improvements (#140)

* add TS_INPUT, TS_GRID and TS_TOTAL

* prepare MQTT timestamps

- add _set_mqtt_timestamp method
- fix hexdump printing

* push dev and debug images to docker.io

* add unix epoch timestamp for MQTT packets

* set timezone for unit tests

* set name for setting timezone step

* trigger new action

* GEN3 and GEN3PLUS: handle multiple messages

- read: iterate over the receive buffer
- forward: append messages to the forward buffer
- _update_header: iterate over the forward buffer

* GEN3: optimize timeout handling

- longer timeout in states init and received
- go to state pending only from state up

* update changelog

* cleanup

* print coloured logs

* Create sonarcloud.yml (#143)

* Update sonarcloud.yml

* Update sonarcloud.yml

* Update sonarcloud.yml

* Update sonarcloud.yml

* Update sonarcloud.yml

* build multi arch images with sboms (#146)

* don't send MODBUS request when state is not up (#147)

* adapt timings

* don't send MODBUS request when state is not up

* adapt unit test

* make test code more clean (#148)

* Make test code more clean (#149)

* cleanup

* Code coverage for SonarCloud (#150)


* cleanup code and unit tests

* add test coverage for SonarCloud

* configure SonarCloud

* update changelog

* Do not build on *.yml changes

* prepare release 0.10.0

* disable MODBUS_POLLING for GEN3PLUS in example config

* bump aiohttp to version 3.10.2

* code cleanup

* Fetch all history for all tags and branches
2024-08-09 23:16:47 +02:00
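The "hide some topics in client mode" step above leans on Home Assistant's MQTT discovery convention: publishing an empty, retained payload to an entity's discovery config topic removes that entity again. A minimal sketch of the idea, using the aiomqtt client the proxy already depends on and a made-up topic name (the real prefixes come from `ha.discovery_prefix` and the inverter's `node_id`):

```python
import asyncio
import aiomqtt

# Hypothetical discovery topic; the proxy derives the real one from its
# ha.* settings and the register name being hidden.
DISCOVERY_TOPIC = "homeassistant/sensor/inv_1/out_coefficient/config"

async def hide_entity() -> None:
    async with aiomqtt.Client("mqtt") as client:
        # An empty retained payload on the config topic tells Home Assistant
        # to drop the previously auto-configured entity.
        await client.publish(DISCOVERY_TOPIC, payload="", retain=True)

if __name__ == "__main__":
    asyncio.run(hide_entity())
```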
Stefan Allius
a42ba8a8c6 Dev 0.9 (#115)
* make timestamp handling stateless

* adapt tests for stateless timestamp handling

* initial version

* add more type annotations

* add more type annotations

* fix Generator annotation for ha_proxy_confs

* fix names of issue branches

* add more type annotations

* don't use deprecated vars anymore

* don't mark all test as async

* fix imports

* fix solarman unit tests

- fake Mqtt class

* print image build time during proxy start

* update changelog

* fix pytest collect warning

* cleanup msg_get_time handler

* adapt unit test

* label debug images with debug

* dump dropped packages

* fix warnings

* add systemtest with invalid start byte

* update changelog

* update changelog

* add exposed ports and healthcheck

* add wget for healthcheck

* add aiohttp

* use config validation for healthcheck

* add http server for healthcheck

* calculate msg processing time

* add healthy check methods

* fix typo

* log ConfigErr with DEBUG level

* Update async_stream.py

- check if processing time is < 5 sec

* add a close handler to release internal resources

* call modbus close handler on a close call

* add exception handling for forward handler

* update changelog

* isolate Modbus fix

* cleanup

* update changelog

* add healthy handler

* log unreleased references

* add healthcheck

* complete exposed port list

* add wget for healthcheck

* add aiohttp

* use Enum class for State

* calc processing time for healthcheck

* add HTTP server for healthcheck

* cleanup

* Update CHANGELOG.md

* update changelog

* add docstrings to state enum

* set new state State.received

* add healthy method

* log healthcheck infos with DEBUG level

* update changelog

* S allius/issue100 (#101)

* detect dead connections

- disconnect connection on Msg receive timeout
- improve connection trace (add connection id)

* update changelog

* fix merge conflict

* fix unittests

* S allius/issue108 (#109)

* add more data types

* adapt unittests

* improve test coverage

* fix linter warning

* update changelog

* S allius/issue102 (#110)

* hotfix: don't send two MODBUS commands together

* fix unit tests

* remove read loop

* optional sleep between msg read and sending rsp

* wait after read 0.5s before sending a response

* add pending state

* fix state definitions

* determine the connection timeout by the conn state

* avoid sending MODBUS cmds in the inverter's reporting phase

* update changelog

* S allius/issue111 (#112)

Synchronize regular MODBUS commands with the status of the inverter to prevent the inverter from crashing due to unexpected packets (a minimal sketch of this gating follows after this commit entry).

* initial checkin

* remove crontab entry for regular MODBUS cmds

* add timer for regular MODBUS polling

* fix Stop method call for already stopped timer

* optimize MB_START_TIMEOUT value

* cleanup

* update changelog

* fix buildx warnings

* fix timer cleanup

* fix Config.class_init()

- return error string or None
- release Schema structure after building the config

* add quiet flag to docker push

* fix timeout calculation

* rename python to debugpy

* add asyncio log

* cleanup shutdown
- stop webserver on shutdown
- enable asyncio debug mode for debug versions

* update changelog

* update changelog

* fix exception in MODBUS timeout callback

* update changelog
2024-07-01 23:41:56 +02:00
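Several of the MODBUS-related commits above (the issue #111 synchronization, the "don't send MODBUS request when state is not up" fix) boil down to one rule: regular polling is gated on the connection state, so the inverter never sees unexpected packets during its own reporting phase. A minimal sketch of that gating, with invented names that only illustrate the idea rather than the proxy's actual classes:

```python
import asyncio
from enum import Enum

class State(Enum):
    """Connection states as sketched in the commits above (names illustrative)."""
    INIT = 1
    RECEIVED = 2
    UP = 3
    PENDING = 4
    CLOSED = 5

class ModbusPoller:
    """Send regular MODBUS reads only while the connection is in state UP."""

    def __init__(self, interval: float = 30.0) -> None:
        self.state = State.INIT
        self.interval = interval

    async def send_modbus_read(self) -> None:
        # Placeholder for the real request builder and transport.
        print("sending MODBUS read request")

    async def poll_loop(self) -> None:
        while self.state is not State.CLOSED:
            if self.state is State.UP:
                await self.send_modbus_read()
            # In any other state, skip this cycle instead of risking that the
            # inverter crashes on an unexpected packet.
            await asyncio.sleep(self.interval)
```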
108 changed files with 14347 additions and 3720 deletions

3
.cover_ghaction_rc Normal file
View File

@@ -0,0 +1,3 @@
[run]
branch = True
relative_files = True

View File

@@ -1,2 +1,2 @@
[run]
branch = True
branch = True

14
.env_example Normal file
View File

@@ -0,0 +1,14 @@
# example file for the .env file. The .env file sets private values
# which are needed for building containers
# registry for debug and dev containers
PRIVAT_CONTAINER_REGISTRY=docker.io/<user>/
# registry for official container (preview, rc, rel)
PUBLIC_CONTAINER_REGISTRY=ghcr.io/<user>/
PUBLIC_CR_KEY=
# define serial numbers of GEN3PLUS devices for systemtests
# the serial numbers are coded as 4-byte hex strings
SOLARMAN_INV_SNR='00000000'
SOLARMAN_DCU_SNR='00000000'

View File

@@ -18,33 +18,54 @@ on:
- '**.dockerfile' # Do not build on *.dockerfile changes
- '**.sh' # Do not build on *.sh changes
pull_request:
branches: [ "main" ]
branches: [ "main", "dev-*" ]
permissions:
contents: read
pull-requests: read # allows SonarCloud to decorate PRs with analysis results
env:
TZ: "Europe/Berlin"
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
jobs:
build:
runs-on: ubuntu-latest
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.12
with:
fetch-depth: 0 # Shallow clones should be disabled for a better relevancy of analysis
- name: Set up Python 3.13
uses: actions/setup-python@v5
with:
python-version: "3.12"
python-version: "3.13"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install flake8 pytest pytest-asyncio
if [ -f requirements-test.txt ]; then pip install -r requirements-test.txt; fi
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Lint with flake8
run: |
# stop the build if there are Python syntax errors or undefined names
flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
flake8 . --count --select=E9,F63,F7,F82 --ignore=F821 --show-source --statistics
# exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
flake8 --exit-zero --ignore=C901,E121,E123,E126,E133,E226,E241,E242,E704,W503,W504,W505 --format=pylint --output-file=output_flake.txt --exclude=*.pyc app/src/
- name: Test with pytest
run: |
python -m pytest app
python -m pytest app --cov=app/src --cov-config=.cover_ghaction_rc --cov-report=xml
coverage report
- name: Analyze with SonarCloud
if: ${{ env.SONAR_TOKEN != 0 }}
uses: SonarSource/sonarqube-scan-action@v5
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
projectBaseDir: .
args:
-Dsonar.projectKey=s-allius_tsun-gen3-proxy
-Dsonar.python.coverage.reportPaths=coverage.xml
-Dsonar.python.flake8.reportPaths=output_flake.txt
# -Dsonar.docker.hadolint.reportPaths=

4
.gitignore vendored
View File

@@ -1,11 +1,15 @@
__pycache__
.pytest_cache
.venv/**
bin/**
mosquitto/**
homeassistant/**
ha_addons/ha_addon/rootfs/home/proxy/*
ha_addons/ha_addon/rootfs/requirements.txt
tsun_proxy/**
Doku/**
.DS_Store
.coverage
.env
.venv
coverage.xml

2
.hadolint.yaml Normal file
View File

@@ -0,0 +1,2 @@
ignored:
- SC1091

1
.python-version Normal file
View File

@@ -0,0 +1 @@
3.13.2

View File

@@ -0,0 +1,4 @@
{
"sonarCloudOrganization": "s-allius",
"projectKey": "s-allius_tsun-gen3-proxy"
}

27
.vscode/settings.json vendored
View File

@@ -1,15 +1,32 @@
{
"python.analysis.extraPaths": [
"app/src",
"app/tests",
".venv/lib",
],
"python.testing.pytestArgs": [
"-vv",
"app",
"-vvv",
"--cov=app/src",
"--cov-report=xml",
"--cov-report=html",
"app",
"system_tests"
],
"python.testing.unittestEnabled": false,
"python.testing.pytestEnabled": true,
"flake8.args": [
"--extend-exclude=app/tests/*.py system_tests/*.py"
]
"--extend-exclude=app/tests/*.py,system_tests/*.py"
],
"sonarlint.connectedMode.project": {
"connectionId": "s-allius",
"projectKey": "s-allius_tsun-gen3-proxy"
},
"files.exclude": {
"**/*.pyi": true
},
"python.analysis.typeEvaluation.deprecateTypingAliases": true,
"python.autoComplete.extraPaths": [
".venv/lib"
],
"coverage-gutters.coverageBaseDir": "tsun",
"makefile.configureOnOpen": false
}

View File

@@ -5,7 +5,101 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [unreleased]
## [0.13.0] - 2025-04-13
- update dependency python to 3.13
- add initial support for TSUN MS-3000
- add initial apparmor support [#293](https://github.com/s-allius/tsun-gen3-proxy/issues/293)
- add Modbus polling mode for DCU1000 [#292](https://github.com/s-allius/tsun-gen3-proxy/issues/292)
- add Modbus scanning mode
- allow `R47` serial numbers for GEN3 inverters
- add watchdog for Add-ons
- add first customer apparmor definition
- Respect logging.ini file, if LOG_ENV isn't set well [#288](https://github.com/s-allius/tsun-gen3-proxy/issues/288)
- Remove trailing apostrophe in the log output [#288](https://github.com/s-allius/tsun-gen3-proxy/issues/288)
- update AddOn base docker image to version 17.2.1
- addon: add date and time to dev container version
- Update AddOn python3 to 3.12.9-r0
- add initial DCU support
- update aiohttp to version 3.11.12
- fix the path handling for logging.ini and default_config.toml [#180](https://github.com/s-allius/tsun-gen3-proxy/issues/180)
## [0.12.1] - 2025-01-13
- addon: bump base image version to v17.1.0
- addon: add syntax check to config parameters
- addon: bump base image version to v17.0.2
## [0.12.0] - 2024-12-22
- add hadolint configuration
- detect usage of a local DNS resolver [#37](https://github.com/s-allius/tsun-gen3-proxy/issues/37)
- path for logs is now configurable by cli args
- configure the number of kept logfiles by cli args
- add DOCS.md and CHANGELOG.md for add-ons
- pin library versions and update them with renovate
- build config.yaml for add-ons by a jinja2 template
- use gnu make to build proxy and add-on
- make the configuration more flexible, add command line args to control this
- fix the python path so we don't need special import paths for unit tests anymore
- add emulator mode [#205](https://github.com/s-allius/tsun-gen3-proxy/issues/205)
- ignore inverter replays which are older than 1 day [#246](https://github.com/s-allius/tsun-gen3-proxy/issues/246)
- support test coverage in vscode
- upgrade SonarQube action to version 4
- update github action to Ubuntu 24.04
- add initial support for home assistant add-ons from @mime24
- github action: use ubuntu 24.04 and sonar-scanner-action 4 [#222](https://github.com/s-allius/tsun-gen3-proxy/issues/222)
- migrate paho.mqtt CallbackAPIVersion to VERSION2 [#224](https://github.com/s-allius/tsun-gen3-proxy/issues/224)
- add PROD_COMPL_TYPE to trace
- add SolarmanV5 messages builder
- report inverter alarms and faults per MQTT [#7](https://github.com/s-allius/tsun-gen3-proxy/issues/7)
## [0.11.1] - 2024-11-20
- fix pytest setup so that it can be started from the rootdir
- support python venv environment
- add pytest.ini
- move common settings from .vscode/settings.json into pytest.ini
- add missing requirements
- fix import paths for pytests
- Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.10.5 to 3.10.11.
## [0.11.0] - 2024-10-13
- fix healthcheck on infrastructure with IPv6 support [#196](https://github.com/s-allius/tsun-gen3-proxy/issues/196)
- refactoring: cleaner architecture, increase test coverage
- Parse more values in Server Mode [#186](https://github.com/s-allius/tsun-gen3-proxy/issues/186)
- GEN3: add support for new messages of version 3 firmwares [#182](https://github.com/s-allius/tsun-gen3-proxy/issues/182)
- add support for controller MAC and serial number
- GEN3: don't crash on overwritten msg in the receive buffer
- Reading the version string from the image updates it even if the image is re-pulled without re-deployment
## [0.10.1] - 2024-08-10
- fix displaying the version string at startup and in HA [#153](https://github.com/s-allius/tsun-gen3-proxy/issues/153)
## [0.10.0] - 2024-08-09
- bump aiohttp to version 3.10.2
- add SonarQube and code coverage support
- don't send MODBUS request when state is not up; adapt timeouts [#141](https://github.com/s-allius/tsun-gen3-proxy/issues/141)
- build multi arch images with sboms [#144](https://github.com/s-allius/tsun-gen3-proxy/issues/144)
- add timestamp to MQTT topics [#138](https://github.com/s-allius/tsun-gen3-proxy/issues/138)
- improve the message handling, to avoid hangs
- GEN3: allow long timeouts until we received first inverter data (not only device data)
- bump aiomqtt to version 2.2.0
- bump schema to version 0.7.7
- Home Assistant: improve inverter status value texts
- GEN3: add inverter status
- fix flapping registers [#128](https://github.com/s-allius/tsun-gen3-proxy/issues/128)
- register OUTPUT_COEFFICIENT at HA
- GEN3: INVERTER_STATUS,
- add config option to disable the MODBUS polling [#120](https://github.com/s-allius/tsun-gen3-proxy/issues/120)
- make the maximum output coefficient configurable [#123](https://github.com/s-allius/tsun-gen3-proxy/issues/123)
- cleanup shutdown
- add preview build
- MODBUS: the last digit of the inverter version is a hexadecimal number [#119](https://github.com/s-allius/tsun-gen3-proxy/issues/119)
- GEN3PLUS: add client_mode connection on port 8899 [#117](https://github.com/s-allius/tsun-gen3-proxy/issues/117)
## [0.9.0] - 2024-07-01

18
Makefile Normal file
View File

@@ -0,0 +1,18 @@
.PHONY: build clean addon-dev addon-debug addon-rc addon-rel debug dev preview rc rel check-docker-compose install
debug dev preview rc rel:
$(MAKE) -C app $@
clean build:
$(MAKE) -C ha_addons $@
addon-dev addon-debug addon-rc addon-rel:
$(MAKE) -C ha_addons $(patsubst addon-%,%,$@)
check-docker-compose:
docker-compose config -q
install:
python3 -m pip install --upgrade pip
python3 -m pip install -r requirements.txt
python3 -m pip install -r requirements-test.txt

314
README.md
View File

@@ -1,34 +1,44 @@
<h1 align="center">TSUN-Gen3-Proxy</h1>
<p align="center">A proxy for</p>
<h3 align="center">TSUN Gen 3 Micro-Inverters</h3>
<h3 align="center">and Batteries</h3>
<p align="center">for easy</p>
<h3 align="center">MQTT/Home-Assistant</h3>
<p align="center">integration</p>
<p align="center">
<a href="https://opensource.org/licenses/BSD-3-Clause"><img alt="License: BSD-3-Clause" src="https://img.shields.io/badge/License-BSD_3--Clause-green.svg"></a>
<a href="https://www.python.org/downloads/release/python-3120/"><img alt="Supported Python versions" src="https://img.shields.io/badge/python-3.12-blue.svg"></a>
<a href="https://sbtinstruments.github.io/aiomqtt/introduction.html"><img alt="Supported aiomqtt versions" src="https://img.shields.io/badge/aiomqtt-2.0.1-lightblue.svg"></a>
<a href="https://www.python.org/downloads/release/python-3130/"><img alt="Supported Python versions" src="https://img.shields.io/badge/python-3.13-blue.svg"></a>
<a href="https://aiomqtt.bo3hm.com/introduction.html"><img alt="Supported aiomqtt versions" src="https://img.shields.io/badge/aiomqtt-2.3.1-lightblue.svg"></a>
<a href="https://libraries.io/pypi/aiocron"><img alt="Supported aiocron versions" src="https://img.shields.io/badge/aiocron-1.8-lightblue.svg"></a>
<a href="https://toml.io/en/v1.0.0"><img alt="Supported toml versions" src="https://img.shields.io/badge/toml-1.0.0-lightblue.svg"></a>
<br>
<a href="https://sonarcloud.io/component_measures?id=s-allius_tsun-gen3-proxy&metric=alert_status"><img alt="The quality gate status" src="https://sonarcloud.io/api/project_badges/measure?project=s-allius_tsun-gen3-proxy&metric=alert_status"></a>
<a href="https://sonarcloud.io/component_measures?id=s-allius_tsun-gen3-proxy&metric=bugs"><img alt="No of bugs" src="https://sonarcloud.io/api/project_badges/measure?project=s-allius_tsun-gen3-proxy&metric=bugs"></a>
<a href="https://sonarcloud.io/component_measures?id=s-allius_tsun-gen3-proxy&metric=code_smells"><img alt="No of code-smells" src="https://sonarcloud.io/api/project_badges/measure?project=s-allius_tsun-gen3-proxy&metric=code_smells"></a>
<br>
<a href="https://sonarcloud.io/component_measures?id=s-allius_tsun-gen3-proxy&metric=coverage"><img alt="Test coverage in percent" src="https://sonarcloud.io/api/project_badges/measure?project=s-allius_tsun-gen3-proxy&metric=coverage"></a>
</p>
# Overview
This proxy enables a reliable connection between TSUN third generation inverters and an MQTT broker. With the proxy, you can easily retrieve real-time values such as power, current and daily energy and integrate the inverter into typical home automations. This works even without an internet connection. The optional connection to the TSUN Cloud can be disabled!
This proxy enables a reliable connection between TSUN third generation devices and an MQTT broker. With the proxy, you can easily retrieve real-time values such as power, current and daily energy from inverters and energy storage systems and integrate them into typical home automations. This works even without an internet connection. The optional connection to the TSUN Cloud can be disabled!
In detail, the inverter establishes a TCP connection to the TSUN cloud to transmit current measured values every 300 seconds. To be able to forward the measurement data to an MQTT broker, the proxy must be looped into this TCP connection.
In detail, the device establishes a TCP connection to the TSUN cloud to transmit current measured values every 300 seconds. To be able to forward the measurement data to an MQTT broker, the proxy must be looped into this TCP connection.
Through this, the inverter then establishes a connection to the proxy and the proxy establishes another connection to the TSUN Cloud. The transmitted data is interpreted by the proxy and then passed on to both the TSUN Cloud and the MQTT broker. The connection to the TSUN Cloud is optional and can be switched off in the configuration (default is on). Then no more data is sent to the Internet, but no more remote updates of firmware and operating parameters (e.g. rated power, grid parameters) are possible.
Through this, the device then establishes a connection to the proxy and the proxy establishes another connection to the TSUN Cloud. The transmitted data is interpreted by the proxy and then passed on to both the TSUN Cloud and the MQTT broker. The connection to the TSUN Cloud is optional and can be switched off in the configuration (default is on). Then no more data is sent to the Internet, but no more remote updates of firmware and operating parameters (e.g. rated power, grid parameters) are possible.
By means of `docker` a simple installation and operation is possible. By using `docker-compose`, a complete stack of proxy, `MQTT-broker` and `home-assistant` can be started easily.
Alternatively you can run the TSUN-Proxy as a Home Assistant Add-on. The installation of this add-on is pretty straightforward and not different in comparison to installing any other custom Home Assistant add-on.
Follow the Instructions mentioned in the add-on subdirectory `ha_addons`.
<br>
This project is not related to the company TSUN. It is a private initiative that aims to connect TSUN inverters with an MQTT broker. There is no support and no warranty from TSUN.
This project is not related to the company TSUN. It is a private initiative that aims to connect TSUN inverters and storage systems with an MQTT broker. There is no support and no warranty from TSUN.
<br><br>
```txt
❗An essential requirement is that the proxy can be looped into the connection
between the inverter and TSUN Cloud.
between the device and TSUN Cloud.
There are various ways to do this, for example via an DNS host entry or via firewall
rules (iptables) in your router. However, depending on the circumstances, not all
@@ -40,7 +50,8 @@ If you use a Pi-hole, you can also store the host entry in the Pi-hole.
## Features
- Supports TSUN GEN3 PLUS inverters: TSOL-MS2000, MS1800 and MS1600
- Supports TSUN GEN3 inverters: TSOL-MS800, MS700, MS600, MS400, MS350 and MS300
- Supports TSUN GEN3 PLUS batteries: TSOL-DC1000 (from version 0.13)
- Supports TSUN GEN3 inverters: TSOL-MS3000, MS800, MS700, MS600, MS400, MS350 and MS300
- `MQTT` support
- `Home-Assistant` auto-discovery support
- `MODBUS` support via MQTT topics
@@ -59,11 +70,20 @@ Here are some screenshots of how the inverter is displayed in the Home Assistant
## Requirements
### Requirements for Docker Installation
- A running Docker engine to host the container
- Ability to loop the proxy into the connection between the inverter and the TSUN cloud
- Ability to loop the proxy into the connection between the device and the TSUN cloud
### Requirements for Home Assistant Add-on Installation
- Running Home Assistant on Home Assistant OS or Supervised. Container and Core installations don't support add-ons.
- Ability to loop the proxy into the connection between the device and the TSUN cloud
# Getting Started
## for Docker Installation
To run the proxy, you first need to create the image. You can do this quite simply as follows:
```sh
@@ -89,15 +109,30 @@ With this information we can customize the `docker run`` statement:
docker run --dns '8.8.8.8' --env 'UID=1050' -p '5005:5005' -p '10000:10000' -v ./config:/home/tsun-proxy/config -v ./log:/home/tsun-proxy/log tsun-proxy
```
## for Home Assistant Add-on Installation
1. Add the repository URL to the Home Assistant add-on store
[![Add repository on my Home Assistant][repository-badge]][repository-url]
2. Reload the add-on store page
3. Click the "Install" button to install the add-on.
# Configuration
The configuration consists of several parts. First, the container and the proxy itself must be configured, and then the connection of the inverter to the proxy must be set up, which is done differently depending on the inverter generation
```txt
❗The following description applies to the Docker installation. When installing the Home
Assistant add-on, the configuration is carried out via the Home Assistant UI. Some of the
options described below are not required for this. Additionally, creating a config.toml
file is not necessary. However, for a general understanding of the configuration and
functionality, it is helpful to read the following description.
```
For GEN3PLUS inverters, this can be done easily via the web interface of the inverter. The GEN3 inverters do not have a web interface, so the proxy is integrated via a modified DNS resolution.
The configuration consists of several parts. First, the container and the proxy itself must be configured, and then the connection of the device to the proxy must be set up, which is done differently depending on the device generation
For GEN3PLUS devices, this can be done easily via the web interface of the devices. The GEN3 inverters do not have a web interface, so the proxy is integrated via a modified DNS resolution.
1. [Container Setup](#container-setup)
2. [Proxy Configuration](#proxy-configuration)
3. [Inverter Configuration](#inverter-configuration) (only GEN3PLUS)
3. [Inverter and Battery Configuration](#inverter-and-battery-configuration) (only GEN3PLUS)
4. [DNS Settings](#dns-settings) (Mandatory for GEN3)
## Container Setup
@@ -106,7 +141,7 @@ No special configuration is required for the Docker container if it is built and
On the host, two directories (for log files and for config files) must be mapped. If necessary, the UID of the proxy process can be adjusted, which is also the owner of the log and configuration files.
A description of the configuration parameters can be found [here](https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-details#docker-compose-environment-variables).
A description of the configuration parameters can be found [here](https://github.com/s-allius/tsun-gen3-proxy/wiki/configuration-env#docker-compose-environment-variables)
## Proxy Configuration
@@ -115,26 +150,63 @@ The proxy can be configured via the file 'config.toml'. When the proxy is starte
The configuration uses the TOML format, which aims to be easy to read due to obvious semantics.
You find more details here: <https://toml.io/en/v1.0.0>
<details>
<summary>Here is an example of a <b>config.toml</b> file</summary>
```toml
# configuration for tsun cloud for 'GEN3' inverters
tsun.enabled = true # false: disables connecting to the tsun cloud, and avoids updates
tsun.host = 'logger.talent-monitoring.com'
tsun.port = 5005
# configuration for solarman cloud for 'GEN3 PLUS' inverters
solarman.enabled = true # false: disables connecting to the tsun cloud, and avoids updates
solarman.host = 'iot.talent-monitoring.com'
solarman.port = 10000
##########################################################################################
###
### T S U N - G E N 3 - P R O X Y
###
### from Stefan Allius
###
##########################################################################################
###
### The readme will give you an overview of the project:
### https://s-allius.github.io/tsun-gen3-proxy/
###
### The proxy supports different operation modes. Select the proper mode
### which depends on your inverter type and your inverter firmware.
### Please read:
### https://github.com/s-allius/tsun-gen3-proxy/wiki/Operation-Modes-Overview
###
### Here you will find a description of all configuration options:
### https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml
###
### The configuration uses the TOML format, which aims to be easy to read due to
### obvious semantics. You find more details here: https://toml.io/en/v1.0.0
###
##########################################################################################
# mqtt broker configuration
##########################################################################################
##
## MQTT broker configuration
##
## In this block, you must configure the connection to your MQTT broker and specify the
## required credentials. As the proxy does not currently support an encrypted connection
## to the MQTT broker, it is strongly recommended that you do not use a public broker.
##
## https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml#mqtt-broker-account
##
mqtt.host = 'mqtt' # URL or IP address of the mqtt broker
mqtt.port = 1883
mqtt.user = ''
mqtt.passwd = ''
# home-assistant
##########################################################################################
##
## HOME ASSISTANT
##
## The proxy supports the MQTT autoconfiguration of Home Assistant (HA). The default
## values match the HA default configuration. If you need to change these or want to use
## a different MQTT client, you can adjust the prefixes of the MQTT topics below.
##
## https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml#home-assistant
##
ha.auto_conf_prefix = 'homeassistant' # MQTT prefix for subscribing for homeassistant status updates
ha.discovery_prefix = 'homeassistant' # MQTT prefix for discovery topic
ha.entity_prefix = 'tsun' # MQTT topic prefix for publishing inverter values
@@ -142,34 +214,142 @@ ha.proxy_node_id = 'proxy' # MQTT node id, for the proxy_node_i
ha.proxy_unique_id = 'P170000000000001' # MQTT unique id, to identify a proxy instance
# microinverters
inverters.allow_all = false # True: allow inverters, even if we have no inverter mapping
##########################################################################################
##
## GEN3 Proxy Mode Configuration
##
## In this block, you can configure an optional connection to the TSUN cloud for GEN3
## inverters. This connection is only required if you want to send data to the TSUN cloud
## to use the TSUN APPs or receive firmware updates.
##
## https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml#tsun-cloud-for-gen3-inverter-only
##
# inverter mapping, maps a `serial_no* to a `node_id` and defines an optional `suggested_area` for `home-assistant`
#
# for each inverter add a block starting with [inverters."<16-digit serial numbeer>"]
tsun.enabled = true # false: disables connecting to the tsun cloud, and avoids updates
tsun.host = 'logger.talent-monitoring.com'
tsun.port = 5005
##########################################################################################
##
## GEN3PLUS Proxy Mode Configuration
##
## In this block, you can configure an optional connection to the TSUN cloud for GEN3PLUS
## inverters. This connection is only required if you want to send data to the TSUN cloud
## to use the TSUN APPs or receive firmware updates.
##
## https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml#solarman-cloud-for-gen3plus-inverter-only
##
solarman.enabled = true # false: disables connecting to the tsun cloud, and avoids updates
solarman.host = 'iot.talent-monitoring.com'
solarman.port = 10000
##########################################################################################
###
### Inverter Definitions
###
### The proxy supports the simultaneous operation of several inverters, even of different
### types. A configuration block must be defined for each inverter, in which all necessary
### parameters must be specified. These depend on the operation mode used and also differ
### slightly depending on the inverter type.
###
### In addition, the PV modules can be defined at the individual inputs for documentation
### purposes, whereby these are displayed in Home Assistant.
###
### The proxy only accepts connections from known inverters. This can be switched off for
### test purposes and unknown serial numbers are also accepted.
###
inverters.allow_all = false # only allow known inverters
##########################################################################################
##
## For each GEN3 inverter, the serial number of the inverter must be mapped to an MQTT
## definition. To do this, the corresponding configuration block is started with
## `[Inverter.“<16-digit serial number>”]` so that all subsequent parameters are assigned
## to this inverter. Further inverter-specific parameters (e.g. polling mode) can be set
## in the configuration block
##
## The serial numbers of all GEN3 inverters start with `R17`!
##
[inverters."R17xxxxxxxxxxxx1"]
node_id = 'inv1' # Optional, MQTT replacement for inverters serial number
suggested_area = 'roof' # Optional, suggested installation area for home-assistant
node_id = 'inv_1' # MQTT replacement for inverters serial number
suggested_area = 'roof' # suggested installation place for home-assistant
modbus_polling = false # Disable optional MODBUS polling for GEN3 inverter
pv1 = {type = 'RSM40-8-395M', manufacturer = 'Risen'} # Optional, PV module descr
pv2 = {type = 'RSM40-8-395M', manufacturer = 'Risen'} # Optional, PV module descr
[inverters."R17xxxxxxxxxxxx2"]
node_id = 'inv2' # Optional, MQTT replacement for inverters serial number
suggested_area = 'balcony' # Optional, suggested installation area for home-assistant
pv1 = {type = 'RSM40-8-405M', manufacturer = 'Risen'} # Optional, PV module descr
pv2 = {type = 'RSM40-8-405M', manufacturer = 'Risen'} # Optional, PV module descr
##########################################################################################
##
## For each GEN3PLUS inverter, the serial number of the inverter must be mapped to an MQTT
## definition. To do this, the corresponding configuration block is started with
## `[Inverter.“<16-digit serial number>”]` so that all subsequent parameters are assigned
## to this inverter. Further inverter-specific parameters (e.g. polling mode, client mode)
## can be set in the configuration block
##
## The serial numbers of all GEN3PLUS inverters start with `Y17` or Y47! Each GEN3PLUS
## inverter is supplied with a “Monitoring SN:”. This can be found on a sticker enclosed
## with the inverter.
##
[inverters."Y17xxxxxxxxxxxx1"] # This block is also for inverters with a Y47 serial no
monitor_sn = 2000000000 # The "Monitoring SN:" can be found on a sticker enclosed with the inverter
node_id = 'inv_3' # MQTT replacement for inverters serial number
suggested_area = 'garage' # suggested installation place for home-assistant
monitor_sn = 2000000000 # The GEN3PLUS "Monitoring SN:"
node_id = 'inv_2' # MQTT replacement for inverters serial number
suggested_area = 'garage' # suggested installation place for home-assistant
modbus_polling = true # Enable optional MODBUS polling
# if your inverter supports SSL connections you must use the client_mode. Please uncomment
# the next line and configure the fixed IP of your inverter
#client_mode = {host = '192.168.0.1', port = 8899, forward = true}
pv1 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
pv2 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
pv3 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
pv4 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
##########################################################################################
##
## For each GEN3PLUS energy storage system, the serial number must be mapped to an MQTT
## definition. To do this, the corresponding configuration block is started with
## `[batteries.“<16-digit serial number>”]` so that all subsequent parameters are assigned
## to this energy storage system. Further device-specific parameters (e.g. polling mode,
## client mode) can be set in the configuration block
##
## The serial numbers of all GEN3PLUS energy storage systems/batteries start with `410`!
## Each GEN3PLUS device is supplied with a “Monitoring SN:”. This can be found on a
## sticker enclosed with the device.
##
[batteries."4100000000000001"]
monitor_sn = 3000000000 # The GEN3PLUS "Monitoring SN:"
node_id = 'bat_1' # MQTT replacement for devices serial number
suggested_area = 'garage' # suggested installation place for home-assistant
modbus_polling = true # Enable optional MODBUS polling
# if your device supports SSL connections you must use the client_mode. Please uncomment
# the next line and configure the fixed IP of your device
#client_mode = {host = '192.168.0.1', port = 8899, forward = true}
pv1 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
pv2 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
##########################################################################################
###
### If the proxy mode is configured, commands from TSUN can be sent to the inverter via
### this connection or parameters (e.g. network credentials) can be queried. Filters can
### then be configured for the AT+ commands from the TSUN Cloud so that only certain
### accesses are permitted.
###
### An overview of all known AT+ commands can be found here:
### https://github.com/s-allius/tsun-gen3-proxy/wiki/AT--commands
###
[gen3plus.at_acl]
tsun.allow = ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'] # allow this for TSUN access
tsun.block = []
@@ -178,27 +358,45 @@ mqtt.block = []
```
## Inverter Configuration
</details>
GEN3PLUS inverters offer a web interface that can be used to configure the inverter. This is very practical for sending the data directly to the proxy. On the one hand, the inverter broadcasts its own SSID on 2.4GHz. This can be recognized because it is broadcast with `AP_<Montoring SN>`. You will find the `Monitor SN` and the password for the WLAN connection on a small sticker enclosed with the inverter.
## Inverter and Battery Configuration
If you have already connected the inverter to the cloud via the TSUN app, you can also address the inverter directly via WiFi. In the first case, the inverter uses the fixed IP address `10.10.100.254`, in the second case you have to look up the IP address in your router.
GEN3PLUS devices (inverters, batteries, ...) offer a web interface that can be used to configure them. This is very practical for sending the data directly to the proxy. On the one hand, the device broadcasts its own SSID on 2.4GHz. This can be recognized because it is broadcast as `AP_<Monitoring SN>`. You will find the `Monitor SN` and the password for the WLAN connection on a small sticker enclosed with the device.
The standard web interface of the inverter can be accessed at `http://<ip-adress>/index_cn.html`. Here you can set up the WLAN connection or change the password. The default user and password is `admin`/`admin`.
If you have already connected the device to the cloud via the TSUN app, you can also address the device directly via WiFi. In the first case, the device uses the fixed IP address `10.10.100.254`, in the second case you have to look up the IP address in your router.
The standard web interface of the device can be accessed at `http://<ip-address>/index_cn.html`. Here you can set up the WLAN connection or change the password. The default user and password is `admin`/`admin`.
For our purpose, the hidden URL `http://<ip-address>/config_hide.html` should be opened. There you can see and modify the parameters for accessing the cloud. Here we enter the IP address of our proxy and the IP port `10000` for the `Server A Setting` and for `Optional Server Setting`. The second entry is used as a backup in the event of connection problems.
```txt
❗If the IP port is set to 10443 in the device configuration, you probably have a firmware with SSL support.
In this case, you MUST NOT change the port or the host address, as this may cause the device to hang and
require a complete reset. Use the configuration in client mode instead.
```
If access to the web interface does not work, it can also be redirected via DNS redirection, as is necessary for the GEN3 inverters.
## Client Mode (GEN3PLUS only)
Newer GEN3PLUS inverters, batteries and smart meters support SSL-encrypted connections over port 10443 to the TSUN cloud. In this case you can't loop the proxy into this connection, since the certificate verification of the device doesn't allow it. Instead, you can configure the proxy in client mode to establish an unencrypted connection to the device. For this purpose the device listens on port `8899`.
There are some requirements to be met:
- the device should have a fixed IP
- the proxy must be able to reach the device. You must configure a corresponding route in your router if the device and the proxy are in different IP networks
- add a `client_mode` line to your config.toml file to specify the device's IP address (see the sketch below)
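As a rough guide, the relevant config.toml fragment might look like the following; the serial number, `monitor_sn` and the host IP are placeholders, and the full commented example above shows the same `client_mode` line:

```toml
[inverters."Y17xxxxxxxxxxxx1"]
monitor_sn = 2000000000
node_id = 'inv_1'
modbus_polling = true
# client mode: the proxy connects out to the device on port 8899
client_mode = {host = '192.168.0.1', port = 8899, forward = true}
```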
## DNS Settings
### Loop the proxy into the connection
To include the proxy in the connection between the inverter and the TSUN Cloud, you must adapt the DNS record of *logger.talent-monitoring.com* within the network that your inverter uses. You need a mapping from logger.talent-monitoring.com to the IP address of the host running the Docker engine.
To include the proxy in the connection between the device and the TSUN Cloud, you must adapt the DNS record of *logger.talent-monitoring.com* within the network that your device uses. You need a mapping from logger.talent-monitoring.com to the IP address of the host running the Docker engine.
The new GEN3 PLUS inverters use a different URL. Here, *iot.talent-monitoring.com* must be redirected.
The new GEN3 PLUS devices use a different URL. Here, *iot.talent-monitoring.com* must be redirected.
This can be done, for example, by adding a local DNS record to the Pi-hole if you are using it.
This can be done, for example, by adding a local DNS record to the Pi-hole if you are using it. Users of the Home Assistant Add-on should use the AdGuard Add-on for this.
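For illustration, the override is nothing more than a host mapping in your local DNS resolver; the IP address below is a placeholder for your Docker host:

```txt
# local DNS override (e.g. Pi-hole "Local DNS Records" or a hosts-style entry)
192.168.0.2   logger.talent-monitoring.com   # GEN3
192.168.0.2   iot.talent-monitoring.com      # GEN3 PLUS
```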
### DNS Rebind Protection
@@ -214,7 +412,7 @@ As described above, set a DNS sever in the Docker command or Docker compose file
### Over The Air (OTA) firmware update
Even if the proxy is connected between the inverter and the TSUN Cloud, an OTA update is supported. To do this, the inverter must be able to reach the website <http://www.talent-monitoring.com:9002/> in order to download images from there.
Even if the proxy is connected between the device and the TSUN Cloud, an OTA update is supported. To do this, the device must be able to reach the website <http://www.talent-monitoring.com:9002/> in order to download images from there.
It must be ensured that this address is not mapped to the proxy!
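One way to double-check this (hypothetical commands, run from a machine that uses the same DNS as the device): the hostname must still resolve to its public address, and port 9002 must be reachable.

```sh
# must NOT resolve to the proxy/Docker host
nslookup www.talent-monitoring.com
# port 9002 must be reachable from the device's network
curl -I http://www.talent-monitoring.com:9002/
```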
@@ -226,22 +424,25 @@ In the following table you will find an overview of which inverter model has bee
A combination with a red question mark should work, but I have not checked it in detail.
<table align="center">
<tr><th align="center">Micro Inverter Model</th><th align="center">Fw. 1.00.06</th><th align="center">Fw. 1.00.17</th><th align="center">Fw. 1.00.20</th><th align="center">Fw. 4.0.10</th></tr>
<tr><td>GEN3 micro inverters (single MPPT):<br>MS300, MS350, MS400<br>MS400-D</td><td align="center"></td><td align="center"></td><td align="center"></td><td align="center"></td></tr>
<tr><td>GEN3 micro inverters (dual MPPT):<br>MS600, MS700, MS800<br>MS600-D, MS800-D</td><td align="center">✔️</td><td align="center">✔️</td><td align="center">✔️</td><td align="center"></td></tr>
<tr><td>GEN3 PLUS micro inverters:<br>MS1600, MS1800, MS2000<br>MS2000-D</td><td align="center"></td><td align="center"></td><td align="center"></td><td align="center">✔️</td></tr>
<tr><td>TITAN micro inverters:<br>TSOL-MP3000, MP2250, MS3000</td><td align="center"></td><td align="center"></td><td align="center"></td><td align="center"></td></tr>
<tr><th align="center">Micro Inverter Model</th><th align="center">Fw. 1.00.06</th><th align="center">Fw. 1.00.17</th><th align="center">Fw. 1.00.20</th><th align="center">Fw. 4.0.10</th><th align="center">Fw. 4.0.20</th></tr>
<tr><td>GEN3 micro inverters (single MPPT):<br>MS300, MS350, MS400<br>MS400-D</td><td align="center">✔️</td><td align="center">✔️</td><td align="center">✔️</td><td align="center"></td><td align="center"></td></tr>
<tr><td>GEN3 micro inverters (dual MPPT):<br>MS600, MS700, MS800<br>MS600-D, MS800-D</td><td align="center">✔️</td><td align="center">✔️</td><td align="center">✔️</td><td align="center"></td><td align="center"></td></tr>
<tr><td>GEN3 micro inverters (quad MPPT):<br>MS3000</td><td align="center">✔️</td><td align="center">✔️</td><td align="center">✔️</td><td align="center"></td><td align="center"></td></tr>
<tr><td>GEN3 PLUS micro inverters:<br>MS1600, MS1800, MS2000<br>MS2000-D, MS800</td><td align="center"></td><td align="center"></td><td align="center"></td><td align="center">✔️</td><td align="center">✔️</td></tr>
<tr><td>GEN3 PLUS storage systems:<br>DC1000</td><td align="center"></td><td align="center"></td><td align="center"></td><td align="center">✔️</td><td align="center">✔️</td></tr>
<tr><td>GEN3 PLUS smart meter:<br>TSOL-MG3-MS, DDZY422-D2</td><td align="center"></td><td align="center"></td><td align="center"></td><td align="center">❓</td><td align="center">❓</td></tr>
<tr><td>TITAN micro inverters:<br>TSOL-MP3000, MP2250</td><td align="center">❓</td><td align="center">❓</td><td align="center">❓</td><td align="center">❓</td><td align="center">❓</td></tr>
</table>
```txt
Legend
: Firmware not available for these devices
✔️: proxy support testet
❓: proxy support possible but not testet
✔️: Proxy support tested
❓: Proxy support unknown. There is an open port, but all known protocols do not work.
🚧: Proxy support in preparation
```
The new inverters of the GEN3 Plus generation (e.g. MS-2000) use a completely different protocol for data transmission to the TSUN server. These inverters are supported from proxy version 0.6. The serial numbers of these inverters start with `Y17E` or `Y47E` instead of `R17E`
❗GEN3 Plus generation devices (e.g. MS-2000, DC-1000) can be recognized by their serial number. This starts with 'Y17' or 'Y47' for inverters and '410' for the DC-1000 battery storage system. In contrast, the serial number of GEN3 inverters begins with 'R17' or 'R47'.
If you have one of these combinations with a red question mark, it would be very nice if you could send me a proxy trace so that I can carry out the detailed checks and adjust the device and system tests. [Ask here how to send a trace](https://github.com/s-allius/tsun-gen3-proxy/discussions/categories/traces-for-compatibility-check)
@@ -265,3 +466,6 @@ We're very happy to receive contributions to this project! You can get started b
## Changelog
The changelog lives in [CHANGELOG.md](https://github.com/s-allius/tsun-gen3-proxy/blob/main/CHANGELOG.md). It follows the principles of [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
[repository-badge]: https://img.shields.io/badge/Add%20repository%20to%20my-Home%20Assistant-41BDF5?logo=home-assistant&style=for-the-badge
[repository-url]: https://my.home-assistant.io/redirect/supervisor_add_addon_repository/?repository_url=https%3A%2F%2Fgithub.com%2Fs-allius%2Fha-addons

1
app/.version Normal file
View File

@@ -0,0 +1 @@
0.13.0

View File

@@ -4,14 +4,13 @@ ARG GID=1000
#
# first stage for our base image
FROM python:3.12-alpine AS base
USER root
FROM python:3.13-alpine AS base
COPY --chmod=0700 ./hardening_base.sh .
COPY --chmod=0700 ./hardening_base.sh /
RUN apk upgrade --no-cache && \
apk add --no-cache su-exec && \
./hardening_base.sh && \
rm ./hardening_base.sh
apk add --no-cache su-exec=0.2-r3 && \
/hardening_base.sh && \
rm /hardening_base.sh
#
# second stage for building wheels packages
@@ -19,8 +18,8 @@ FROM base AS builder
# copy the dependencies file to the root dir and install requirements
COPY ./requirements.txt /root/
RUN apk add --no-cache build-base && \
python -m pip install --no-cache-dir -U pip wheel && \
RUN apk add --no-cache build-base=0.5-r3 && \
python -m pip install --no-cache-dir pip==24.3.1 wheel==0.45.1 && \
python -OO -m pip wheel --no-cache-dir --wheel-dir=/root/wheels -r /root/requirements.txt
@@ -31,10 +30,9 @@ ARG SERVICE_NAME
ARG VERSION
ARG UID
ARG GID
ARG LOG_LVL
ARG LOG_LVL=INFO
ARG environment
ENV VERSION=$VERSION
ENV SERVICE_NAME=$SERVICE_NAME
ENV UID=$UID
ENV GID=$GID
@@ -51,9 +49,9 @@ VOLUME ["/home/$SERVICE_NAME/log", "/home/$SERVICE_NAME/config"]
# and uninstall python packages and the alpine package manager to reduce attack surface
COPY --from=builder /root/wheels /root/wheels
COPY --chmod=0700 ./hardening_final.sh .
RUN python -m pip install --no-cache --no-index /root/wheels/* && \
RUN python -m pip install --no-cache-dir --no-cache --no-index /root/wheels/* && \
rm -rf /root/wheels && \
python -m pip uninstall --yes setuptools wheel pip && \
python -m pip uninstall --yes wheel pip && \
apk --purge del apk-tools && \
./hardening_final.sh && \
rm ./hardening_final.sh
@@ -61,19 +59,11 @@ RUN python -m pip install --no-cache --no-index /root/wheels/* && \
# copy the content of the local src and config directory to the working directory
COPY --chmod=0700 entrypoint.sh /root/entrypoint.sh
COPY config .
COPY src .
RUN date > /build-date.txt
RUN echo ${VERSION} > /proxy-version.txt \
&& date > /build-date.txt
EXPOSE 5005 8127 10000
# command to run on container start
ENTRYPOINT ["/root/entrypoint.sh"]
CMD [ "python3", "./server.py" ]
LABEL org.opencontainers.image.title="TSUN Gen3 Proxy"
LABEL org.opencontainers.image.authors="Stefan Allius"
LABEL org.opencontainers.image.source=https://github.com/s-allius/tsun-gen3-proxy
LABEL org.opencontainers.image.description='This proxy enables a reliable connection between TSUN third generation inverters (eg. TSOL MS600, MS800, MS2000) and an MQTT broker to integrate the inverter into typical home automations.'
LABEL org.opencontainers.image.licenses="BSD-3-Clause"
LABEL org.opencontainers.image.vendor="Stefan Allius"

43
app/Makefile Normal file
View File

@@ -0,0 +1,43 @@
#!make
include ../.env
SHELL = /bin/sh
IMAGE = tsun-gen3-proxy
# Folders
SRC=.
export BUILD_DATE := ${shell date -Iminutes}
VERSION := $(shell cat $(SRC)/.version)
export MAJOR := $(shell echo $(VERSION) | cut -f1 -d.)
PUBLIC_URL := $(shell echo $(PUBLIC_CONTAINER_REGISTRY) | cut -f1 -d/)
PUBLIC_USER :=$(shell echo $(PUBLIC_CONTAINER_REGISTRY) | cut -f2 -d/)
dev debug:
@echo version: $(VERSION) build-date: $(BUILD_DATE) image: $(PRIVAT_CONTAINER_REGISTRY)$(IMAGE)
export VERSION=$(VERSION)-$@ && \
export IMAGE=$(PRIVAT_CONTAINER_REGISTRY)$(IMAGE) && \
docker buildx bake -f docker-bake.hcl $@
rc:
@[ "${RC}" ] || ( echo ">> RC is not set"; exit 1 )
@echo version: $(VERSION) build-date: $(BUILD_DATE) image: $(PUBLIC_CONTAINER_REGISTRY)$(IMAGE)
@echo login at $(PUBLIC_URL) as $(PUBLIC_USER)
@DO_LOGIN="$(shell echo $(PUBLIC_CR_KEY) | docker login $(PUBLIC_URL) -u $(PUBLIC_USER) --password-stdin)"
export VERSION=$(VERSION)-$@$(RC) && \
export IMAGE=$(PUBLIC_CONTAINER_REGISTRY)$(IMAGE) && \
docker buildx bake -f docker-bake.hcl $@
preview rel:
@echo version: $(VERSION) build-date: $(BUILD_DATE) image: $(PUBLIC_CONTAINER_REGISTRY)$(IMAGE)
@echo login at $(PUBLIC_URL) as $(PUBLIC_USER)
@DO_LOGIN="$(shell echo $(PUBLIC_CR_KEY) | docker login $(PUBLIC_URL) -u $(PUBLIC_USER) --password-stdin)"
export VERSION=$(VERSION)-$@ && \
export IMAGE=$(PUBLIC_CONTAINER_REGISTRY)$(IMAGE) && \
docker buildx bake -f docker-bake.hcl $@
.PHONY: debug dev preview rc rel

View File

@@ -1,55 +0,0 @@
#!/bin/bash
# Usage: ./build.sh [dev|rc|rel]
# dev: development build
# rc: release candidate build
# rel: release build and push to ghcr.io
# Note: for release build, you need to set GHCR_TOKEN
# export GHCR_TOKEN=<YOUR_GITHUB_TOKEN> in your .zprofile
# see also: https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-container-registry
set -e
BUILD_DATE=$(date -Iminutes)
BRANCH=$(git rev-parse --abbrev-ref HEAD)
VERSION=$(git describe --tags --abbrev=0)
VERSION="${VERSION:1}"
arr=(${VERSION//./ })
MAJOR=${arr[0]}
IMAGE=tsun-gen3-proxy
if [[ $1 == debug ]] || [[ $1 == dev ]] ;then
IMAGE=docker.io/sallius/${IMAGE}
VERSION=${VERSION}-$1
elif [[ $1 == rc ]] || [[ $1 == rel ]];then
IMAGE=ghcr.io/s-allius/${IMAGE}
else
echo argument missing!
echo try: $0 '[debug|dev|rc|rel]'
exit 1
fi
echo version: $VERSION build-date: $BUILD_DATE image: $IMAGE
if [[ $1 == debug ]];then
docker build --build-arg "VERSION=${VERSION}" --build-arg environment=dev --build-arg "LOG_LVL=DEBUG" --label "org.opencontainers.image.created=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" --label "org.opencontainers.image.revision=${BRANCH}" -t ${IMAGE}:debug app
elif [[ $1 == dev ]];then
docker build --build-arg "VERSION=${VERSION}" --build-arg environment=production --label "org.opencontainers.image.created=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" --label "org.opencontainers.image.revision=${BRANCH}" -t ${IMAGE}:dev app
elif [[ $1 == rc ]];then
docker build --build-arg "VERSION=${VERSION}" --build-arg environment=production --label "org.opencontainers.image.created=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" --label "org.opencontainers.image.revision=${BRANCH}" -t ${IMAGE}:rc -t ${IMAGE}:${VERSION} app
echo 'login to ghcr.io'
echo $GHCR_TOKEN | docker login ghcr.io -u s-allius --password-stdin
docker push -q ghcr.io/s-allius/tsun-gen3-proxy:rc
docker push -q ghcr.io/s-allius/tsun-gen3-proxy:${VERSION}
elif [[ $1 == rel ]];then
docker build --no-cache --build-arg "VERSION=${VERSION}" --build-arg environment=production --label "org.opencontainers.image.created=${BUILD_DATE}" --label "org.opencontainers.image.version=${VERSION}" --label "org.opencontainers.image.revision=${BRANCH}" -t ${IMAGE}:latest -t ${IMAGE}:${MAJOR} -t ${IMAGE}:${VERSION} app
echo 'login to ghcr.io'
echo $GHCR_TOKEN | docker login ghcr.io -u s-allius --password-stdin
docker push -q ghcr.io/s-allius/tsun-gen3-proxy:latest
docker push -q ghcr.io/s-allius/tsun-gen3-proxy:${MAJOR}
docker push -q ghcr.io/s-allius/tsun-gen3-proxy:${VERSION}
fi
echo 'check docker-compose.yaml file'
docker-compose config -q

View File

@@ -1,56 +0,0 @@
# configuration to reach tsun cloud
tsun.enabled = true # false: disables connecting to the tsun cloud, and avoids updates
tsun.host = 'logger.talent-monitoring.com'
tsun.port = 5005
# configuration to reach the new tsun cloud for G3 Plus inverters
solarman.enabled = true # false: disables connecting to the tsun cloud, and avoids updates
solarman.host = 'iot.talent-monitoring.com'
solarman.port = 10000
# mqtt broker configuration
mqtt.host = 'mqtt' # URL or IP address of the mqtt broker
mqtt.port = 1883
mqtt.user = ''
mqtt.passwd = ''
# home-assistant
ha.auto_conf_prefix = 'homeassistant' # MQTT prefix for subscribing to homeassistant status updates
ha.discovery_prefix = 'homeassistant' # MQTT prefix for discovery topic
ha.entity_prefix = 'tsun' # MQTT topic prefix for publishing inverter values
ha.proxy_node_id = 'proxy' # MQTT node id for the proxy instance
ha.proxy_unique_id = 'P170000000000001' # MQTT unique id, to identify a proxy instance
# microinverters
inverters.allow_all = true # allow inverters, even if we have no inverter mapping
# inverter mapping, maps a `serial_no` to a `mqtt_id` and defines an optional `suggested_area` for `home-assistant`
#
# for each inverter add a block starting with [inverters."<16-digit serial number>"]
[inverters."R170000000000001"]
#node_id = '' # Optional, MQTT replacement for the inverter's serial number
#suggested_area = '' # Optional, suggested installation area for home-assistant
#pv1 = {type = 'RSM40-8-395M', manufacturer = 'Risen'} # Optional, PV module descr
#pv2 = {type = 'RSM40-8-395M', manufacturer = 'Risen'} # Optional, PV module descr
#[inverters."R17xxxxxxxxxxxx2"]
#node_id = '' # Optional, MQTT replacement for the inverter's serial number
#suggested_area = '' # Optional, suggested installation area for home-assistant
#pv1 = {type = 'RSM40-8-405M', manufacturer = 'Risen'} # Optional, PV module descr
#pv2 = {type = 'RSM40-8-405M', manufacturer = 'Risen'} # Optional, PV module descr
[inverters."Y170000000000001"]
monitor_sn = 2000000000 # The "Monitoring SN:" can be found on a sticker enclosed with the inverter
#node_id = '' # Optional, MQTT replacement for the inverter's serial number
#suggested_area = '' # Optional, suggested installation place for home-assistant
#pv1 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
#pv2 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
#pv3 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
#pv4 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
[gen3plus.at_acl]
tsun.allow = ['AT+Z', 'AT+UPURL', 'AT+SUPDATE']
tsun.block = []
mqtt.allow = ['AT+']
mqtt.block = []

93
app/docker-bake.hcl Normal file

@@ -0,0 +1,93 @@
variable "IMAGE" {
default = "tsun-gen3-proxy"
}
variable "VERSION" {
default = "0.0.0"
}
variable "MAJOR" {
default = "0"
}
variable "BUILD_DATE" {
default = "dev"
}
variable "BRANCH" {
default = ""
}
variable "DESCRIPTION" {
default = "This proxy enables a reliable connection between TSUN third generation inverters (eg. TSOL MS600, MS800, MS2000) and an MQTT broker to integrate the inverter into typical home automations."
}
target "_common" {
context = "."
dockerfile = "Dockerfile"
args = {
VERSION = "${VERSION}"
environment = "production"
}
attest = [
"type =provenance,mode=max",
"type =sbom,generator=docker/scout-sbom-indexer:latest"
]
annotations = [
"index:org.opencontainers.image.title=TSUN Gen3 Proxy",
"index:org.opencontainers.image.authors=Stefan Allius",
"index:org.opencontainers.image.created=${BUILD_DATE}",
"index:org.opencontainers.image.version=${VERSION}",
"index:org.opencontainers.image.revision=${BRANCH}",
"index:org.opencontainers.image.description=${DESCRIPTION}",
"index:org.opencontainers.image.licenses=BSD-3-Clause",
"index:org.opencontainers.image.source=https://github.com/s-allius/tsun-gen3-proxy"
]
labels = {
"org.opencontainers.image.title" = "TSUN Gen3 Proxy"
"org.opencontainers.image.authors" = "Stefan Allius"
"org.opencontainers.image.created" = "${BUILD_DATE}"
"org.opencontainers.image.version" = "${VERSION}"
"org.opencontainers.image.revision" = "${BRANCH}"
"org.opencontainers.image.description" = "${DESCRIPTION}"
"org.opencontainers.image.licenses" = "BSD-3-Clause"
"org.opencontainers.image.source" = "https://github.com/s-allius/tsun-gen3-proxy"
}
output = [
"type=image,push=true"
]
no-cache = false
platforms = ["linux/amd64", "linux/arm64", "linux/arm/v7"]
}
target "_debug" {
args = {
LOG_LVL = "DEBUG"
environment = "dev"
}
}
target "_prod" {
args = {
}
}
target "debug" {
inherits = ["_common", "_debug"]
tags = ["${IMAGE}:debug"]
}
target "dev" {
inherits = ["_common"]
tags = ["${IMAGE}:dev"]
}
target "preview" {
inherits = ["_common", "_prod"]
tags = ["${IMAGE}:preview", "${IMAGE}:${VERSION}"]
}
target "rc" {
inherits = ["_common", "_prod"]
tags = ["${IMAGE}:rc", "${IMAGE}:${VERSION}"]
}
target "rel" {
inherits = ["_common", "_prod"]
tags = ["${IMAGE}:latest", "${IMAGE}:${MAJOR}", "${IMAGE}:${VERSION}"]
no-cache = true
}
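
The Makefile above feeds this bake file by exporting VERSION and IMAGE before calling the named target; invoked directly it looks roughly like this (values are illustrative, and pushing requires a prior docker login and a multi-platform capable builder):

export VERSION=0.13.0-rel
export MAJOR=0
export BUILD_DATE=$(date -Iminutes)
export BRANCH=$(git rev-parse --abbrev-ref HEAD)
export IMAGE=ghcr.io/s-allius/tsun-gen3-proxy
docker buildx bake -f docker-bake.hcl rel   # linux/amd64, arm64 and arm/v7; output type=image,push=true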

263
app/docu/proxy.svg Normal file

@@ -0,0 +1,263 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<!-- Generated by graphviz version 2.40.1 (20161225.0304)
-->
<!-- Title: G Pages: 1 -->
<svg width="634pt" height="966pt"
viewBox="0.00 0.00 634.00 966.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 962)">
<title>G</title>
<polygon fill="#ffffff" stroke="transparent" points="-4,4 -4,-962 630,-962 630,4 -4,4"/>
<!-- A0 -->
<g id="node1" class="node">
<title>A0</title>
<polygon fill="#fff8dc" stroke="#000000" points="200.1964,-934 91.8036,-934 91.8036,-898 206.1964,-898 206.1964,-928 200.1964,-934"/>
<polyline fill="none" stroke="#000000" points="200.1964,-934 200.1964,-928 "/>
<polyline fill="none" stroke="#000000" points="206.1964,-928 200.1964,-928 "/>
<text text-anchor="middle" x="149" y="-919" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">You can stick notes</text>
<text text-anchor="middle" x="149" y="-907" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">on diagrams too!</text>
</g>
<!-- A1 -->
<g id="node2" class="node">
<title>A1</title>
<polygon fill="none" stroke="#000000" points="224,-926 224,-958 340,-958 340,-926 224,-926"/>
<text text-anchor="start" x="233.649" y="-939" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;AbstractIterMeta&gt;&gt;</text>
<polygon fill="none" stroke="#000000" points="224,-906 224,-926 340,-926 340,-906 224,-906"/>
<polygon fill="none" stroke="#000000" points="224,-874 224,-906 340,-906 340,-874 224,-874"/>
<text text-anchor="start" x="260.61" y="-887" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__iter__()</text>
</g>
<!-- A4 -->
<g id="node5" class="node">
<title>A4</title>
<polygon fill="none" stroke="#000000" points="187,-726 187,-758 378,-758 378,-726 187,-726"/>
<text text-anchor="start" x="248.5965" y="-739" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;InverterIfc&gt;&gt;</text>
<polygon fill="none" stroke="#000000" points="187,-706 187,-726 378,-726 378,-706 187,-706"/>
<polygon fill="none" stroke="#000000" points="187,-650 187,-706 378,-706 378,-650 187,-650"/>
<text text-anchor="start" x="249.022" y="-687" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">healthy()&#45;&gt;bool</text>
<text text-anchor="start" x="196.7835" y="-675" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;disc(shutdown_started=False)</text>
<text text-anchor="start" x="228.044" y="-663" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;create_remote()</text>
</g>
<!-- A1&#45;&gt;A4 -->
<g id="edge1" class="edge">
<title>A1&#45;&gt;A4</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M282,-863.7744C282,-831.6663 282,-790.6041 282,-758.1476"/>
<polygon fill="none" stroke="#000000" points="278.5001,-863.8621 282,-873.8622 285.5001,-863.8622 278.5001,-863.8621"/>
</g>
<!-- A2 -->
<g id="node3" class="node">
<title>A2</title>
<polygon fill="none" stroke="#000000" points="450,-454 450,-498 572,-498 572,-454 450,-454"/>
<text text-anchor="start" x="501.277" y="-479" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Mqtt</text>
<text text-anchor="start" x="478.4815" y="-467" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;Singleton&gt;&gt;</text>
<polygon fill="none" stroke="#000000" points="450,-398 450,-454 572,-454 572,-398 450,-398"/>
<text text-anchor="start" x="468.4875" y="-435" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;static&gt;ha_restarts</text>
<text text-anchor="start" x="476.2665" y="-423" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;static&gt;__client</text>
<text text-anchor="start" x="459.8735" y="-411" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;static&gt;__cb_MqttIsUp</text>
<polygon fill="none" stroke="#000000" points="450,-354 450,-398 572,-398 572,-354 450,-354"/>
<text text-anchor="start" x="472.936" y="-379" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;publish()</text>
<text text-anchor="start" x="477.1045" y="-367" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;close()</text>
</g>
<!-- A3 -->
<g id="node4" class="node">
<title>A3</title>
<polygon fill="none" stroke="#000000" points="396,-792 396,-824 626,-824 626,-792 396,-792"/>
<text text-anchor="start" x="498.2215" y="-805" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Proxy</text>
<polygon fill="none" stroke="#000000" points="396,-676 396,-792 626,-792 626,-676 396,-676"/>
<text text-anchor="start" x="482.6545" y="-773" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;cls&gt;db_stat</text>
<text text-anchor="start" x="475.991" y="-761" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;cls&gt;entity_prfx</text>
<text text-anchor="start" x="466.826" y="-749" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;cls&gt;discovery_prfx</text>
<text text-anchor="start" x="466.262" y="-737" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;cls&gt;proxy_node_id</text>
<text text-anchor="start" x="462.373" y="-725" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;cls&gt;proxy_unique_id</text>
<text text-anchor="start" x="478.216" y="-713" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;cls&gt;mqtt:Mqtt</text>
<text text-anchor="start" x="480.4355" y="-689" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__ha_restarts</text>
<polygon fill="none" stroke="#000000" points="396,-584 396,-676 626,-676 626,-584 396,-584"/>
<text text-anchor="start" x="487.1145" y="-657" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">class_init()</text>
<text text-anchor="start" x="481.834" y="-645" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">class_close()</text>
<text text-anchor="start" x="453.484" y="-621" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;_cb_mqtt_is_up()</text>
<text text-anchor="start" x="405.697" y="-609" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;_register_proxy_stat_home_assistant()</text>
<text text-anchor="start" x="414.584" y="-597" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;_async_publ_mqtt_proxy_stat(key)</text>
</g>
<!-- A3&#45;&gt;A2 -->
<g id="edge9" class="edge">
<title>A3&#45;&gt;A2</title>
<path fill="none" stroke="#000000" d="M511,-571.373C511,-549.9571 511,-528.339 511,-508.5579"/>
<polygon fill="#000000" stroke="#000000" points="511.0001,-571.682 515,-577.6821 511,-583.682 507,-577.682 511.0001,-571.682"/>
<polygon fill="#000000" stroke="#000000" points="511,-498.392 515.5001,-508.3919 511,-503.392 511.0001,-508.392 511.0001,-508.392 511.0001,-508.392 511,-503.392 506.5001,-508.392 511,-498.392 511,-498.392"/>
</g>
<!-- A5 -->
<g id="node6" class="node">
<title>A5</title>
<polygon fill="none" stroke="#000000" points="214,-502 214,-534 405,-534 405,-502 214,-502"/>
<text text-anchor="start" x="281.16" y="-515" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InverterBase</text>
<polygon fill="none" stroke="#000000" points="214,-386 214,-502 405,-502 405,-386 214,-386"/>
<text text-anchor="start" x="290.3335" y="-483" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_registry</text>
<text text-anchor="start" x="278.9355" y="-471" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__ha_restarts</text>
<text text-anchor="start" x="299.497" y="-447" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">addr</text>
<text text-anchor="start" x="282.5505" y="-435" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">config_id:str</text>
<text text-anchor="start" x="255.8785" y="-423" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">prot_class:MessageProt</text>
<text text-anchor="start" x="270.053" y="-411" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">remote:StreamPtr</text>
<text text-anchor="start" x="275.332" y="-399" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">local:StreamPtr</text>
<polygon fill="none" stroke="#000000" points="214,-318 214,-386 405,-386 405,-318 214,-318"/>
<text text-anchor="start" x="276.022" y="-367" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">healthy()&#45;&gt;bool</text>
<text text-anchor="start" x="223.7835" y="-355" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;disc(shutdown_started=False)</text>
<text text-anchor="start" x="255.044" y="-343" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;create_remote()</text>
<text text-anchor="start" x="249.484" y="-331" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;async_publ_mqtt()</text>
</g>
<!-- A3&#45;&gt;A5 -->
<g id="edge7" class="edge">
<title>A3&#45;&gt;A5</title>
<path fill="none" stroke="#000000" d="M417.6791,-575.5683C407.6409,-561.7533 397.5008,-547.7982 387.6588,-534.2532"/>
<polygon fill="none" stroke="#000000" points="414.8649,-577.6495 423.5747,-583.682 420.5279,-573.5347 414.8649,-577.6495"/>
</g>
<!-- A4&#45;&gt;A5 -->
<g id="edge2" class="edge">
<title>A4&#45;&gt;A5</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M288.2719,-639.4228C291.3086,-608.1559 295.0373,-569.7639 298.491,-534.2034"/>
<polygon fill="none" stroke="#000000" points="284.7531,-639.4473 287.27,-649.7389 291.7203,-640.1241 284.7531,-639.4473"/>
</g>
<!-- A6 -->
<g id="node7" class="node">
<title>A6</title>
<polygon fill="none" stroke="#000000" points="365,-236 365,-268 465,-268 465,-236 365,-236"/>
<text text-anchor="start" x="392.4995" y="-249" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">StreamPtr</text>
<polygon fill="none" stroke="#000000" points="365,-216 365,-236 465,-236 465,-216 365,-216"/>
<polygon fill="none" stroke="#000000" points="365,-172 365,-216 465,-216 465,-172 365,-172"/>
<text text-anchor="start" x="374.7175" y="-197" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">stream:ProtocolIfc</text>
<text text-anchor="start" x="389.7185" y="-185" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ifc:AsyncIfc</text>
</g>
<!-- A5&#45;&gt;A6 -->
<g id="edge8" class="edge">
<title>A5&#45;&gt;A6</title>
<path fill="none" stroke="#000000" d="M364.6387,-317.872C371.8786,-303.802 379.0526,-289.86 385.6187,-277.0995"/>
<polygon fill="#000000" stroke="#000000" points="390.2846,-268.0318 389.7105,-278.9826 387.9969,-272.4777 385.7091,-276.9237 385.7091,-276.9237 385.7091,-276.9237 387.9969,-272.4777 381.7078,-274.8647 390.2846,-268.0318 390.2846,-268.0318"/>
<text text-anchor="middle" x="389.5069" y="-285.0166" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">2</text>
</g>
<!-- A7 -->
<g id="node8" class="node">
<title>A7</title>
<polygon fill="none" stroke="#000000" points="346.7314,-238 271.2686,-238 271.2686,-202 346.7314,-202 346.7314,-238"/>
<text text-anchor="middle" x="309" y="-217" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InverterG3</text>
</g>
<!-- A5&#45;&gt;A7 -->
<g id="edge5" class="edge">
<title>A5&#45;&gt;A7</title>
<path fill="none" stroke="#000000" d="M309,-307.7729C309,-280.5002 309,-254.684 309,-238.2013"/>
<polygon fill="none" stroke="#000000" points="305.5001,-307.872 309,-317.872 312.5001,-307.872 305.5001,-307.872"/>
</g>
<!-- A9 -->
<g id="node10" class="node">
<title>A9</title>
<polygon fill="none" stroke="#000000" points="102.9001,-238 21.0999,-238 21.0999,-202 102.9001,-202 102.9001,-238"/>
<text text-anchor="middle" x="62" y="-217" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InverterG3P</text>
</g>
<!-- A5&#45;&gt;A9 -->
<g id="edge6" class="edge">
<title>A5&#45;&gt;A9</title>
<path fill="none" stroke="#000000" d="M205.2667,-346.4637C174.3973,-321.9347 140.8582,-294.4156 111,-268 100.2971,-258.5312 88.8616,-247.3925 79.732,-238.23"/>
<polygon fill="none" stroke="#000000" points="203.462,-349.4991 213.4739,-352.965 207.8086,-344.0121 203.462,-349.4991"/>
</g>
<!-- A11 -->
<g id="node12" class="node">
<title>A11</title>
<polygon fill="none" stroke="#000000" points="458.6421,-36 369.3579,-36 369.3579,0 458.6421,0 458.6421,-36"/>
<text text-anchor="middle" x="414" y="-15" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;AsyncIfc&gt;&gt;</text>
</g>
<!-- A6&#45;&gt;A11 -->
<g id="edge11" class="edge">
<title>A6&#45;&gt;A11</title>
<path fill="none" stroke="#000000" d="M401.1633,-171.974C395.4982,-146.4565 391.0868,-114.547 395,-86 396.8468,-72.5276 400.661,-57.9618 404.3907,-45.7804"/>
<polygon fill="#000000" stroke="#000000" points="407.4587,-36.1851 408.6994,-47.0805 405.9359,-40.9476 404.4131,-45.71 404.4131,-45.71 404.4131,-45.71 405.9359,-40.9476 400.1269,-44.3395 407.4587,-36.1851 407.4587,-36.1851"/>
<text text-anchor="middle" x="409.9892" y="-53.0243" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">1</text>
</g>
<!-- A12 -->
<g id="node13" class="node">
<title>A12</title>
<polygon fill="none" stroke="#000000" points="502.0879,-122 403.9121,-122 403.9121,-86 502.0879,-86 502.0879,-122"/>
<text text-anchor="middle" x="453" y="-101" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;ProtocolIfc&gt;&gt;</text>
</g>
<!-- A6&#45;&gt;A12 -->
<g id="edge10" class="edge">
<title>A6&#45;&gt;A12</title>
<path fill="none" stroke="#000000" d="M430.7853,-171.8133C435.2329,-158.2365 439.9225,-143.9208 443.8408,-131.9595"/>
<polygon fill="#000000" stroke="#000000" points="447.0602,-122.132 448.2235,-133.036 445.5036,-126.8835 443.9471,-131.6351 443.9471,-131.6351 443.9471,-131.6351 445.5036,-126.8835 439.6707,-130.2341 447.0602,-122.132 447.0602,-122.132"/>
<text text-anchor="middle" x="449.4498" y="-138.9887" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">1</text>
</g>
<!-- A8 -->
<g id="node9" class="node">
<title>A8</title>
<polygon fill="#fff8dc" stroke="#000000" points="583.406,-248 482.594,-248 482.594,-192 589.406,-192 589.406,-242 583.406,-248"/>
<polyline fill="none" stroke="#000000" points="583.406,-248 583.406,-242 "/>
<polyline fill="none" stroke="#000000" points="589.406,-242 583.406,-242 "/>
<text text-anchor="middle" x="536" y="-235" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Creates an GEN3</text>
<text text-anchor="middle" x="536" y="-223" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">inverter instance</text>
<text text-anchor="middle" x="536" y="-211" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">with</text>
<text text-anchor="middle" x="536" y="-199" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">prot_class:Talent</text>
</g>
<!-- A7&#45;&gt;A8 -->
<g id="edge3" class="edge">
<title>A7&#45;&gt;A8</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M317.0491,-238.3283C325.9345,-256.0056 342.0793,-281.6949 365,-293 404.8598,-312.6598 424.0578,-310.2929 465,-293 486.6607,-283.8511 504.9784,-264.5049 517.5802,-248.0264"/>
</g>
<!-- A10 -->
<g id="node11" class="node">
<title>A10</title>
<polygon fill="#fff8dc" stroke="#000000" points="247.522,-248 120.478,-248 120.478,-192 253.522,-192 253.522,-242 247.522,-248"/>
<polyline fill="none" stroke="#000000" points="247.522,-248 247.522,-242 "/>
<polyline fill="none" stroke="#000000" points="253.522,-242 247.522,-242 "/>
<text text-anchor="middle" x="187" y="-235" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Creates an GEN3PLUS</text>
<text text-anchor="middle" x="187" y="-223" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">inverter instance</text>
<text text-anchor="middle" x="187" y="-211" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">with</text>
<text text-anchor="middle" x="187" y="-199" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">prot_class:SolarmanV5</text>
</g>
<!-- A9&#45;&gt;A10 -->
<g id="edge4" class="edge">
<title>A9&#45;&gt;A10</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M103.0156,-220C108.8114,-220 114.6072,-220 120.403,-220"/>
</g>
<!-- A12&#45;&gt;A11 -->
<g id="edge12" class="edge">
<title>A12&#45;&gt;A11</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M444.7291,-85.7616C439.4033,-74.0176 432.3824,-58.5355 426.396,-45.3349"/>
<polygon fill="#000000" stroke="#000000" points="422.259,-36.2121 430.4874,-43.4608 424.324,-40.7657 426.3891,-45.3194 426.3891,-45.3194 426.3891,-45.3194 424.324,-40.7657 422.2908,-47.1779 422.259,-36.2121 422.259,-36.2121"/>
<text text-anchor="middle" x="429.5451" y="-69.7445" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">use</text>
</g>
<!-- A13 -->
<g id="node14" class="node">
<title>A13</title>
<polygon fill="none" stroke="#000000" points="9,-454 9,-486 116,-486 116,-454 9,-454"/>
<text text-anchor="start" x="32.7695" y="-467" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ModbusConn</text>
<polygon fill="none" stroke="#000000" points="9,-386 9,-454 116,-454 116,-386 9,-386"/>
<text text-anchor="start" x="53.0515" y="-435" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">host</text>
<text text-anchor="start" x="53.887" y="-423" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">port</text>
<text text-anchor="start" x="52.497" y="-411" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">addr</text>
<text text-anchor="start" x="18.883" y="-399" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">stream:InverterG3P</text>
<polygon fill="none" stroke="#000000" points="9,-366 9,-386 116,-386 116,-366 9,-366"/>
</g>
<!-- A13&#45;&gt;A9 -->
<g id="edge13" class="edge">
<title>A13&#45;&gt;A9</title>
<path fill="none" stroke="#000000" d="M62,-365.8625C62,-327.1513 62,-278.6088 62,-248.4442"/>
<polygon fill="#000000" stroke="#000000" points="62,-238.2147 66.5001,-248.2147 62,-243.2147 62.0001,-248.2147 62.0001,-248.2147 62.0001,-248.2147 62,-243.2147 57.5001,-248.2148 62,-238.2147 62,-238.2147"/>
<text text-anchor="middle" x="70.4524" y="-253.3409" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">1</text>
<text text-anchor="middle" x="53.5476" y="-344.7363" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">has</text>
</g>
<!-- A14 -->
<g id="node15" class="node">
<title>A14</title>
<polygon fill="none" stroke="#000000" points="0,-714 0,-746 124,-746 124,-714 0,-714"/>
<text text-anchor="start" x="35.8835" y="-727" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ModbusTcp</text>
<polygon fill="none" stroke="#000000" points="0,-694 0,-714 124,-714 124,-694 0,-694"/>
<polygon fill="none" stroke="#000000" points="0,-662 0,-694 124,-694 124,-662 0,-662"/>
<text text-anchor="start" x="9.763" y="-675" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;modbus_loop()</text>
</g>
<!-- A14&#45;&gt;A13 -->
<g id="edge14" class="edge">
<title>A14&#45;&gt;A13</title>
<path fill="none" stroke="#000000" d="M62,-661.7778C62,-617.9184 62,-548.5387 62,-496.3736"/>
<polygon fill="#000000" stroke="#000000" points="62,-486.1827 66.5001,-496.1827 62,-491.1827 62.0001,-496.1827 62.0001,-496.1827 62.0001,-496.1827 62,-491.1827 57.5001,-496.1828 62,-486.1827 62,-486.1827"/>
<text text-anchor="middle" x="70.4524" y="-501.3089" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">*</text>
<text text-anchor="middle" x="53.5476" y="-640.6516" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">creates</text>
</g>
</g>
</svg>


36
app/docu/proxy.yuml Normal file

@@ -0,0 +1,36 @@
// {type:class}
// {direction:topDown}
// {generate:true}
[note: You can stick notes on diagrams too!{bg:cornsilk}]
[<<AbstractIterMeta>>||__iter__()]
[Mqtt;<<Singleton>>|<static>ha_restarts;<static>__client;<static>__cb_MqttIsUp|<async>publish();<async>close()]
[Proxy|<cls>db_stat;<cls>entity_prfx;<cls>discovery_prfx;<cls>proxy_node_id;<cls>proxy_unique_id;<cls>mqtt:Mqtt;;__ha_restarts|class_init();class_close();;<async>_cb_mqtt_is_up();<async>_register_proxy_stat_home_assistant();<async>_async_publ_mqtt_proxy_stat(key)]
[<<InverterIfc>>||healthy()->bool;<async>disc(shutdown_started=False);<async>create_remote();]
[<<AbstractIterMeta>>]^-.-[<<InverterIfc>>]
[InverterBase|_registry;__ha_restarts;;addr;config_id:str;prot_class:MessageProt;remote:StreamPtr;local:StreamPtr;|healthy()->bool;<async>disc(shutdown_started=False);<async>create_remote();<async>async_publ_mqtt()]
[StreamPtr||stream:ProtocolIfc;ifc:AsyncIfc]
[<<InverterIfc>>]^-.-[InverterBase]
[InverterG3]-[note: Creates a GEN3 inverter instance with prot_class:Talent{bg:cornsilk}]
[InverterG3P]-[note: Creates a GEN3PLUS inverter instance with prot_class:SolarmanV5{bg:cornsilk}]
[InverterBase]^[InverterG3]
[InverterBase]^[InverterG3P]
[Proxy]^[InverterBase]
[InverterBase]-2>[StreamPtr]
[Proxy]++->[Mqtt;<<Singleton>>]
[<<AsyncIfc>>]
[StreamPtr]-1>[<<ProtocolIfc>>]
[StreamPtr]-1>[<<AsyncIfc>>]
[<<ProtocolIfc>>]use-.->[<<AsyncIfc>>]
[ModbusConn|host;port;addr;stream:InverterG3P;|]has-1>[InverterG3P]
[ModbusTcp||<async>modbus_loop()]creates-*>[ModbusConn]

383
app/docu/proxy_2.svg Normal file

@@ -0,0 +1,383 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<!-- Generated by graphviz version 2.40.1 (20161225.0304)
-->
<!-- Title: G Pages: 1 -->
<svg width="548pt" height="2000pt"
viewBox="0.00 0.00 548.12 2000.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 1996)">
<title>G</title>
<polygon fill="#ffffff" stroke="transparent" points="-4,4 -4,-1996 544.1155,-1996 544.1155,4 -4,4"/>
<!-- A0 -->
<g id="node1" class="node">
<title>A0</title>
<polygon fill="#fff8dc" stroke="#000000" points="239.7476,-1972 141.4834,-1972 141.4834,-1928 245.7476,-1928 245.7476,-1966 239.7476,-1972"/>
<polyline fill="none" stroke="#000000" points="239.7476,-1972 239.7476,-1966 "/>
<polyline fill="none" stroke="#000000" points="245.7476,-1966 239.7476,-1966 "/>
<text text-anchor="middle" x="193.6155" y="-1959" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Example of</text>
<text text-anchor="middle" x="193.6155" y="-1947" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">instantiation for a</text>
<text text-anchor="middle" x="193.6155" y="-1935" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">GEN3 inverter!</text>
</g>
<!-- A1 -->
<g id="node2" class="node">
<title>A1</title>
<polygon fill="none" stroke="#000000" points="263.6155,-1960 263.6155,-1992 379.6155,-1992 379.6155,-1960 263.6155,-1960"/>
<text text-anchor="start" x="273.2645" y="-1973" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;AbstractIterMeta&gt;&gt;</text>
<polygon fill="none" stroke="#000000" points="263.6155,-1940 263.6155,-1960 379.6155,-1960 379.6155,-1940 263.6155,-1940"/>
<polygon fill="none" stroke="#000000" points="263.6155,-1908 263.6155,-1940 379.6155,-1940 379.6155,-1908 263.6155,-1908"/>
<text text-anchor="start" x="300.2255" y="-1921" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__iter__()</text>
</g>
<!-- A15 -->
<g id="node16" class="node">
<title>A15</title>
<polygon fill="none" stroke="#000000" points="276.6155,-1748 276.6155,-1780 366.6155,-1780 366.6155,-1748 276.6155,-1748"/>
<text text-anchor="start" x="286.322" y="-1761" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;ProtocolIfc&gt;&gt;</text>
<polygon fill="none" stroke="#000000" points="276.6155,-1716 276.6155,-1748 366.6155,-1748 366.6155,-1716 276.6155,-1716"/>
<text text-anchor="start" x="302.449" y="-1729" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_registry</text>
<polygon fill="none" stroke="#000000" points="276.6155,-1684 276.6155,-1716 366.6155,-1716 366.6155,-1684 276.6155,-1684"/>
<text text-anchor="start" x="306.618" y="-1697" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A1&#45;&gt;A15 -->
<g id="edge15" class="edge">
<title>A1&#45;&gt;A15</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M321.6155,-1897.756C321.6155,-1862.0883 321.6155,-1815.1755 321.6155,-1780.3644"/>
<polygon fill="none" stroke="#000000" points="318.1156,-1897.9674 321.6155,-1907.9674 325.1156,-1897.9674 318.1156,-1897.9674"/>
</g>
<!-- A2 -->
<g id="node3" class="node">
<title>A2</title>
<polygon fill="none" stroke="#000000" points="77.6155,-662 77.6155,-694 175.6155,-694 175.6155,-662 77.6155,-662"/>
<text text-anchor="start" x="98.2755" y="-675" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InverterBase</text>
<polygon fill="none" stroke="#000000" points="77.6155,-606 77.6155,-662 175.6155,-662 175.6155,-606 77.6155,-606"/>
<text text-anchor="start" x="116.6125" y="-643" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">addr</text>
<text text-anchor="start" x="87.1685" y="-631" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">remote:StreamPtr</text>
<text text-anchor="start" x="92.4475" y="-619" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">local:StreamPtr</text>
<polygon fill="none" stroke="#000000" points="77.6155,-550 77.6155,-606 175.6155,-606 175.6155,-550 77.6155,-550"/>
<text text-anchor="start" x="91.0575" y="-587" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">create_remote()</text>
<text text-anchor="start" x="111.618" y="-563" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A3 -->
<g id="node4" class="node">
<title>A3</title>
<polygon fill="none" stroke="#000000" points="75.3469,-320 -.1159,-320 -.1159,-284 75.3469,-284 75.3469,-320"/>
<text text-anchor="middle" x="37.6155" y="-299" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InverterG3</text>
</g>
<!-- A2&#45;&gt;A3 -->
<g id="edge1" class="edge">
<title>A2&#45;&gt;A3</title>
<path fill="none" stroke="#000000" d="M103.8,-539.9668C83.1352,-465.6664 54.2132,-361.677 42.6665,-320.1609"/>
<polygon fill="none" stroke="#000000" points="100.4796,-541.0903 106.5312,-549.7868 107.2236,-539.2146 100.4796,-541.0903"/>
</g>
<!-- A4 -->
<g id="node5" class="node">
<title>A4</title>
<polygon fill="none" stroke="#000000" points="189.9521,-320 93.2789,-320 93.2789,-284 189.9521,-284 189.9521,-320"/>
<text text-anchor="middle" x="141.6155" y="-299" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">local:StreamPtr</text>
</g>
<!-- A2&#45;&gt;A4 -->
<g id="edge2" class="edge">
<title>A2&#45;&gt;A4</title>
<path fill="none" stroke="#000000" d="M130.5679,-537.6831C133.7849,-469.0527 138.1335,-376.283 140.2896,-330.2853"/>
<polygon fill="#000000" stroke="#000000" points="130.5625,-537.7999 134.2771,-543.9807 130.0005,-549.7868 126.2859,-543.606 130.5625,-537.7999"/>
<polygon fill="#000000" stroke="#000000" points="140.7642,-320.1609 144.7909,-330.3606 140.53,-325.1554 140.2959,-330.1499 140.2959,-330.1499 140.2959,-330.1499 140.53,-325.1554 135.8008,-329.9391 140.7642,-320.1609 140.7642,-320.1609"/>
</g>
<!-- A5 -->
<g id="node6" class="node">
<title>A5</title>
<polygon fill="none" stroke="#000000" points="315.0096,-320 208.2214,-320 208.2214,-284 315.0096,-284 315.0096,-320"/>
<text text-anchor="middle" x="261.6155" y="-299" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">remote:StreamPtr</text>
</g>
<!-- A2&#45;&gt;A5 -->
<g id="edge3" class="edge">
<title>A2&#45;&gt;A5</title>
<path fill="none" stroke="#000000" d="M157.861,-538.7126C166.3035,-516.9056 175.6035,-493.4873 184.6155,-472 205.944,-421.1467 232.9474,-362.7699 248.6524,-329.3512"/>
<polygon fill="#000000" stroke="#000000" points="157.8454,-538.7533 159.4203,-545.7903 153.5304,-549.9506 151.9554,-542.9136 157.8454,-538.7533"/>
<polygon fill="#000000" stroke="#000000" points="252.9567,-320.2155 252.7653,-331.1797 250.8256,-324.7387 248.6945,-329.2618 248.6945,-329.2618 248.6945,-329.2618 250.8256,-324.7387 244.6237,-327.3438 252.9567,-320.2155 252.9567,-320.2155"/>
</g>
<!-- A9 -->
<g id="node10" class="node">
<title>A9</title>
<polygon fill="none" stroke="#000000" points="128.6155,-100 128.6155,-132 306.6155,-132 306.6155,-100 128.6155,-100"/>
<text text-anchor="start" x="173.167" y="-113" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">AsyncStreamServer</text>
<polygon fill="none" stroke="#000000" points="128.6155,-68 128.6155,-100 306.6155,-100 306.6155,-68 128.6155,-68"/>
<text text-anchor="start" x="185.3865" y="-81" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">create_remote</text>
<polygon fill="none" stroke="#000000" points="128.6155,0 128.6155,-68 306.6155,-68 306.6155,0 128.6155,0"/>
<text text-anchor="start" x="169.273" y="-49" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;server_loop()</text>
<text text-anchor="start" x="160.104" y="-37" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;_async_forward()</text>
<text text-anchor="start" x="138.4245" y="-25" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;publish_outstanding_mqtt()</text>
<text text-anchor="start" x="202.618" y="-13" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A4&#45;&gt;A9 -->
<g id="edge9" class="edge">
<title>A4&#45;&gt;A9</title>
<path fill="none" stroke="#000000" d="M151.2355,-272.1274C161.7441,-239.4955 178.9835,-185.9626 193.26,-141.6303"/>
<polygon fill="#000000" stroke="#000000" points="151.1313,-272.451 153.0996,-279.3883 147.4529,-283.8733 145.4847,-276.936 151.1313,-272.451"/>
<polygon fill="#000000" stroke="#000000" points="196.3509,-132.0321 197.5689,-142.9302 194.8182,-136.7914 193.2855,-141.5507 193.2855,-141.5507 193.2855,-141.5507 194.8182,-136.7914 189.0022,-140.1713 196.3509,-132.0321 196.3509,-132.0321"/>
</g>
<!-- A10 -->
<g id="node11" class="node">
<title>A10</title>
<polygon fill="none" stroke="#000000" points="362.6155,-82 362.6155,-114 500.6155,-114 500.6155,-82 362.6155,-82"/>
<text text-anchor="start" x="389.1125" y="-95" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">AsyncStreamClient</text>
<polygon fill="none" stroke="#000000" points="362.6155,-62 362.6155,-82 500.6155,-82 500.6155,-62 362.6155,-62"/>
<polygon fill="none" stroke="#000000" points="362.6155,-18 362.6155,-62 500.6155,-62 500.6155,-18 362.6155,-18"/>
<text text-anchor="start" x="385.4935" y="-43" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;client_loop()</text>
<text text-anchor="start" x="372.4395" y="-31" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;_async_forward())</text>
</g>
<!-- A5&#45;&gt;A10 -->
<g id="edge11" class="edge">
<title>A5&#45;&gt;A10</title>
<path fill="none" stroke="#000000" d="M269.2148,-283.6405C279.6962,-259.3121 299.996,-215.5582 323.6155,-182 338.3046,-161.1299 356.5265,-140.1557 373.793,-121.8925"/>
<polygon fill="#000000" stroke="#000000" points="381.1214,-114.2395 377.4553,-124.5745 377.6632,-117.8508 374.2051,-121.4621 374.2051,-121.4621 374.2051,-121.4621 377.6632,-117.8508 370.9549,-118.3498 381.1214,-114.2395 381.1214,-114.2395"/>
<text text-anchor="middle" x="268.7308" y="-260.6464" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">0..1</text>
</g>
<!-- A6 -->
<g id="node7" class="node">
<title>A6</title>
<polygon fill="none" stroke="#000000" points="396.6155,-1114 396.6155,-1146 513.6155,-1146 513.6155,-1114 396.6155,-1114"/>
<text text-anchor="start" x="424.5445" y="-1127" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;AsyncIfc&gt;&gt;</text>
<polygon fill="none" stroke="#000000" points="396.6155,-1094 396.6155,-1114 513.6155,-1114 513.6155,-1094 396.6155,-1094"/>
<polygon fill="none" stroke="#000000" points="396.6155,-822 396.6155,-1094 513.6155,-1094 513.6155,-822 396.6155,-822"/>
<text text-anchor="start" x="424.5515" y="-1075" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">set_node_id()</text>
<text text-anchor="start" x="422.8815" y="-1063" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">get_conn_no()</text>
<text text-anchor="start" x="436.779" y="-1039" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_add()</text>
<text text-anchor="start" x="434.5595" y="-1027" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_flush()</text>
<text text-anchor="start" x="438.169" y="-1015" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_get()</text>
<text text-anchor="start" x="434.279" y="-1003" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_peek()</text>
<text text-anchor="start" x="438.449" y="-991" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_log()</text>
<text text-anchor="start" x="434.2845" y="-979" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_clear()</text>
<text text-anchor="start" x="438.449" y="-967" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_len()</text>
<text text-anchor="start" x="432.89" y="-943" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">fwd_add()</text>
<text text-anchor="start" x="434.56" y="-931" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">fwd_log()</text>
<text text-anchor="start" x="437.894" y="-919" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_get()</text>
<text text-anchor="start" x="434.004" y="-907" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_peek()</text>
<text text-anchor="start" x="438.174" y="-895" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_log()</text>
<text text-anchor="start" x="434.0095" y="-883" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_clear()</text>
<text text-anchor="start" x="438.174" y="-871" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_len()</text>
<text text-anchor="start" x="430.1145" y="-859" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_set_cb()</text>
<text text-anchor="start" x="406.495" y="-835" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">prot_set_timeout_cb()</text>
</g>
<!-- A7 -->
<g id="node8" class="node">
<title>A7</title>
<polygon fill="none" stroke="#000000" points="447.6155,-652 447.6155,-684 540.6155,-684 540.6155,-652 447.6155,-652"/>
<text text-anchor="start" x="465.7795" y="-665" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">AsyncIfcImpl</text>
<polygon fill="none" stroke="#000000" points="447.6155,-560 447.6155,-652 540.6155,-652 540.6155,-560 447.6155,-560"/>
<text text-anchor="start" x="457.1635" y="-633" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">fwd_fifo:ByteFifo</text>
<text text-anchor="start" x="461.0525" y="-621" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_fifo:ByteFifo</text>
<text text-anchor="start" x="460.7775" y="-609" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_fifo:ByteFifo</text>
<text text-anchor="start" x="460.2115" y="-597" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">conn_no:Count</text>
<text text-anchor="start" x="476.329" y="-585" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">node_id</text>
<text text-anchor="start" x="469.665" y="-573" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">timeout_cb</text>
</g>
<!-- A6&#45;&gt;A7 -->
<g id="edge4" class="edge">
<title>A6&#45;&gt;A7</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M473.1735,-811.7434C478.1009,-766.0069 483.1088,-719.5241 486.9345,-684.013"/>
<polygon fill="none" stroke="#000000" points="469.682,-811.4771 472.0907,-821.7945 476.6418,-812.227 469.682,-811.4771"/>
</g>
<!-- A8 -->
<g id="node9" class="node">
<title>A8</title>
<polygon fill="none" stroke="#000000" points="418.6155,-390 418.6155,-422 520.6155,-422 520.6155,-390 418.6155,-390"/>
<text text-anchor="start" x="439.8895" y="-403" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">AsyncStream</text>
<polygon fill="none" stroke="#000000" points="418.6155,-310 418.6155,-390 520.6155,-390 520.6155,-310 418.6155,-310"/>
<text text-anchor="start" x="455.1685" y="-371" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">reader</text>
<text text-anchor="start" x="457.3985" y="-359" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">writer</text>
<text text-anchor="start" x="459.6125" y="-347" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">addr</text>
<text text-anchor="start" x="455.1685" y="-335" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">r_addr</text>
<text text-anchor="start" x="455.7235" y="-323" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">l_addr</text>
<polygon fill="none" stroke="#000000" points="418.6155,-182 418.6155,-310 520.6155,-310 520.6155,-182 418.6155,-182"/>
<text text-anchor="start" x="441.2695" y="-279" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;loop</text>
<text text-anchor="start" x="457.3975" y="-267" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">disc()</text>
<text text-anchor="start" x="454.618" y="-255" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
<text text-anchor="start" x="450.1695" y="-243" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">healthy()</text>
<text text-anchor="start" x="434.886" y="-219" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__async_read()</text>
<text text-anchor="start" x="434.3365" y="-207" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__async_write()</text>
<text text-anchor="start" x="428.2225" y="-195" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__async_forward()</text>
</g>
<!-- A7&#45;&gt;A8 -->
<g id="edge5" class="edge">
<title>A7&#45;&gt;A8</title>
<path fill="none" stroke="#000000" d="M488.1838,-549.5774C485.3646,-511.9877 481.8463,-465.0771 478.6327,-422.2295"/>
<polygon fill="none" stroke="#000000" points="484.7214,-550.2112 488.9596,-559.9214 491.7018,-549.6876 484.7214,-550.2112"/>
</g>
<!-- A8&#45;&gt;A9 -->
<g id="edge6" class="edge">
<title>A8&#45;&gt;A9</title>
<path fill="none" stroke="#000000" d="M412.0877,-185.8777C410.9473,-184.56 409.7899,-183.2666 408.6155,-182 380.1271,-151.2753 341.6819,-125.829 306.7513,-106.6759"/>
<polygon fill="none" stroke="#000000" points="409.4058,-188.1271 418.4338,-193.672 414.834,-183.7074 409.4058,-188.1271"/>
</g>
<!-- A8&#45;&gt;A10 -->
<g id="edge7" class="edge">
<title>A8&#45;&gt;A10</title>
<path fill="none" stroke="#000000" d="M448.6523,-171.8077C445.3431,-151.2556 442.1142,-131.2022 439.3729,-114.1772"/>
<polygon fill="none" stroke="#000000" points="445.2363,-172.6095 450.2815,-181.9259 452.1472,-171.4966 445.2363,-172.6095"/>
</g>
<!-- A11 -->
<g id="node12" class="node">
<title>A11</title>
<polygon fill="none" stroke="#000000" points="193.6155,-740 193.6155,-772 307.6155,-772 307.6155,-740 193.6155,-740"/>
<text text-anchor="start" x="236.7235" y="-753" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Talent</text>
<polygon fill="none" stroke="#000000" points="193.6155,-600 193.6155,-740 307.6155,-740 307.6155,-600 193.6155,-600"/>
<text text-anchor="start" x="231.4385" y="-721" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">conn_no</text>
<text text-anchor="start" x="240.6125" y="-709" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">addr</text>
<text text-anchor="start" x="203.3785" y="-685" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">await_conn_resp_cnt</text>
<text text-anchor="start" x="238.393" y="-673" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">id_str</text>
<text text-anchor="start" x="219.2155" y="-661" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">contact_name</text>
<text text-anchor="start" x="222.5555" y="-649" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">contact_mail</text>
<text text-anchor="start" x="226.16" y="-637" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">db:InfosG3</text>
<text text-anchor="start" x="224.4995" y="-625" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb:Modbus</text>
<text text-anchor="start" x="236.7275" y="-613" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">switch</text>
<polygon fill="none" stroke="#000000" points="193.6155,-472 193.6155,-600 307.6155,-600 307.6155,-472 193.6155,-472"/>
<text text-anchor="start" x="208.108" y="-581" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_contact_info()</text>
<text text-anchor="start" x="210.048" y="-569" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_ota_update()</text>
<text text-anchor="start" x="215.892" y="-557" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_get_time()</text>
<text text-anchor="start" x="203.944" y="-545" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_collector_data()</text>
<text text-anchor="start" x="205.889" y="-533" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_inverter_data()</text>
<text text-anchor="start" x="215.056" y="-521" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_unknown()</text>
<text text-anchor="start" x="231.1695" y="-497" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">healthy()</text>
<text text-anchor="start" x="235.618" y="-485" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A11&#45;&gt;A4 -->
<g id="edge8" class="edge">
<title>A11&#45;&gt;A4</title>
<path fill="none" stroke="#000000" d="M196.2254,-462.3225C179.1579,-412.2162 162.0761,-362.0677 151.6753,-331.5332"/>
<polygon fill="#000000" stroke="#000000" points="199.4666,-471.8382 191.9826,-463.8233 197.8544,-467.1053 196.2422,-462.3723 196.2422,-462.3723 196.2422,-462.3723 197.8544,-467.1053 200.5019,-460.9213 199.4666,-471.8382 199.4666,-471.8382"/>
<polygon fill="#000000" stroke="#000000" points="151.6435,-331.4398 145.9225,-327.05 147.7742,-320.0807 153.4952,-324.4705 151.6435,-331.4398"/>
</g>
<!-- A11&#45;&gt;A5 -->
<g id="edge10" class="edge">
<title>A11&#45;&gt;A5</title>
<path fill="none" stroke="#000000" d="M256.1287,-461.6172C258.0803,-404.8425 260.0297,-348.132 260.994,-320.0807"/>
<polygon fill="#000000" stroke="#000000" points="255.7773,-471.8382 251.6236,-461.6895 255.9491,-466.8412 256.121,-461.8441 256.121,-461.8441 256.121,-461.8441 255.9491,-466.8412 260.6183,-461.9988 255.7773,-471.8382 255.7773,-471.8382"/>
<text text-anchor="middle" x="268.8186" y="-335.4866" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">0..1</text>
</g>
<!-- A13 -->
<g id="node14" class="node">
<title>A13</title>
<polygon fill="none" stroke="#000000" points="333.6155,-318 333.6155,-350 400.6155,-350 400.6155,-318 333.6155,-318"/>
<text text-anchor="start" x="349.6085" y="-331" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InfosG3</text>
<polygon fill="none" stroke="#000000" points="333.6155,-298 333.6155,-318 400.6155,-318 400.6155,-298 333.6155,-298"/>
<polygon fill="none" stroke="#000000" points="333.6155,-254 333.6155,-298 400.6155,-298 400.6155,-254 333.6155,-254"/>
<text text-anchor="start" x="343.4995" y="-279" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_confs()</text>
<text text-anchor="start" x="351.2835" y="-267" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">parse()</text>
</g>
<!-- A11&#45;&gt;A13 -->
<g id="edge13" class="edge">
<title>A11&#45;&gt;A13</title>
<path fill="none" stroke="#000000" d="M305.5203,-471.8728C311.6394,-455.0532 317.7699,-438.1631 323.6155,-422 330.9569,-401.7009 338.9463,-379.4498 346.0242,-359.681"/>
<polygon fill="#000000" stroke="#000000" points="349.4187,-350.1951 350.2862,-361.1266 347.734,-354.9028 346.0494,-359.6104 346.0494,-359.6104 346.0494,-359.6104 347.734,-354.9028 341.8125,-358.0942 349.4187,-350.1951 349.4187,-350.1951"/>
</g>
<!-- A12 -->
<g id="node13" class="node">
<title>A12</title>
<polygon fill="none" stroke="#000000" points="326.6155,-710 326.6155,-742 429.6155,-742 429.6155,-710 326.6155,-710"/>
<text text-anchor="start" x="367.2775" y="-723" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Infos</text>
<polygon fill="none" stroke="#000000" points="326.6155,-654 326.6155,-710 429.6155,-710 429.6155,-654 326.6155,-654"/>
<text text-anchor="start" x="370.057" y="-691" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">stat</text>
<text text-anchor="start" x="345.6015" y="-679" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">new_stat_data</text>
<text text-anchor="start" x="359.219" y="-667" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">info_dev</text>
<polygon fill="none" stroke="#000000" points="326.6155,-502 326.6155,-654 429.6155,-654 429.6155,-502 326.6155,-502"/>
<text text-anchor="start" x="353.951" y="-635" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">static_init()</text>
<text text-anchor="start" x="352" y="-623" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">dev_value()</text>
<text text-anchor="start" x="348.946" y="-611" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">inc_counter()</text>
<text text-anchor="start" x="347.276" y="-599" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">dec_counter()</text>
<text text-anchor="start" x="345.3255" y="-587" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_proxy_conf</text>
<text text-anchor="start" x="360.3285" y="-575" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_conf</text>
<text text-anchor="start" x="353.1095" y="-563" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_remove</text>
<text text-anchor="start" x="354.49" y="-551" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">update_db</text>
<text text-anchor="start" x="338.6525" y="-539" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">set_db_def_value</text>
<text text-anchor="start" x="348.101" y="-527" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">get_db_value</text>
<text text-anchor="start" x="336.438" y="-515" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ignore_this_device</text>
</g>
<!-- A12&#45;&gt;A13 -->
<g id="edge12" class="edge">
<title>A12&#45;&gt;A13</title>
<path fill="none" stroke="#000000" d="M373.1357,-491.6786C371.4196,-441.7544 369.5661,-387.8351 368.2756,-350.293"/>
<polygon fill="none" stroke="#000000" points="369.6466,-492.0596 373.4882,-501.9334 376.6425,-491.819 369.6466,-492.0596"/>
</g>
<!-- A14 -->
<g id="node15" class="node">
<title>A14</title>
<polygon fill="none" stroke="#000000" points="297.6155,-1524 297.6155,-1556 446.6155,-1556 446.6155,-1524 297.6155,-1524"/>
<text text-anchor="start" x="351.833" y="-1537" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Message</text>
<polygon fill="none" stroke="#000000" points="297.6155,-1300 297.6155,-1524 446.6155,-1524 446.6155,-1300 297.6155,-1300"/>
<text text-anchor="start" x="335.442" y="-1505" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">server_side:bool</text>
<text text-anchor="start" x="345.9995" y="-1493" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb:Modbus</text>
<text text-anchor="start" x="346.834" y="-1481" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ifc:AsyncIfc</text>
<text text-anchor="start" x="354.329" y="-1469" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">node_id</text>
<text text-anchor="start" x="332.6585" y="-1457" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">header_valid:bool</text>
<text text-anchor="start" x="347.1055" y="-1445" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">header_len</text>
<text text-anchor="start" x="352.9395" y="-1433" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">data_len</text>
<text text-anchor="start" x="350.44" y="-1421" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">unique_id</text>
<text text-anchor="start" x="344.3305" y="-1409" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">sug_area:str</text>
<text text-anchor="start" x="341.2715" y="-1397" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">new_data:dict</text>
<text text-anchor="start" x="348.2155" y="-1385" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">state:State</text>
<text text-anchor="start" x="321.82" y="-1373" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">shutdown_started:bool</text>
<text text-anchor="start" x="341" y="-1361" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">modbus_elms</text>
<text text-anchor="start" x="337.1225" y="-1349" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb_timer:Timer</text>
<text text-anchor="start" x="346.0005" y="-1337" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb_timeout</text>
<text text-anchor="start" x="335.168" y="-1325" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb_first_timeout</text>
<text text-anchor="start" x="326.2695" y="-1313" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">modbus_polling:bool</text>
<polygon fill="none" stroke="#000000" points="297.6155,-1196 297.6155,-1300 446.6155,-1300 446.6155,-1196 297.6155,-1196"/>
<text text-anchor="start" x="321" y="-1281" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_set_mqtt_timestamp()</text>
<text text-anchor="start" x="349.6155" y="-1269" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_timeout()</text>
<text text-anchor="start" x="322.383" y="-1257" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_send_modbus_cmd()</text>
<text text-anchor="start" x="307.375" y="-1245" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt; end_modbus_cmd()</text>
<text text-anchor="start" x="357.118" y="-1233" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
<text text-anchor="start" x="342.946" y="-1221" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">inc_counter()</text>
<text text-anchor="start" x="341.276" y="-1209" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">dec_counter()</text>
</g>
<!-- A14&#45;&gt;A6 -->
<g id="edge14" class="edge">
<title>A14&#45;&gt;A6</title>
<path fill="none" stroke="#000000" d="M409.7752,-1195.7758C412.5746,-1182.5547 415.3943,-1169.2373 418.1831,-1156.0662"/>
<polygon fill="#000000" stroke="#000000" points="420.3088,-1146.0268 422.6397,-1156.7421 419.2731,-1150.9183 418.2373,-1155.8099 418.2373,-1155.8099 418.2373,-1155.8099 419.2731,-1150.9183 413.8349,-1154.8777 420.3088,-1146.0268 420.3088,-1146.0268"/>
<text text-anchor="middle" x="405.2609" y="-1173.292" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">use</text>
</g>
<!-- A14&#45;&gt;A11 -->
<g id="edge17" class="edge">
<title>A14&#45;&gt;A11</title>
<path fill="none" stroke="#000000" d="M341.1152,-1185.9405C320.5475,-1057.7747 293.7865,-891.0162 274.7116,-772.1524"/>
<polygon fill="none" stroke="#000000" points="337.6695,-1186.5583 342.7099,-1195.8774 344.5811,-1185.4491 337.6695,-1186.5583"/>
</g>
<!-- A15&#45;&gt;A14 -->
<g id="edge16" class="edge">
<title>A15&#45;&gt;A14</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M329.7896,-1673.8004C334.3532,-1641.3079 340.3126,-1598.8764 346.2965,-1556.2713"/>
<polygon fill="none" stroke="#000000" points="326.2837,-1673.5986 328.3587,-1683.9883 333.2156,-1674.5723 326.2837,-1673.5986"/>
</g>
<!-- A16 -->
<g id="node17" class="node">
<title>A16</title>
<polygon fill="none" stroke="#000000" points="385.6155,-1826 385.6155,-1858 460.6155,-1858 460.6155,-1826 385.6155,-1826"/>
<text text-anchor="start" x="405.333" y="-1839" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Modbus</text>
<polygon fill="none" stroke="#000000" points="385.6155,-1674 385.6155,-1826 460.6155,-1826 460.6155,-1674 385.6155,-1674"/>
<text text-anchor="start" x="414.777" y="-1807" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">que</text>
<text text-anchor="start" x="395.6055" y="-1783" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">snd_handler</text>
<text text-anchor="start" x="396.7205" y="-1771" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rsp_handler</text>
<text text-anchor="start" x="406.724" y="-1759" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">timeout</text>
<text text-anchor="start" x="397.005" y="-1747" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">max_retires</text>
<text text-anchor="start" x="405.0575" y="-1735" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">last_xxx</text>
<text text-anchor="start" x="417.007" y="-1723" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">err</text>
<text text-anchor="start" x="403.669" y="-1711" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">retry_cnt</text>
<text text-anchor="start" x="401.9945" y="-1699" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">req_pend</text>
<text text-anchor="start" x="416.452" y="-1687" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tim</text>
<polygon fill="none" stroke="#000000" points="385.6155,-1606 385.6155,-1674 460.6155,-1674 460.6155,-1606 385.6155,-1606"/>
<text text-anchor="start" x="397.0055" y="-1655" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">build_msg()</text>
<text text-anchor="start" x="400.3395" y="-1643" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">recv_req()</text>
<text text-anchor="start" x="397.8395" y="-1631" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">recv_resp()</text>
<text text-anchor="start" x="408.118" y="-1619" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A16&#45;&gt;A14 -->
<g id="edge18" class="edge">
<title>A16&#45;&gt;A14</title>
<path fill="none" stroke="#000000" d="M403.1382,-1596.041C401.2623,-1582.9463 399.3403,-1569.5297 397.4159,-1556.0971"/>
<polygon fill="#000000" stroke="#000000" points="404.563,-1605.9867 398.6903,-1596.726 403.8539,-1601.0373 403.1448,-1596.0878 403.1448,-1596.0878 403.1448,-1596.0878 403.8539,-1601.0373 407.5994,-1595.4496 404.563,-1605.9867 404.563,-1605.9867"/>
<text text-anchor="middle" x="408.3534" y="-1569.8414" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">has</text>
<text text-anchor="middle" x="393.6256" y="-1586.2424" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">0..1</text>
</g>
</g>
</svg>

43
app/docu/proxy_2.yuml Normal file

@@ -0,0 +1,43 @@
// {type:class}
// {direction:topDown}
// {generate:true}
[note: Example of instantiation for a GEN3 inverter!{bg:cornsilk}]
[<<AbstractIterMeta>>||__iter__()]
[InverterBase|addr;remote:StreamPtr;local:StreamPtr|create_remote();;close()]
[InverterBase]^[InverterG3]
[InverterBase]++->[local:StreamPtr]
[InverterBase]++->[remote:StreamPtr]
[<<AsyncIfc>>||set_node_id();get_conn_no();;tx_add();tx_flush();tx_get();tx_peek();tx_log();tx_clear();tx_len();;fwd_add();fwd_log();rx_get();rx_peek();rx_log();rx_clear();rx_len();rx_set_cb();;prot_set_timeout_cb()]
[AsyncIfcImpl|fwd_fifo:ByteFifo;tx_fifo:ByteFifo;rx_fifo:ByteFifo;conn_no:Count;node_id;timeout_cb]
[AsyncStream|reader;writer;addr;r_addr;l_addr|;<async>loop;disc();close();healthy();;__async_read();__async_write();__async_forward()]
[AsyncStreamServer|create_remote|<async>server_loop();<async>_async_forward();<async>publish_outstanding_mqtt();close()]
[AsyncStreamClient||<async>client_loop();<async>_async_forward())]
[<<AsyncIfc>>]^-.-[AsyncIfcImpl]
[AsyncIfcImpl]^[AsyncStream]
[AsyncStream]^[AsyncStreamServer]
[AsyncStream]^[AsyncStreamClient]
[Talent|conn_no;addr;;await_conn_resp_cnt;id_str;contact_name;contact_mail;db:InfosG3;mb:Modbus;switch|msg_contact_info();msg_ota_update();msg_get_time();msg_collector_data();msg_inverter_data();msg_unknown();;healthy();close()]
[Talent]<-++[local:StreamPtr]
[local:StreamPtr]++->[AsyncStreamServer]
[Talent]<-0..1[remote:StreamPtr]
[remote:StreamPtr]0..1->[AsyncStreamClient]
[Infos|stat;new_stat_data;info_dev|static_init();dev_value();inc_counter();dec_counter();ha_proxy_conf;ha_conf;ha_remove;update_db;set_db_def_value;get_db_value;ignore_this_device]
[Infos]^[InfosG3||ha_confs();parse()]
[Talent]->[InfosG3]
[Message|server_side:bool;mb:Modbus;ifc:AsyncIfc;node_id;header_valid:bool;header_len;data_len;unique_id;sug_area:str;new_data:dict;state:State;shutdown_started:bool;modbus_elms;mb_timer:Timer;mb_timeout;mb_first_timeout;modbus_polling:bool|_set_mqtt_timestamp();_timeout();_send_modbus_cmd();<async> end_modbus_cmd();close();inc_counter();dec_counter()]
[Message]use->[<<AsyncIfc>>]
[<<ProtocolIfc>>|_registry|close()]
[<<AbstractIterMeta>>]^-.-[<<ProtocolIfc>>]
[<<ProtocolIfc>>]^-.-[Message]
[Message]^[Talent]
[Modbus|que;;snd_handler;rsp_handler;timeout;max_retires;last_xxx;err;retry_cnt;req_pend;tim|build_msg();recv_req();recv_resp();close()]
[Modbus]<0..1-has[Message]

382
app/docu/proxy_3.svg Normal file

@@ -0,0 +1,382 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<!-- Generated by graphviz version 2.40.1 (20161225.0304)
-->
<!-- Title: G Pages: 1 -->
<svg width="597pt" height="1940pt"
viewBox="0.00 0.00 597.00 1940.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 1936)">
<title>G</title>
<polygon fill="#ffffff" stroke="transparent" points="-4,4 -4,-1936 593,-1936 593,4 -4,4"/>
<!-- A0 -->
<g id="node1" class="node">
<title>A0</title>
<polygon fill="#fff8dc" stroke="#000000" points="287.2332,-1912 172.7668,-1912 172.7668,-1868 293.2332,-1868 293.2332,-1906 287.2332,-1912"/>
<polyline fill="none" stroke="#000000" points="287.2332,-1912 287.2332,-1906 "/>
<polyline fill="none" stroke="#000000" points="293.2332,-1906 287.2332,-1906 "/>
<text text-anchor="middle" x="233" y="-1899" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Example of</text>
<text text-anchor="middle" x="233" y="-1887" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">instantiation for a</text>
<text text-anchor="middle" x="233" y="-1875" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">GEN3PLUS inverter!</text>
</g>
<!-- A1 -->
<g id="node2" class="node">
<title>A1</title>
<polygon fill="none" stroke="#000000" points="311,-1900 311,-1932 427,-1932 427,-1900 311,-1900"/>
<text text-anchor="start" x="320.649" y="-1913" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;AbstractIterMeta&gt;&gt;</text>
<polygon fill="none" stroke="#000000" points="311,-1880 311,-1900 427,-1900 427,-1880 311,-1880"/>
<polygon fill="none" stroke="#000000" points="311,-1848 311,-1880 427,-1880 427,-1848 311,-1848"/>
<text text-anchor="start" x="347.61" y="-1861" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__iter__()</text>
</g>
<!-- A15 -->
<g id="node16" class="node">
<title>A15</title>
<polygon fill="none" stroke="#000000" points="324,-1688 324,-1720 414,-1720 414,-1688 324,-1688"/>
<text text-anchor="start" x="333.7065" y="-1701" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;ProtocolIfc&gt;&gt;</text>
<polygon fill="none" stroke="#000000" points="324,-1656 324,-1688 414,-1688 414,-1656 324,-1656"/>
<text text-anchor="start" x="349.8335" y="-1669" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_registry</text>
<polygon fill="none" stroke="#000000" points="324,-1624 324,-1656 414,-1656 414,-1624 324,-1624"/>
<text text-anchor="start" x="354.0025" y="-1637" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A1&#45;&gt;A15 -->
<g id="edge15" class="edge">
<title>A1&#45;&gt;A15</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M369,-1837.756C369,-1802.0883 369,-1755.1755 369,-1720.3644"/>
<polygon fill="none" stroke="#000000" points="365.5001,-1837.9674 369,-1847.9674 372.5001,-1837.9674 365.5001,-1837.9674"/>
</g>
<!-- A2 -->
<g id="node3" class="node">
<title>A2</title>
<polygon fill="none" stroke="#000000" points="128,-632 128,-664 226,-664 226,-632 128,-632"/>
<text text-anchor="start" x="148.66" y="-645" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InverterBase</text>
<polygon fill="none" stroke="#000000" points="128,-576 128,-632 226,-632 226,-576 128,-576"/>
<text text-anchor="start" x="166.997" y="-613" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">addr</text>
<text text-anchor="start" x="137.553" y="-601" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">remote:StreamPtr</text>
<text text-anchor="start" x="142.832" y="-589" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">local:StreamPtr</text>
<polygon fill="none" stroke="#000000" points="128,-520 128,-576 226,-576 226,-520 128,-520"/>
<text text-anchor="start" x="141.442" y="-557" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">create_remote()</text>
<text text-anchor="start" x="162.0025" y="-533" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A3 -->
<g id="node4" class="node">
<title>A3</title>
<polygon fill="none" stroke="#000000" points="0,-302 0,-334 116,-334 116,-302 0,-302"/>
<text text-anchor="start" x="31.05" y="-315" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InverterG3P</text>
<polygon fill="none" stroke="#000000" points="0,-270 0,-302 116,-302 116,-270 0,-270"/>
<text text-anchor="start" x="9.6585" y="-283" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">forward_at_cmd_resp</text>
</g>
<!-- A2&#45;&gt;A3 -->
<g id="edge1" class="edge">
<title>A2&#45;&gt;A3</title>
<path fill="none" stroke="#000000" d="M143.4931,-510.3444C119.4997,-451.8732 88.4875,-376.2972 71.177,-334.1121"/>
<polygon fill="none" stroke="#000000" points="140.3971,-512.0193 147.4314,-519.942 146.873,-509.3619 140.3971,-512.0193"/>
</g>
<!-- A4 -->
<g id="node5" class="node">
<title>A4</title>
<polygon fill="none" stroke="#000000" points="230.3366,-320 133.6634,-320 133.6634,-284 230.3366,-284 230.3366,-320"/>
<text text-anchor="middle" x="182" y="-299" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">local:StreamPtr</text>
</g>
<!-- A2&#45;&gt;A4 -->
<g id="edge2" class="edge">
<title>A2&#45;&gt;A4</title>
<path fill="none" stroke="#000000" d="M178.4553,-507.5905C179.4883,-447.68 180.8113,-370.9429 181.5127,-330.266"/>
<polygon fill="#000000" stroke="#000000" points="178.4493,-507.9438 182.3453,-514.0119 178.2424,-519.942 174.3465,-513.8739 178.4493,-507.9438"/>
<polygon fill="#000000" stroke="#000000" points="181.6851,-320.2627 186.012,-330.3388 181.5989,-325.2619 181.5126,-330.2612 181.5126,-330.2612 181.5126,-330.2612 181.5989,-325.2619 177.0133,-330.1836 181.6851,-320.2627 181.6851,-320.2627"/>
</g>
<!-- A5 -->
<g id="node6" class="node">
<title>A5</title>
<polygon fill="none" stroke="#000000" points="355.3941,-320 248.6059,-320 248.6059,-284 355.3941,-284 355.3941,-320"/>
<text text-anchor="middle" x="302" y="-299" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">remote:StreamPtr</text>
</g>
<!-- A2&#45;&gt;A5 -->
<g id="edge3" class="edge">
<title>A2&#45;&gt;A5</title>
<path fill="none" stroke="#000000" d="M212.8581,-508.8093C238.9076,-448.3743 272.5536,-370.3156 290.1233,-329.5539"/>
<polygon fill="#000000" stroke="#000000" points="212.8095,-508.9221 214.1078,-516.0154 208.0595,-519.942 206.7612,-512.8487 212.8095,-508.9221"/>
<polygon fill="#000000" stroke="#000000" points="294.1282,-320.2627 294.3023,-331.2272 292.149,-324.8543 290.1698,-329.4459 290.1698,-329.4459 290.1698,-329.4459 292.149,-324.8543 286.0373,-327.6647 294.1282,-320.2627 294.1282,-320.2627"/>
</g>
<!-- A9 -->
<g id="node10" class="node">
<title>A9</title>
<polygon fill="none" stroke="#000000" points="183,-100 183,-132 361,-132 361,-100 183,-100"/>
<text text-anchor="start" x="227.5515" y="-113" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">AsyncStreamServer</text>
<polygon fill="none" stroke="#000000" points="183,-68 183,-100 361,-100 361,-68 183,-68"/>
<text text-anchor="start" x="239.771" y="-81" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">create_remote</text>
<polygon fill="none" stroke="#000000" points="183,0 183,-68 361,-68 361,0 183,0"/>
<text text-anchor="start" x="223.6575" y="-49" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;server_loop()</text>
<text text-anchor="start" x="214.4885" y="-37" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;_async_forward()</text>
<text text-anchor="start" x="192.809" y="-25" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;publish_outstanding_mqtt()</text>
<text text-anchor="start" x="257.0025" y="-13" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A4&#45;&gt;A9 -->
<g id="edge9" class="edge">
<title>A4&#45;&gt;A9</title>
<path fill="none" stroke="#000000" d="M193.2169,-272.5869C205.6662,-239.9419 226.2449,-185.9801 243.2484,-141.3932"/>
<polygon fill="#000000" stroke="#000000" points="193.1887,-272.661 194.7881,-279.6925 188.9127,-283.8733 187.3132,-276.8418 193.1887,-272.661"/>
<polygon fill="#000000" stroke="#000000" points="246.8183,-132.0321 247.4596,-142.9792 245.0366,-136.7039 243.2549,-141.3757 243.2549,-141.3757 243.2549,-141.3757 245.0366,-136.7039 239.0503,-139.7723 246.8183,-132.0321 246.8183,-132.0321"/>
</g>
<!-- A10 -->
<g id="node11" class="node">
<title>A10</title>
<polygon fill="none" stroke="#000000" points="424,-82 424,-114 562,-114 562,-82 424,-82"/>
<text text-anchor="start" x="450.497" y="-95" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">AsyncStreamClient</text>
<polygon fill="none" stroke="#000000" points="424,-62 424,-82 562,-82 562,-62 424,-62"/>
<polygon fill="none" stroke="#000000" points="424,-18 424,-62 562,-62 562,-18 424,-18"/>
<text text-anchor="start" x="446.878" y="-43" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;client_loop()</text>
<text text-anchor="start" x="433.824" y="-31" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;_async_forward())</text>
</g>
<!-- A5&#45;&gt;A10 -->
<g id="edge11" class="edge">
<title>A5&#45;&gt;A10</title>
<path fill="none" stroke="#000000" d="M308.989,-283.7374C318.9281,-259.1334 338.7724,-214.6635 364,-182 380.8963,-160.1235 402.2571,-139.0239 422.71,-120.9559"/>
<polygon fill="#000000" stroke="#000000" points="430.4931,-114.1842 425.9026,-124.143 426.721,-117.4662 422.9488,-120.7481 422.9488,-120.7481 422.9488,-120.7481 426.721,-117.4662 419.9951,-117.3532 430.4931,-114.1842 430.4931,-114.1842"/>
<text text-anchor="middle" x="308.0806" y="-260.758" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">0..1</text>
</g>
<!-- A6 -->
<g id="node7" class="node">
<title>A6</title>
<polygon fill="none" stroke="#000000" points="443,-1054 443,-1086 560,-1086 560,-1054 443,-1054"/>
<text text-anchor="start" x="470.929" y="-1067" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;&lt;AsyncIfc&gt;&gt;</text>
<polygon fill="none" stroke="#000000" points="443,-1034 443,-1054 560,-1054 560,-1034 443,-1034"/>
<polygon fill="none" stroke="#000000" points="443,-762 443,-1034 560,-1034 560,-762 443,-762"/>
<text text-anchor="start" x="470.936" y="-1015" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">set_node_id()</text>
<text text-anchor="start" x="469.266" y="-1003" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">get_conn_no()</text>
<text text-anchor="start" x="483.1635" y="-979" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_add()</text>
<text text-anchor="start" x="480.944" y="-967" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_flush()</text>
<text text-anchor="start" x="484.5535" y="-955" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_get()</text>
<text text-anchor="start" x="480.6635" y="-943" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_peek()</text>
<text text-anchor="start" x="484.8335" y="-931" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_log()</text>
<text text-anchor="start" x="480.669" y="-919" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_clear()</text>
<text text-anchor="start" x="484.8335" y="-907" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_len()</text>
<text text-anchor="start" x="479.2745" y="-883" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">fwd_add()</text>
<text text-anchor="start" x="480.9445" y="-871" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">fwd_log()</text>
<text text-anchor="start" x="484.2785" y="-859" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_get()</text>
<text text-anchor="start" x="480.3885" y="-847" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_peek()</text>
<text text-anchor="start" x="484.5585" y="-835" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_log()</text>
<text text-anchor="start" x="480.394" y="-823" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_clear()</text>
<text text-anchor="start" x="484.5585" y="-811" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_len()</text>
<text text-anchor="start" x="476.499" y="-799" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_set_cb()</text>
<text text-anchor="start" x="452.8795" y="-775" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">prot_set_timeout_cb()</text>
</g>
<!-- A7 -->
<g id="node8" class="node">
<title>A7</title>
<polygon fill="none" stroke="#000000" points="494,-622 494,-654 587,-654 587,-622 494,-622"/>
<text text-anchor="start" x="512.164" y="-635" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">AsyncIfcImpl</text>
<polygon fill="none" stroke="#000000" points="494,-530 494,-622 587,-622 587,-530 494,-530"/>
<text text-anchor="start" x="503.548" y="-603" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">fwd_fifo:ByteFifo</text>
<text text-anchor="start" x="507.437" y="-591" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tx_fifo:ByteFifo</text>
<text text-anchor="start" x="507.162" y="-579" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rx_fifo:ByteFifo</text>
<text text-anchor="start" x="506.596" y="-567" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">conn_no:Count</text>
<text text-anchor="start" x="522.7135" y="-555" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">node_id</text>
<text text-anchor="start" x="516.0495" y="-543" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">timeout_cb</text>
</g>
<!-- A6&#45;&gt;A7 -->
<g id="edge4" class="edge">
<title>A6&#45;&gt;A7</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M521.2574,-751.5524C525.3561,-716.6607 529.4102,-682.1494 532.6937,-654.1971"/>
<polygon fill="none" stroke="#000000" points="517.7337,-751.5502 520.043,-761.8903 524.6859,-752.367 517.7337,-751.5502"/>
</g>
<!-- A8 -->
<g id="node9" class="node">
<title>A8</title>
<polygon fill="none" stroke="#000000" points="487,-390 487,-422 589,-422 589,-390 487,-390"/>
<text text-anchor="start" x="508.274" y="-403" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">AsyncStream</text>
<polygon fill="none" stroke="#000000" points="487,-310 487,-390 589,-390 589,-310 487,-310"/>
<text text-anchor="start" x="523.553" y="-371" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">reader</text>
<text text-anchor="start" x="525.783" y="-359" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">writer</text>
<text text-anchor="start" x="527.997" y="-347" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">addr</text>
<text text-anchor="start" x="523.553" y="-335" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">r_addr</text>
<text text-anchor="start" x="524.108" y="-323" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">l_addr</text>
<polygon fill="none" stroke="#000000" points="487,-182 487,-310 589,-310 589,-182 487,-182"/>
<text text-anchor="start" x="509.654" y="-279" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;loop</text>
<text text-anchor="start" x="525.782" y="-267" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">disc()</text>
<text text-anchor="start" x="523.0025" y="-255" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
<text text-anchor="start" x="518.554" y="-243" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">healthy()</text>
<text text-anchor="start" x="503.2705" y="-219" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__async_read()</text>
<text text-anchor="start" x="502.721" y="-207" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__async_write()</text>
<text text-anchor="start" x="496.607" y="-195" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__async_forward()</text>
</g>
<!-- A7&#45;&gt;A8 -->
<g id="edge5" class="edge">
<title>A7&#45;&gt;A8</title>
<path fill="none" stroke="#000000" d="M539.5006,-519.5861C539.2971,-490.0737 539.0562,-455.1552 538.8278,-422.0295"/>
<polygon fill="none" stroke="#000000" points="536.002,-519.8121 539.5709,-529.7877 543.0018,-519.7638 536.002,-519.8121"/>
</g>
<!-- A8&#45;&gt;A9 -->
<g id="edge6" class="edge">
<title>A8&#45;&gt;A9</title>
<path fill="none" stroke="#000000" d="M480.5271,-185.826C479.3695,-184.5245 478.1938,-183.2483 477,-182 444.7093,-148.2346 400.4099,-121.5033 361.252,-102.2528"/>
<polygon fill="none" stroke="#000000" points="477.867,-188.1011 486.9604,-193.5382 483.2424,-183.6171 477.867,-188.1011"/>
</g>
<!-- A8&#45;&gt;A10 -->
<g id="edge7" class="edge">
<title>A8&#45;&gt;A10</title>
<path fill="none" stroke="#000000" d="M513.2286,-172.088C509.2911,-151.438 505.4474,-131.2796 502.1863,-114.1772"/>
<polygon fill="none" stroke="#000000" points="509.7933,-172.7584 515.1045,-181.9259 516.6695,-171.4473 509.7933,-172.7584"/>
</g>
<!-- A11 -->
<g id="node12" class="node">
<title>A11</title>
<polygon fill="none" stroke="#000000" points="244,-668 244,-700 354,-700 354,-668 244,-668"/>
<text text-anchor="start" x="271.495" y="-681" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">SolarmanV5</text>
<polygon fill="none" stroke="#000000" points="244,-552 244,-668 354,-668 354,-552 244,-552"/>
<text text-anchor="start" x="279.823" y="-649" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">conn_no</text>
<text text-anchor="start" x="288.997" y="-637" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">addr</text>
<text text-anchor="start" x="253.994" y="-625" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">inverter:InverterG3P</text>
<text text-anchor="start" x="283.998" y="-613" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">control</text>
<text text-anchor="start" x="287.0575" y="-601" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">serial</text>
<text text-anchor="start" x="292.056" y="-589" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">snr</text>
<text text-anchor="start" x="271.21" y="-577" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">db:InfosG3P</text>
<text text-anchor="start" x="285.112" y="-565" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">switch</text>
<polygon fill="none" stroke="#000000" points="244,-484 244,-552 354,-552 354,-484 244,-484"/>
<text text-anchor="start" x="263.4405" y="-533" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_unknown()</text>
<text text-anchor="start" x="279.554" y="-509" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">healthy()</text>
<text text-anchor="start" x="284.0025" y="-497" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A11&#45;&gt;A4 -->
<g id="edge8" class="edge">
<title>A11&#45;&gt;A4</title>
<path fill="none" stroke="#000000" d="M251.4932,-474.2481C230.4181,-422.0107 207.3684,-364.879 193.8227,-331.3042"/>
<polygon fill="#000000" stroke="#000000" points="255.2671,-483.6023 247.3524,-476.0123 253.3964,-478.9655 251.5256,-474.3286 251.5256,-474.3286 251.5256,-474.3286 253.3964,-478.9655 255.6988,-472.6449 255.2671,-483.6023 255.2671,-483.6023"/>
<polygon fill="#000000" stroke="#000000" points="193.7733,-331.1815 187.8189,-327.1139 189.2835,-320.053 195.2379,-324.1207 193.7733,-331.1815"/>
</g>
<!-- A11&#45;&gt;A5 -->
<g id="edge10" class="edge">
<title>A11&#45;&gt;A5</title>
<path fill="none" stroke="#000000" d="M300.2256,-473.5237C300.8304,-415.0627 301.4968,-350.6465 301.8132,-320.053"/>
<polygon fill="#000000" stroke="#000000" points="300.1214,-483.6023 295.7251,-473.5562 300.1731,-478.6026 300.2249,-473.6028 300.2249,-473.6028 300.2249,-473.6028 300.1731,-478.6026 304.7247,-473.6494 300.1214,-483.6023 300.1214,-483.6023"/>
<text text-anchor="middle" x="310.0777" y="-335.2657" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">0..1</text>
</g>
<!-- A13 -->
<g id="node14" class="node">
<title>A13</title>
<polygon fill="none" stroke="#000000" points="374,-336 374,-368 469,-368 469,-336 374,-336"/>
<text text-anchor="start" x="400.6585" y="-349" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InfosG3P</text>
<polygon fill="none" stroke="#000000" points="374,-304 374,-336 469,-336 469,-304 374,-304"/>
<text text-anchor="start" x="383.7125" y="-317" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">client_mode:bool</text>
<polygon fill="none" stroke="#000000" points="374,-236 374,-304 469,-304 469,-236 374,-236"/>
<text text-anchor="start" x="397.884" y="-285" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_confs()</text>
<text text-anchor="start" x="405.668" y="-273" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">parse()</text>
<text text-anchor="start" x="409.282" y="-261" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">calc()</text>
<text text-anchor="start" x="407.6135" y="-249" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">build()</text>
</g>
<!-- A11&#45;&gt;A13 -->
<g id="edge13" class="edge">
<title>A11&#45;&gt;A13</title>
<path fill="none" stroke="#000000" d="M344.6018,-483.6023C359.4473,-448.3138 375.5948,-409.9305 389.2039,-377.5809"/>
<polygon fill="#000000" stroke="#000000" points="393.1562,-368.1861 393.4263,-379.1487 391.2173,-372.7949 389.2784,-377.4036 389.2784,-377.4036 389.2784,-377.4036 391.2173,-372.7949 385.1305,-375.6586 393.1562,-368.1861 393.1562,-368.1861"/>
</g>
<!-- A12 -->
<g id="node13" class="node">
<title>A12</title>
<polygon fill="none" stroke="#000000" points="373,-680 373,-712 476,-712 476,-680 373,-680"/>
<text text-anchor="start" x="413.662" y="-693" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Infos</text>
<polygon fill="none" stroke="#000000" points="373,-624 373,-680 476,-680 476,-624 373,-624"/>
<text text-anchor="start" x="416.4415" y="-661" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">stat</text>
<text text-anchor="start" x="391.986" y="-649" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">new_stat_data</text>
<text text-anchor="start" x="405.6035" y="-637" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">info_dev</text>
<polygon fill="none" stroke="#000000" points="373,-472 373,-624 476,-624 476,-472 373,-472"/>
<text text-anchor="start" x="400.3355" y="-605" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">static_init()</text>
<text text-anchor="start" x="398.3845" y="-593" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">dev_value()</text>
<text text-anchor="start" x="395.3305" y="-581" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">inc_counter()</text>
<text text-anchor="start" x="393.6605" y="-569" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">dec_counter()</text>
<text text-anchor="start" x="391.71" y="-557" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_proxy_conf</text>
<text text-anchor="start" x="406.713" y="-545" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_conf</text>
<text text-anchor="start" x="399.494" y="-533" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_remove</text>
<text text-anchor="start" x="400.8745" y="-521" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">update_db</text>
<text text-anchor="start" x="385.037" y="-509" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">set_db_def_value</text>
<text text-anchor="start" x="394.4855" y="-497" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">get_db_value</text>
<text text-anchor="start" x="382.8225" y="-485" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ignore_this_device</text>
</g>
<!-- A12&#45;&gt;A13 -->
<g id="edge12" class="edge">
<title>A12&#45;&gt;A13</title>
<path fill="none" stroke="#000000" d="M422.6543,-461.9134C422.3183,-429.4373 421.9719,-395.9527 421.6835,-368.0691"/>
<polygon fill="none" stroke="#000000" points="419.1548,-461.9893 422.7581,-471.9525 426.1544,-461.9168 419.1548,-461.9893"/>
</g>
<!-- A14 -->
<g id="node15" class="node">
<title>A14</title>
<polygon fill="none" stroke="#000000" points="345,-1464 345,-1496 494,-1496 494,-1464 345,-1464"/>
<text text-anchor="start" x="399.2175" y="-1477" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Message</text>
<polygon fill="none" stroke="#000000" points="345,-1240 345,-1464 494,-1464 494,-1240 345,-1240"/>
<text text-anchor="start" x="382.8265" y="-1445" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">server_side:bool</text>
<text text-anchor="start" x="393.384" y="-1433" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb:Modbus</text>
<text text-anchor="start" x="394.2185" y="-1421" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ifc:AsyncIfc</text>
<text text-anchor="start" x="401.7135" y="-1409" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">node_id</text>
<text text-anchor="start" x="380.043" y="-1397" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">header_valid:bool</text>
<text text-anchor="start" x="394.49" y="-1385" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">header_len</text>
<text text-anchor="start" x="400.324" y="-1373" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">data_len</text>
<text text-anchor="start" x="397.8245" y="-1361" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">unique_id</text>
<text text-anchor="start" x="391.715" y="-1349" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">sug_area:str</text>
<text text-anchor="start" x="388.656" y="-1337" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">new_data:dict</text>
<text text-anchor="start" x="395.6" y="-1325" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">state:State</text>
<text text-anchor="start" x="369.2045" y="-1313" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">shutdown_started:bool</text>
<text text-anchor="start" x="388.3845" y="-1301" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">modbus_elms</text>
<text text-anchor="start" x="384.507" y="-1289" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb_timer:Timer</text>
<text text-anchor="start" x="393.385" y="-1277" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb_timeout</text>
<text text-anchor="start" x="382.5525" y="-1265" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb_first_timeout</text>
<text text-anchor="start" x="373.654" y="-1253" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">modbus_polling:bool</text>
<polygon fill="none" stroke="#000000" points="345,-1136 345,-1240 494,-1240 494,-1136 345,-1136"/>
<text text-anchor="start" x="368.3845" y="-1221" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_set_mqtt_timestamp()</text>
<text text-anchor="start" x="397" y="-1209" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_timeout()</text>
<text text-anchor="start" x="369.7675" y="-1197" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_send_modbus_cmd()</text>
<text text-anchor="start" x="354.7595" y="-1185" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt; end_modbus_cmd()</text>
<text text-anchor="start" x="404.5025" y="-1173" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
<text text-anchor="start" x="390.3305" y="-1161" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">inc_counter()</text>
<text text-anchor="start" x="388.6605" y="-1149" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">dec_counter()</text>
</g>
<!-- A14&#45;&gt;A6 -->
<g id="edge14" class="edge">
<title>A14&#45;&gt;A6</title>
<path fill="none" stroke="#000000" d="M456.7,-1135.7758C459.4656,-1122.5547 462.2514,-1109.2373 465.0066,-1096.0662"/>
<polygon fill="#000000" stroke="#000000" points="467.1066,-1086.0268 469.4637,-1096.7363 466.0828,-1090.9209 465.059,-1095.8149 465.059,-1095.8149 465.059,-1095.8149 466.0828,-1090.9209 460.6544,-1094.8935 467.1066,-1086.0268 467.1066,-1086.0268"/>
<text text-anchor="middle" x="452.138" y="-1113.3031" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">use</text>
</g>
<!-- A14&#45;&gt;A11 -->
<g id="edge17" class="edge">
<title>A14&#45;&gt;A11</title>
<path fill="none" stroke="#000000" d="M387.4309,-1125.5329C364.9447,-989.8666 335.5291,-812.3923 316.9437,-700.2604"/>
<polygon fill="none" stroke="#000000" points="384.0176,-1126.3448 389.1057,-1135.6378 390.9234,-1125.2001 384.0176,-1126.3448"/>
</g>
<!-- A15&#45;&gt;A14 -->
<g id="edge16" class="edge">
<title>A15&#45;&gt;A14</title>
<path fill="none" stroke="#000000" stroke-dasharray="5,2" d="M377.1741,-1613.8004C381.7377,-1581.3079 387.6971,-1538.8764 393.681,-1496.2713"/>
<polygon fill="none" stroke="#000000" points="373.6682,-1613.5986 375.7432,-1623.9883 380.6001,-1614.5723 373.6682,-1613.5986"/>
</g>
<!-- A16 -->
<g id="node17" class="node">
<title>A16</title>
<polygon fill="none" stroke="#000000" points="433,-1766 433,-1798 508,-1798 508,-1766 433,-1766"/>
<text text-anchor="start" x="452.7175" y="-1779" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Modbus</text>
<polygon fill="none" stroke="#000000" points="433,-1614 433,-1766 508,-1766 508,-1614 433,-1614"/>
<text text-anchor="start" x="462.1615" y="-1747" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">que</text>
<text text-anchor="start" x="442.99" y="-1723" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">snd_handler</text>
<text text-anchor="start" x="444.105" y="-1711" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rsp_handler</text>
<text text-anchor="start" x="454.1085" y="-1699" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">timeout</text>
<text text-anchor="start" x="444.3895" y="-1687" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">max_retires</text>
<text text-anchor="start" x="452.442" y="-1675" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">last_xxx</text>
<text text-anchor="start" x="464.3915" y="-1663" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">err</text>
<text text-anchor="start" x="451.0535" y="-1651" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">retry_cnt</text>
<text text-anchor="start" x="449.379" y="-1639" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">req_pend</text>
<text text-anchor="start" x="463.8365" y="-1627" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tim</text>
<polygon fill="none" stroke="#000000" points="433,-1546 433,-1614 508,-1614 508,-1546 433,-1546"/>
<text text-anchor="start" x="444.39" y="-1595" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">build_msg()</text>
<text text-anchor="start" x="447.724" y="-1583" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">recv_req()</text>
<text text-anchor="start" x="445.224" y="-1571" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">recv_resp()</text>
<text text-anchor="start" x="455.5025" y="-1559" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A16&#45;&gt;A14 -->
<g id="edge18" class="edge">
<title>A16&#45;&gt;A14</title>
<path fill="none" stroke="#000000" d="M450.5227,-1536.041C448.6468,-1522.9463 446.7248,-1509.5297 444.8004,-1496.0971"/>
<polygon fill="#000000" stroke="#000000" points="451.9475,-1545.9867 446.0748,-1536.726 451.2384,-1541.0373 450.5293,-1536.0878 450.5293,-1536.0878 450.5293,-1536.0878 451.2384,-1541.0373 454.9839,-1535.4496 451.9475,-1545.9867 451.9475,-1545.9867"/>
<text text-anchor="middle" x="455.7379" y="-1509.8414" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">has</text>
<text text-anchor="middle" x="441.0101" y="-1526.2424" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">0..1</text>
</g>
</g>
</svg>

43
app/docu/proxy_3.yuml Normal file

@@ -0,0 +1,43 @@
// {type:class}
// {direction:topDown}
// {generate:true}
[note: Example of instantiation for a GEN3PLUS inverter!{bg:cornsilk}]
[<<AbstractIterMeta>>||__iter__()]
[InverterBase|addr;remote:StreamPtr;local:StreamPtr|create_remote();;close()]
[InverterBase]^[InverterG3P|forward_at_cmd_resp;]
[InverterBase]++->[local:StreamPtr]
[InverterBase]++->[remote:StreamPtr]
[<<AsyncIfc>>||set_node_id();get_conn_no();;tx_add();tx_flush();tx_get();tx_peek();tx_log();tx_clear();tx_len();;fwd_add();fwd_log();rx_get();rx_peek();rx_log();rx_clear();rx_len();rx_set_cb();;prot_set_timeout_cb()]
[AsyncIfcImpl|fwd_fifo:ByteFifo;tx_fifo:ByteFifo;rx_fifo:ByteFifo;conn_no:Count;node_id;timeout_cb]
[AsyncStream|reader;writer;addr;r_addr;l_addr|;<async>loop;disc();close();healthy();;__async_read();__async_write();__async_forward()]
[AsyncStreamServer|create_remote|<async>server_loop();<async>_async_forward();<async>publish_outstanding_mqtt();close()]
[AsyncStreamClient||<async>client_loop();<async>_async_forward())]
[<<AsyncIfc>>]^-.-[AsyncIfcImpl]
[AsyncIfcImpl]^[AsyncStream]
[AsyncStream]^[AsyncStreamServer]
[AsyncStream]^[AsyncStreamClient]
[SolarmanV5|conn_no;addr;inverter:InverterG3P;control;serial;snr;db:InfosG3P;switch|msg_unknown();;healthy();close()]
[SolarmanV5]<-++[local:StreamPtr]
[local:StreamPtr]++->[AsyncStreamServer]
[SolarmanV5]<-0..1[remote:StreamPtr]
[remote:StreamPtr]0..1->[AsyncStreamClient]
[Infos|stat;new_stat_data;info_dev|static_init();dev_value();inc_counter();dec_counter();ha_proxy_conf;ha_conf;ha_remove;update_db;set_db_def_value;get_db_value;ignore_this_device]
[Infos]^[InfosG3P|client_mode:bool|ha_confs();parse();calc();build()]
[SolarmanV5]->[InfosG3P]
[Message|server_side:bool;mb:Modbus;ifc:AsyncIfc;node_id;header_valid:bool;header_len;data_len;unique_id;sug_area:str;new_data:dict;state:State;shutdown_started:bool;modbus_elms;mb_timer:Timer;mb_timeout;mb_first_timeout;modbus_polling:bool|_set_mqtt_timestamp();_timeout();_send_modbus_cmd();<async> end_modbus_cmd();close();inc_counter();dec_counter()]
[Message]use->[<<AsyncIfc>>]
[<<ProtocolIfc>>|_registry|close()]
[<<AbstractIterMeta>>]^-.-[<<ProtocolIfc>>]
[<<ProtocolIfc>>]^-.-[Message]
[Message]^[SolarmanV5]
[Modbus|que;;snd_handler;rsp_handler;timeout;max_retires;last_xxx;err;retry_cnt;req_pend;tim|build_msg();recv_req();recv_resp();close()]
[Modbus]<0..1-has[Message]

@@ -2,6 +2,8 @@
set -e
user="$(id -u)"
export VERSION=$(cat /proxy-version.txt)
echo "######################################################"
echo "# prepare: '$SERVICE_NAME' Version:$VERSION"
echo "# for running with UserID:$UID, GroupID:$GID"

@@ -1,384 +0,0 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<!-- Generated by graphviz version 2.40.1 (20161225.0304)
-->
<!-- Title: G Pages: 1 -->
<svg width="691pt" height="1312pt"
viewBox="0.00 0.00 691.35 1312.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 1308)">
<title>G</title>
<polygon fill="#ffffff" stroke="transparent" points="-4,4 -4,-1308 687.348,-1308 687.348,4 -4,4"/>
<!-- A0 -->
<g id="node1" class="node">
<title>A0</title>
<polygon fill="#fff8dc" stroke="#000000" points="108.5444,-1208 .1516,-1208 .1516,-1172 114.5444,-1172 114.5444,-1202 108.5444,-1208"/>
<polyline fill="none" stroke="#000000" points="108.5444,-1208 108.5444,-1202 "/>
<polyline fill="none" stroke="#000000" points="114.5444,-1202 108.5444,-1202 "/>
<text text-anchor="middle" x="57.348" y="-1193" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">You can stick notes</text>
<text text-anchor="middle" x="57.348" y="-1181" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">on diagrams too!</text>
</g>
<!-- A1 -->
<g id="node2" class="node">
<title>A1</title>
<polygon fill="none" stroke="#000000" points="657.0297,-906 587.6663,-906 587.6663,-870 657.0297,-870 657.0297,-906"/>
<text text-anchor="middle" x="622.348" y="-885" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Singleton</text>
</g>
<!-- A2 -->
<g id="node3" class="node">
<title>A2</title>
<polygon fill="none" stroke="#000000" points="561.348,-608 561.348,-640 683.348,-640 683.348,-608 561.348,-608"/>
<text text-anchor="start" x="612.625" y="-621" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Mqtt</text>
<polygon fill="none" stroke="#000000" points="561.348,-552 561.348,-608 683.348,-608 683.348,-552 561.348,-552"/>
<text text-anchor="start" x="579.8355" y="-589" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;static&gt;ha_restarts</text>
<text text-anchor="start" x="587.6145" y="-577" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;static&gt;__client</text>
<text text-anchor="start" x="571.2215" y="-565" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;static&gt;__cb_MqttIsUp</text>
<polygon fill="none" stroke="#000000" points="561.348,-508 561.348,-552 683.348,-552 683.348,-508 561.348,-508"/>
<text text-anchor="start" x="584.284" y="-533" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;publish()</text>
<text text-anchor="start" x="588.4525" y="-521" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;close()</text>
</g>
<!-- A1&#45;&gt;A2 -->
<g id="edge1" class="edge">
<title>A1&#45;&gt;A2</title>
<path fill="none" stroke="#000000" d="M622.348,-859.5395C622.348,-810.311 622.348,-708.0351 622.348,-640.2069"/>
<polygon fill="none" stroke="#000000" points="618.8481,-859.7608 622.348,-869.7608 625.8481,-859.7608 618.8481,-859.7608"/>
</g>
<!-- A11 -->
<g id="node12" class="node">
<title>A11</title>
<polygon fill="none" stroke="#000000" points="568.348,-324 568.348,-356 676.348,-356 676.348,-324 568.348,-324"/>
<text text-anchor="start" x="605.4015" y="-337" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Inverter</text>
<polygon fill="none" stroke="#000000" points="568.348,-232 568.348,-324 676.348,-324 676.348,-232 568.348,-232"/>
<text text-anchor="start" x="598.452" y="-305" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">cls.db_stat</text>
<text text-anchor="start" x="591.7885" y="-293" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">cls.entity_prfx</text>
<text text-anchor="start" x="582.6235" y="-281" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">cls.discovery_prfx</text>
<text text-anchor="start" x="582.0595" y="-269" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">cls.proxy_node_id</text>
<text text-anchor="start" x="578.1705" y="-257" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">cls.proxy_unique_id</text>
<text text-anchor="start" x="594.0135" y="-245" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">cls.mqtt:Mqtt</text>
<polygon fill="none" stroke="#000000" points="568.348,-212 568.348,-232 676.348,-232 676.348,-212 568.348,-212"/>
</g>
<!-- A2&#45;&gt;A11 -->
<g id="edge13" class="edge">
<title>A2&#45;&gt;A11</title>
<path fill="none" stroke="#000000" d="M622.348,-507.8316C622.348,-462.6124 622.348,-402.6972 622.348,-356.2361"/>
</g>
<!-- A3 -->
<g id="node4" class="node">
<title>A3</title>
<polygon fill="none" stroke="#000000" points="257.348,-366 257.348,-398 364.348,-398 364.348,-366 257.348,-366"/>
<text text-anchor="start" x="293.0655" y="-379" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Modbus</text>
<polygon fill="none" stroke="#000000" points="257.348,-226 257.348,-366 364.348,-366 364.348,-226 257.348,-226"/>
<text text-anchor="start" x="302.5095" y="-347" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">que</text>
<text text-anchor="start" x="283.338" y="-323" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">snd_handler</text>
<text text-anchor="start" x="284.453" y="-311" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">rsp_handler</text>
<text text-anchor="start" x="266.9565" y="-299" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">timeout:max_retires</text>
<text text-anchor="start" x="292.79" y="-287" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">last_xxx</text>
<text text-anchor="start" x="304.7395" y="-275" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">err</text>
<text text-anchor="start" x="291.4015" y="-263" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">retry_cnt</text>
<text text-anchor="start" x="289.727" y="-251" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">req_pend</text>
<text text-anchor="start" x="304.1845" y="-239" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">tim</text>
<polygon fill="none" stroke="#000000" points="257.348,-170 257.348,-226 364.348,-226 364.348,-170 257.348,-170"/>
<text text-anchor="start" x="284.738" y="-207" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">build_msg()</text>
<text text-anchor="start" x="288.072" y="-195" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">recv_req()</text>
<text text-anchor="start" x="285.572" y="-183" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">recv_resp()</text>
</g>
<!-- A4 -->
<g id="node5" class="node">
<title>A4</title>
<polygon fill="none" stroke="#000000" points="263.348,-1200 263.348,-1232 334.348,-1232 334.348,-1200 263.348,-1200"/>
<text text-anchor="start" x="273.293" y="-1213" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">IterRegistry</text>
<polygon fill="none" stroke="#000000" points="263.348,-1180 263.348,-1200 334.348,-1200 334.348,-1180 263.348,-1180"/>
<polygon fill="none" stroke="#000000" points="263.348,-1148 263.348,-1180 334.348,-1180 334.348,-1148 263.348,-1148"/>
<text text-anchor="start" x="280.787" y="-1161" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__iter__</text>
</g>
<!-- A5 -->
<g id="node6" class="node">
<title>A5</title>
<polygon fill="none" stroke="#000000" points="231.348,-994 231.348,-1026 365.348,-1026 365.348,-994 231.348,-994"/>
<text text-anchor="start" x="278.0655" y="-1007" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Message</text>
<polygon fill="none" stroke="#000000" points="231.348,-818 231.348,-994 365.348,-994 365.348,-818 231.348,-818"/>
<text text-anchor="start" x="261.6745" y="-975" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">server_side:bool</text>
<text text-anchor="start" x="258.891" y="-963" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">header_valid:bool</text>
<text text-anchor="start" x="251.662" y="-951" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">header_len:unsigned</text>
<text text-anchor="start" x="257.496" y="-939" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">data_len:unsigned</text>
<text text-anchor="start" x="276.6725" y="-927" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">unique_id</text>
<text text-anchor="start" x="280.5615" y="-915" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">node_id</text>
<text text-anchor="start" x="277.5065" y="-903" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">sug_area</text>
<text text-anchor="start" x="248.337" y="-891" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_recv_buffer:bytearray</text>
<text text-anchor="start" x="246.9425" y="-879" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_send_buffer:bytearray</text>
<text text-anchor="start" x="241.1145" y="-867" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_forward_buffer:bytearray</text>
<text text-anchor="start" x="280.5615" y="-855" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">db:Infos</text>
<text text-anchor="start" x="269.174" y="-843" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">new_data:list</text>
<text text-anchor="start" x="287.51" y="-831" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">state</text>
<polygon fill="none" stroke="#000000" points="231.348,-750 231.348,-818 365.348,-818 365.348,-750 231.348,-750"/>
<text text-anchor="start" x="248.0575" y="-799" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">_read():void&lt;abstract&gt;</text>
<text text-anchor="start" x="272.7925" y="-787" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close():void</text>
<text text-anchor="start" x="258.6205" y="-775" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">inc_counter():void</text>
<text text-anchor="start" x="256.9505" y="-763" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">dec_counter():void</text>
</g>
<!-- A4&#45;&gt;A5 -->
<g id="edge2" class="edge">
<title>A4&#45;&gt;A5</title>
<path fill="none" stroke="#000000" d="M298.348,-1137.5879C298.348,-1106.6429 298.348,-1065.8843 298.348,-1026.2983"/>
<polygon fill="none" stroke="#000000" points="294.8481,-1137.6902 298.348,-1147.6902 301.8481,-1137.6902 294.8481,-1137.6902"/>
</g>
<!-- A6 -->
<g id="node7" class="node">
<title>A6</title>
<polygon fill="none" stroke="#000000" points="370.348,-668 370.348,-700 484.348,-700 484.348,-668 370.348,-668"/>
<text text-anchor="start" x="413.456" y="-681" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Talent</text>
<polygon fill="none" stroke="#000000" points="370.348,-564 370.348,-668 484.348,-668 484.348,-564 370.348,-564"/>
<text text-anchor="start" x="380.111" y="-649" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">await_conn_resp_cnt</text>
<text text-anchor="start" x="415.1255" y="-637" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">id_str</text>
<text text-anchor="start" x="395.948" y="-625" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">contact_name</text>
<text text-anchor="start" x="399.288" y="-613" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">contact_mail</text>
<text text-anchor="start" x="402.8925" y="-601" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">db:InfosG3</text>
<text text-anchor="start" x="401.232" y="-589" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb:Modbus</text>
<text text-anchor="start" x="413.46" y="-577" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">switch</text>
<polygon fill="none" stroke="#000000" points="370.348,-448 370.348,-564 484.348,-564 484.348,-448 370.348,-448"/>
<text text-anchor="start" x="384.8405" y="-545" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_contact_info()</text>
<text text-anchor="start" x="386.7805" y="-533" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_ota_update()</text>
<text text-anchor="start" x="392.6245" y="-521" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_get_time()</text>
<text text-anchor="start" x="380.6765" y="-509" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_collector_data()</text>
<text text-anchor="start" x="382.6215" y="-497" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_inverter_data()</text>
<text text-anchor="start" x="391.7885" y="-485" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_unknown()</text>
<text text-anchor="start" x="412.3505" y="-461" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A5&#45;&gt;A6 -->
<g id="edge3" class="edge">
<title>A5&#45;&gt;A6</title>
<path fill="none" stroke="#000000" d="M358.9294,-740.5383C364.4479,-727.1056 370.0049,-713.5794 375.4378,-700.355"/>
<polygon fill="none" stroke="#000000" points="355.6797,-739.2382 355.117,-749.8181 362.1546,-741.8983 355.6797,-739.2382"/>
</g>
<!-- A7 -->
<g id="node8" class="node">
<title>A7</title>
<polygon fill="none" stroke="#000000" points="127.348,-632 127.348,-664 218.348,-664 218.348,-632 127.348,-632"/>
<text text-anchor="start" x="145.343" y="-645" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">SolarmanV5</text>
<polygon fill="none" stroke="#000000" points="127.348,-540 127.348,-632 218.348,-632 218.348,-540 127.348,-540"/>
<text text-anchor="start" x="157.846" y="-613" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">control</text>
<text text-anchor="start" x="160.9055" y="-601" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">serial</text>
<text text-anchor="start" x="165.904" y="-589" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">snr</text>
<text text-anchor="start" x="145.058" y="-577" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">db:InfosG3P</text>
<text text-anchor="start" x="146.732" y="-565" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">mb:Modbus</text>
<text text-anchor="start" x="158.96" y="-553" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">switch</text>
<polygon fill="none" stroke="#000000" points="127.348,-484 127.348,-540 218.348,-540 218.348,-484 127.348,-484"/>
<text text-anchor="start" x="137.2885" y="-521" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">msg_unknown()</text>
<text text-anchor="start" x="157.8505" y="-497" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A5&#45;&gt;A7 -->
<g id="edge4" class="edge">
<title>A5&#45;&gt;A7</title>
<path fill="none" stroke="#000000" d="M239.076,-740.2903C228.6761,-714.3733 218.1403,-688.1174 208.6075,-664.3611"/>
<polygon fill="none" stroke="#000000" points="235.9268,-741.8409 242.8992,-749.8181 242.4233,-739.2339 235.9268,-741.8409"/>
</g>
<!-- A6&#45;&gt;A3 -->
<g id="edge6" class="edge">
<title>A6&#45;&gt;A3</title>
<path fill="none" stroke="#000000" d="M376.3705,-447.6454C371.0187,-434.3805 365.5816,-420.9039 360.2423,-407.6696"/>
<polygon fill="#000000" stroke="#000000" points="356.3743,-398.0824 364.289,-405.6724 358.2451,-402.7192 360.1158,-407.3561 360.1158,-407.3561 360.1158,-407.3561 358.2451,-402.7192 355.9427,-409.0397 356.3743,-398.0824 356.3743,-398.0824"/>
<text text-anchor="middle" x="370.9946" y="-408.7296" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">1</text>
<text text-anchor="middle" x="361.7502" y="-430.9982" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">has</text>
</g>
<!-- A8 -->
<g id="node9" class="node">
<title>A8</title>
<polygon fill="none" stroke="#000000" points="382.348,-300 382.348,-332 532.348,-332 532.348,-300 382.348,-300"/>
<text text-anchor="start" x="425.3935" y="-313" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ConnectionG3</text>
<polygon fill="none" stroke="#000000" points="382.348,-268 382.348,-300 532.348,-300 532.348,-268 382.348,-268"/>
<text text-anchor="start" x="392.335" y="-281" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">remoteStream:ConnectionG3</text>
<polygon fill="none" stroke="#000000" points="382.348,-236 382.348,-268 532.348,-268 532.348,-236 382.348,-236"/>
<text text-anchor="start" x="442.3505" y="-249" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A6&#45;&gt;A8 -->
<g id="edge5" class="edge">
<title>A6&#45;&gt;A8</title>
<path fill="none" stroke="#000000" d="M441.464,-437.5454C445.3714,-399.7739 449.3591,-361.2265 452.3615,-332.203"/>
<polygon fill="none" stroke="#000000" points="437.9668,-437.3383 440.4192,-447.6454 444.9297,-438.0587 437.9668,-437.3383"/>
</g>
<!-- A7&#45;&gt;A3 -->
<g id="edge8" class="edge">
<title>A7&#45;&gt;A3</title>
<path fill="none" stroke="#000000" d="M210.935,-483.8952C216.3404,-471.7801 221.9084,-459.553 227.348,-448 235.1472,-431.4354 243.6196,-414.0579 252.0717,-397.0641"/>
<polygon fill="#000000" stroke="#000000" points="256.7608,-387.6701 256.3209,-398.6272 254.5277,-392.1437 252.2946,-396.6174 252.2946,-396.6174 252.2946,-396.6174 254.5277,-392.1437 248.2683,-394.6076 256.7608,-387.6701 256.7608,-387.6701"/>
<text text-anchor="middle" x="256.228" y="-404.663" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">1</text>
<text text-anchor="middle" x="210.6174" y="-460.8977" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">has</text>
</g>
<!-- A9 -->
<g id="node10" class="node">
<title>A9</title>
<polygon fill="none" stroke="#000000" points="64.348,-300 64.348,-332 220.348,-332 220.348,-300 64.348,-300"/>
<text text-anchor="start" x="107.059" y="-313" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ConnectionG3P</text>
<polygon fill="none" stroke="#000000" points="64.348,-268 64.348,-300 220.348,-300 220.348,-268 64.348,-268"/>
<text text-anchor="start" x="74.0005" y="-281" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">remoteStream:ConnectionG3P</text>
<polygon fill="none" stroke="#000000" points="64.348,-236 64.348,-268 220.348,-268 220.348,-236 64.348,-236"/>
<text text-anchor="start" x="127.3505" y="-249" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A7&#45;&gt;A9 -->
<g id="edge7" class="edge">
<title>A7&#45;&gt;A9</title>
<path fill="none" stroke="#000000" d="M161.9757,-473.7349C157.0236,-425.8645 151.3255,-370.7828 147.349,-332.3431"/>
<polygon fill="none" stroke="#000000" points="158.5098,-474.2451 163.0203,-483.8319 165.4726,-473.5248 158.5098,-474.2451"/>
</g>
<!-- A8&#45;&gt;A8 -->
<g id="edge15" class="edge">
<title>A8&#45;&gt;A8</title>
<path fill="none" stroke="#000000" d="M532.5164,-321.6908C543.1874,-315.5948 550.348,-303.0313 550.348,-284 550.348,-270.3213 546.6488,-259.9838 540.6058,-252.9875"/>
<polygon fill="#000000" stroke="#000000" points="532.5164,-246.3092 543.0929,-249.2054 536.3722,-249.4924 540.228,-252.6756 540.228,-252.6756 540.228,-252.6756 536.3722,-249.4924 537.3632,-256.1459 532.5164,-246.3092 532.5164,-246.3092"/>
<text text-anchor="middle" x="551.8757" y="-248.3308" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">0..1</text>
<text text-anchor="middle" x="543.0584" y="-301.6947" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">has</text>
</g>
<!-- A12 -->
<g id="node13" class="node">
<title>A12</title>
<polygon fill="none" stroke="#000000" points="478.348,-88 478.348,-120 600.348,-120 600.348,-88 478.348,-88"/>
<text text-anchor="start" x="515.7325" y="-101" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InverterG3</text>
<polygon fill="none" stroke="#000000" points="478.348,-56 478.348,-88 600.348,-88 600.348,-56 478.348,-56"/>
<text text-anchor="start" x="508.7835" y="-69" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__ha_restarts</text>
<polygon fill="none" stroke="#000000" points="478.348,0 478.348,-56 600.348,-56 600.348,0 478.348,0"/>
<text text-anchor="start" x="487.9515" y="-37" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">async_create_remote()</text>
<text text-anchor="start" x="524.3505" y="-13" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A8&#45;&gt;A12 -->
<g id="edge14" class="edge">
<title>A8&#45;&gt;A12</title>
<path fill="none" stroke="#000000" d="M478.5265,-226.1465C490.409,-193.6871 505.2165,-153.2373 517.2458,-120.3767"/>
<polygon fill="none" stroke="#000000" points="475.09,-225.3526 474.9391,-235.9464 481.6634,-227.759 475.09,-225.3526"/>
</g>
<!-- A9&#45;&gt;A9 -->
<g id="edge17" class="edge">
<title>A9&#45;&gt;A9</title>
<path fill="none" stroke="#000000" d="M220.6951,-321.2601C231.2923,-315.0138 238.348,-302.5938 238.348,-284 238.348,-270.6357 234.703,-260.4608 228.7179,-253.4753"/>
<polygon fill="#000000" stroke="#000000" points="220.6951,-246.7399 231.2473,-249.7232 224.5245,-249.9548 228.3539,-253.1697 228.3539,-253.1697 228.3539,-253.1697 224.5245,-249.9548 225.4605,-256.6162 220.6951,-246.7399 220.6951,-246.7399"/>
<text text-anchor="middle" x="240.0123" y="-248.9211" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">0..1</text>
<text text-anchor="middle" x="231.039" y="-301.1428" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">has</text>
</g>
<!-- A13 -->
<g id="node14" class="node">
<title>A13</title>
<polygon fill="none" stroke="#000000" points="251.348,-88 251.348,-120 373.348,-120 373.348,-88 251.348,-88"/>
<text text-anchor="start" x="285.398" y="-101" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InverterG3P</text>
<polygon fill="none" stroke="#000000" points="251.348,-56 251.348,-88 373.348,-88 373.348,-56 251.348,-56"/>
<text text-anchor="start" x="281.7835" y="-69" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__ha_restarts</text>
<polygon fill="none" stroke="#000000" points="251.348,0 251.348,-56 373.348,-56 373.348,0 251.348,0"/>
<text text-anchor="start" x="260.9515" y="-37" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">async_create_remote()</text>
<text text-anchor="start" x="297.3505" y="-13" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
</g>
<!-- A9&#45;&gt;A13 -->
<g id="edge16" class="edge">
<title>A9&#45;&gt;A13</title>
<path fill="none" stroke="#000000" d="M184.9859,-227.8183C209.8288,-195.0842 241.1576,-153.8039 266.5264,-120.3767"/>
<polygon fill="none" stroke="#000000" points="182.0747,-225.8647 178.8173,-235.9464 187.6507,-230.0965 182.0747,-225.8647"/>
</g>
<!-- A10 -->
<g id="node11" class="node">
<title>A10</title>
<polygon fill="none" stroke="#000000" points="236.348,-662 236.348,-694 352.348,-694 352.348,-662 236.348,-662"/>
<text text-anchor="start" x="264.622" y="-675" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">AsyncStream</text>
<polygon fill="none" stroke="#000000" points="236.348,-582 236.348,-662 352.348,-662 352.348,-582 236.348,-582"/>
<text text-anchor="start" x="279.901" y="-643" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">reader</text>
<text text-anchor="start" x="282.131" y="-631" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">writer</text>
<text text-anchor="start" x="284.345" y="-619" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">addr</text>
<text text-anchor="start" x="279.901" y="-607" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">r_addr</text>
<text text-anchor="start" x="280.456" y="-595" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">l_addr</text>
<polygon fill="none" stroke="#000000" points="236.348,-454 236.348,-582 352.348,-582 352.348,-454 236.348,-454"/>
<text text-anchor="start" x="246.0055" y="-563" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;server_loop()</text>
<text text-anchor="start" x="248.226" y="-551" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;client_loop()</text>
<text text-anchor="start" x="266.002" y="-539" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">&lt;async&gt;loop</text>
<text text-anchor="start" x="282.13" y="-527" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">disc()</text>
<text text-anchor="start" x="279.3505" y="-515" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">close()</text>
<text text-anchor="start" x="259.6185" y="-491" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__async_read()</text>
<text text-anchor="start" x="264.628" y="-479" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">async_write()</text>
<text text-anchor="start" x="252.955" y="-467" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">__async_forward()</text>
</g>
<!-- A10&#45;&gt;A8 -->
<g id="edge9" class="edge">
<title>A10&#45;&gt;A8</title>
<path fill="none" stroke="#000000" d="M357.3002,-455.2837C358.6569,-452.8318 360.0073,-450.4016 361.348,-448 383.2991,-408.6787 409.1348,-364.6637 428.4716,-332.1398"/>
<polygon fill="none" stroke="#000000" points="354.1241,-453.7956 352.3659,-464.2436 360.2557,-457.1724 354.1241,-453.7956"/>
</g>
<!-- A10&#45;&gt;A9 -->
<g id="edge10" class="edge">
<title>A10&#45;&gt;A9</title>
<path fill="none" stroke="#000000" d="M231.478,-454.0506C209.0706,-411.2997 185.0929,-365.5527 167.6574,-332.2876"/>
<polygon fill="none" stroke="#000000" points="228.4903,-455.8898 236.2327,-463.1221 234.6903,-452.6401 228.4903,-455.8898"/>
</g>
<!-- A11&#45;&gt;A12 -->
<g id="edge11" class="edge">
<title>A11&#45;&gt;A12</title>
<path fill="none" stroke="#000000" d="M592.1173,-202.4136C582.0634,-175.28 571.0546,-145.5697 561.7056,-120.3387"/>
<polygon fill="none" stroke="#000000" points="588.8729,-203.7311 595.6294,-211.892 595.4368,-201.2989 588.8729,-203.7311"/>
</g>
<!-- A11&#45;&gt;A13 -->
<g id="edge12" class="edge">
<title>A11&#45;&gt;A13</title>
<path fill="none" stroke="#000000" d="M586.2753,-202.8933C578.5712,-190.8884 569.6045,-179.4114 559.348,-170 530.8998,-143.8959 437.024,-105.9199 373.5518,-82.1078"/>
<polygon fill="none" stroke="#000000" points="583.4606,-204.9994 591.6624,-211.7061 589.4331,-201.3485 583.4606,-204.9994"/>
</g>
<!-- A14 -->
<g id="node15" class="node">
<title>A14</title>
<polygon fill="none" stroke="#000000" points="133.348,-1272 133.348,-1304 236.348,-1304 236.348,-1272 133.348,-1272"/>
<text text-anchor="start" x="174.01" y="-1285" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">Infos</text>
<polygon fill="none" stroke="#000000" points="133.348,-1216 133.348,-1272 236.348,-1272 236.348,-1216 133.348,-1216"/>
<text text-anchor="start" x="176.7895" y="-1253" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">stat</text>
<text text-anchor="start" x="152.334" y="-1241" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">new_stat_data</text>
<text text-anchor="start" x="165.9515" y="-1229" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">info_dev</text>
<polygon fill="none" stroke="#000000" points="133.348,-1076 133.348,-1216 236.348,-1216 236.348,-1076 133.348,-1076"/>
<text text-anchor="start" x="160.6835" y="-1197" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">static_init()</text>
<text text-anchor="start" x="158.7325" y="-1185" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">dev_value()</text>
<text text-anchor="start" x="155.6785" y="-1173" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">inc_counter()</text>
<text text-anchor="start" x="154.0085" y="-1161" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">dec_counter()</text>
<text text-anchor="start" x="152.058" y="-1149" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_proxy_conf</text>
<text text-anchor="start" x="167.061" y="-1137" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_conf</text>
<text text-anchor="start" x="161.2225" y="-1125" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">update_db</text>
<text text-anchor="start" x="145.385" y="-1113" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">set_db_def_value</text>
<text text-anchor="start" x="154.8335" y="-1101" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">get_db_value</text>
<text text-anchor="start" x="143.1705" y="-1089" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ignore_this_device</text>
</g>
<!-- A15 -->
<g id="node16" class="node">
<title>A15</title>
<polygon fill="none" stroke="#000000" points="386.348,-904 386.348,-936 453.348,-936 453.348,-904 386.348,-904"/>
<text text-anchor="start" x="402.341" y="-917" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InfosG3</text>
<polygon fill="none" stroke="#000000" points="386.348,-884 386.348,-904 453.348,-904 453.348,-884 386.348,-884"/>
<polygon fill="none" stroke="#000000" points="386.348,-840 386.348,-884 453.348,-884 453.348,-840 386.348,-840"/>
<text text-anchor="start" x="396.232" y="-865" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_confs()</text>
<text text-anchor="start" x="404.016" y="-853" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">parse()</text>
</g>
<!-- A14&#45;&gt;A15 -->
<g id="edge18" class="edge">
<title>A14&#45;&gt;A15</title>
<path fill="none" stroke="#000000" d="M242.8857,-1086.9876C246.5464,-1083.0913 250.3682,-1079.4032 254.348,-1076 298.2601,-1038.4501 335.1504,-1068.4478 374.348,-1026 397.0004,-1001.4693 408.2589,-965.3633 413.8498,-936.2357"/>
<polygon fill="none" stroke="#000000" points="240.0515,-1084.9088 236.0452,-1094.717 245.2936,-1089.548 240.0515,-1084.9088"/>
</g>
<!-- A16 -->
<g id="node17" class="node">
<title>A16</title>
<polygon fill="none" stroke="#000000" points="142.348,-904 142.348,-936 209.348,-936 209.348,-904 142.348,-904"/>
<text text-anchor="start" x="155.0065" y="-917" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">InfosG3P</text>
<polygon fill="none" stroke="#000000" points="142.348,-884 142.348,-904 209.348,-904 209.348,-884 142.348,-884"/>
<polygon fill="none" stroke="#000000" points="142.348,-840 142.348,-884 209.348,-884 209.348,-840 142.348,-840"/>
<text text-anchor="start" x="152.232" y="-865" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">ha_confs()</text>
<text text-anchor="start" x="160.016" y="-853" font-family="Helvetica,sans-Serif" font-size="10.00" fill="#000000">parse()</text>
</g>
<!-- A14&#45;&gt;A16 -->
<g id="edge19" class="edge">
<title>A14&#45;&gt;A16</title>
<path fill="none" stroke="#000000" d="M180.6399,-1065.5724C179.2846,-1020.0932 177.8303,-971.2935 176.7899,-936.3828"/>
<polygon fill="none" stroke="#000000" points="177.1491,-1065.9355 180.9455,-1075.8267 184.146,-1065.7269 177.1491,-1065.9355"/>
</g>
<!-- A15&#45;&gt;A6 -->
<g id="edge21" class="edge">
<title>A15&#45;&gt;A6</title>
<path fill="none" stroke="#000000" d="M420.5717,-839.9684C421.4566,-805.2366 422.6992,-756.4655 423.879,-710.1572"/>
<polygon fill="#000000" stroke="#000000" points="424.1376,-700.0098 428.3813,-710.1212 424.0102,-705.0082 423.8828,-710.0066 423.8828,-710.0066 423.8828,-710.0066 424.0102,-705.0082 419.3842,-709.8919 424.1376,-700.0098 424.1376,-700.0098"/>
</g>
<!-- A16&#45;&gt;A7 -->
<g id="edge20" class="edge">
<title>A16&#45;&gt;A7</title>
<path fill="none" stroke="#000000" d="M174.8891,-839.9684C174.4696,-796.0581 173.8357,-729.7079 173.3059,-674.2644"/>
<polygon fill="#000000" stroke="#000000" points="173.2083,-664.0467 177.8037,-674.0032 173.2561,-669.0465 173.304,-674.0463 173.304,-674.0463 173.304,-674.0463 173.2561,-669.0465 168.8042,-674.0893 173.2083,-664.0467 173.2083,-664.0467"/>
</g>
</g>
</svg>


View File

@@ -1,28 +0,0 @@
// {type:class}
// {direction:topDown}
// {generate:true}
[note: You can stick notes on diagrams too!{bg:cornsilk}]
[Singleton]^[Mqtt|<static>ha_restarts;<static>__client;<static>__cb_MqttIsUp|<async>publish();<async>close()]
[Modbus|que;;snd_handler;rsp_handler;timeout:max_retires;last_xxx;err;retry_cnt;req_pend;tim|build_msg();recv_req();recv_resp()]
[IterRegistry||__iter__]^[Message|server_side:bool;header_valid:bool;header_len:unsigned;data_len:unsigned;unique_id;node_id;sug_area;_recv_buffer:bytearray;_send_buffer:bytearray;_forward_buffer:bytearray;db:Infos;new_data:list;state|_read():void<abstract>;close():void;inc_counter():void;dec_counter():void]
[Message]^[Talent|await_conn_resp_cnt;id_str;contact_name;contact_mail;db:InfosG3;mb:Modbus;switch|msg_contact_info();msg_ota_update();msg_get_time();msg_collector_data();msg_inverter_data();msg_unknown();;close()]
[Message]^[SolarmanV5|control;serial;snr;db:InfosG3P;mb:Modbus;switch|msg_unknown();;close()]
[Talent]^[ConnectionG3|remoteStream:ConnectionG3|close()]
[Talent]has-1>[Modbus]
[SolarmanV5]^[ConnectionG3P|remoteStream:ConnectionG3P|close()]
[SolarmanV5]has-1>[Modbus]
[AsyncStream|reader;writer;addr;r_addr;l_addr|<async>server_loop();<async>client_loop();<async>loop;disc();close();;__async_read();async_write();__async_forward()]^[ConnectionG3]
[AsyncStream]^[ConnectionG3P]
[Inverter|cls.db_stat;cls.entity_prfx;cls.discovery_prfx;cls.proxy_node_id;cls.proxy_unique_id;cls.mqtt:Mqtt|]^[InverterG3|__ha_restarts|async_create_remote();;close()]
[Inverter]^[InverterG3P|__ha_restarts|async_create_remote();;close()]
[Mqtt]-[Inverter]
[ConnectionG3]^[InverterG3]
[ConnectionG3]has-0..1>[ConnectionG3]
[ConnectionG3P]^[InverterG3P]
[ConnectionG3P]has-0..1>[ConnectionG3P]
[Infos|stat;new_stat_data;info_dev|static_init();dev_value();inc_counter();dec_counter();ha_proxy_conf;ha_conf;update_db;set_db_def_value;get_db_value;ignore_this_device]^[InfosG3||ha_confs();parse()]
[Infos]^[InfosG3P||ha_confs();parse()]
[InfosG3P]->[SolarmanV5]
[InfosG3]->[Talent]

View File

@@ -0,0 +1,8 @@
flake8==7.2.0
pytest==8.3.5
pytest-asyncio==0.26.0
pytest-cov==6.1.1
python-dotenv==1.1.0
mock==5.2.0
coverage==7.8.0
jinja2-cli==0.8.2

View File

@@ -1,4 +1,4 @@
aiomqtt==2.0.1
schema==0.7.5
aiocron==1.8
aiohttp==3.9.5
aiomqtt==2.3.1
schema==0.7.7
aiocron==2.1
aiohttp==3.11.16

104
app/src/async_ifc.py Normal file
View File

@@ -0,0 +1,104 @@
from abc import ABC, abstractmethod
class AsyncIfc(ABC):
@abstractmethod
def get_conn_no(self):
pass # pragma: no cover
@abstractmethod
def set_node_id(self, value: str):
pass # pragma: no cover
#
# TX - QUEUE
#
@abstractmethod
def tx_add(self, data: bytearray):
''' add data to transmit queue'''
pass # pragma: no cover
@abstractmethod
def tx_flush(self):
''' send the transmit queue and clear it'''
pass # pragma: no cover
@abstractmethod
def tx_peek(self, size: int = None) -> bytearray:
'''return size bytes without removing them'''
pass # pragma: no cover
@abstractmethod
def tx_log(self, level, info):
''' log the transmit queue'''
pass # pragma: no cover
@abstractmethod
def tx_clear(self):
''' clear transmit queue'''
pass # pragma: no cover
@abstractmethod
def tx_len(self):
''' get number of bytes in the transmit queue'''
pass # pragma: no cover
#
# FORWARD - QUEUE
#
@abstractmethod
def fwd_add(self, data: bytearray):
''' add data to forward queue'''
pass # pragma: no cover
@abstractmethod
def fwd_log(self, level, info):
''' log the forward queue'''
pass # pragma: no cover
#
# RX - QUEUE
#
@abstractmethod
def rx_get(self, size: int = None) -> bytearray:
'''remove size bytes and return them'''
pass # pragma: no cover
@abstractmethod
def rx_peek(self, size: int = None) -> bytearray:
'''return size bytes without removing them'''
pass # pragma: no cover
@abstractmethod
def rx_log(self, level, info):
''' logs the receive queue'''
pass # pragma: no cover
@abstractmethod
def rx_clear(self):
''' clear receive queue'''
pass # pragma: no cover
@abstractmethod
def rx_len(self):
''' get number of bytes in the receive queue'''
pass # pragma: no cover
@abstractmethod
def rx_set_cb(self, callback):
pass # pragma: no cover
#
# Protocol Callbacks
#
@abstractmethod
def prot_set_timeout_cb(self, callback):
pass # pragma: no cover
@abstractmethod
def prot_set_init_new_client_conn_cb(self, callback):
pass # pragma: no cover
@abstractmethod
def prot_set_update_header_cb(self, callback):
pass # pragma: no cover
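
The AsyncIfc interface decouples the protocol layer from the transport: a protocol only pushes bytes into the TX/forward queues, pulls from the RX queue and registers its callbacks. A minimal consumer sketch follows; it assumes a concrete implementation such as the AsyncIfcImpl from the async_stream diff below, and the EchoProtocol class itself is purely illustrative, not part of the proxy.

import logging
from async_ifc import AsyncIfc


class EchoProtocol:
    '''illustrative protocol: echo every received chunk back to the peer'''

    def __init__(self, ifc: AsyncIfc):
        self.ifc = ifc
        self.ifc.set_node_id('echo/')
        self.ifc.rx_set_cb(self.on_rx)         # called when new bytes arrive

    def on_rx(self):
        data = self.ifc.rx_get()               # drain the receive queue
        self.ifc.tx_add(data)                  # queue the reply
        self.ifc.tx_log(logging.INFO, 'echo')  # log what will be sent
        self.ifc.tx_flush()                    # hand it to the transport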

View File

@@ -3,108 +3,180 @@ import logging
import traceback
import time
from asyncio import StreamReader, StreamWriter
from messages import hex_dump_memory, State
from typing import Self
from itertools import count
from proxy import Proxy
from byte_fifo import ByteFifo
from async_ifc import AsyncIfc
from infos import Infos
import gc
logger = logging.getLogger('conn')
class AsyncStream():
class AsyncIfcImpl(AsyncIfc):
_ids = count(0)
def __init__(self) -> None:
logger.debug('AsyncIfcImpl.__init__')
self.fwd_fifo = ByteFifo()
self.tx_fifo = ByteFifo()
self.rx_fifo = ByteFifo()
self.conn_no = next(self._ids)
self.node_id = ''
self.timeout_cb = None
self.init_new_client_conn_cb = None
self.update_header_cb = None
def close(self):
self.timeout_cb = None
self.fwd_fifo.reg_trigger(None)
self.tx_fifo.reg_trigger(None)
self.rx_fifo.reg_trigger(None)
def set_node_id(self, value: str):
self.node_id = value
def get_conn_no(self):
return self.conn_no
def tx_add(self, data: bytearray):
''' add data to transmit queue'''
self.tx_fifo += data
def tx_flush(self):
''' send the transmit queue and clear it'''
self.tx_fifo()
def tx_peek(self, size: int = None) -> bytearray:
'''return size bytes without removing them'''
return self.tx_fifo.peek(size)
def tx_log(self, level, info):
''' log the transmit queue'''
self.tx_fifo.logging(level, info)
def tx_clear(self):
''' clear transmit queue'''
self.tx_fifo.clear()
def tx_len(self):
''' get number of bytes in the transmit queue'''
return len(self.tx_fifo)
def fwd_add(self, data: bytearray):
''' add data to forward queue'''
self.fwd_fifo += data
def fwd_log(self, level, info):
''' log the forward queue'''
self.fwd_fifo.logging(level, info)
def rx_get(self, size: int = None) -> bytearray:
'''remove size bytes and return them'''
return self.rx_fifo.get(size)
def rx_peek(self, size: int = None) -> bytearray:
'''return size bytes without removing them'''
return self.rx_fifo.peek(size)
def rx_log(self, level, info):
''' logs the receive queue'''
self.rx_fifo.logging(level, info)
def rx_clear(self):
''' clear receive queue'''
self.rx_fifo.clear()
def rx_len(self):
''' get number of bytes in the receive queue'''
return len(self.rx_fifo)
def rx_set_cb(self, callback):
self.rx_fifo.reg_trigger(callback)
def prot_set_timeout_cb(self, callback):
self.timeout_cb = callback
def prot_set_init_new_client_conn_cb(self, callback):
self.init_new_client_conn_cb = callback
def prot_set_update_header_cb(self, callback):
self.update_header_cb = callback
class StreamPtr():
'''Descr StreamPtr'''
def __init__(self, _stream, _ifc=None):
self.stream = _stream
self.ifc = _ifc
@property
def ifc(self):
return self._ifc
@ifc.setter
def ifc(self, value):
self._ifc = value
@property
def stream(self):
return self._stream
@stream.setter
def stream(self, value):
self._stream = value
class AsyncStream(AsyncIfcImpl):
MAX_PROC_TIME = 2
'''maximum processing time for a received msg in sec'''
MAX_START_TIME = 400
'''maximum time without a received msg in sec'''
MAX_INV_IDLE_TIME = 90
MAX_INV_IDLE_TIME = 120
'''maximum time without a received msg from the inverter in sec'''
MAX_CLOUD_IDLE_TIME = 360
'''maximum time without a received msg from cloud side in sec'''
MAX_DEF_IDLE_TIME = 360
'''maximum default time without a received msg in sec'''
def __init__(self, reader: StreamReader, writer: StreamWriter,
addr) -> None:
rstream: "StreamPtr") -> None:
AsyncIfcImpl.__init__(self)
logger.debug('AsyncStream.__init__')
self.reader = reader
self.writer = writer
self.addr = addr
self.r_addr = ''
self.l_addr = ''
self.conn_no = next(self._ids)
self.remote = rstream
self.tx_fifo.reg_trigger(self.__write_cb)
self._reader = reader
self._writer = writer
self.r_addr = writer.get_extra_info('peername')
self.l_addr = writer.get_extra_info('sockname')
self.proc_start = None # processing start timestamp
self.proc_max = 0
self.async_publ_mqtt = None # will be set by AsyncStreamServer only
def __write_cb(self):
self._writer.write(self.tx_fifo.get())
def __timeout(self) -> int:
if self.state == State.init:
to = self.MAX_START_TIME
else:
if self.server_side:
to = self.MAX_INV_IDLE_TIME
else:
to = self.MAX_CLOUD_IDLE_TIME
return to
async def server_loop(self, addr: str) -> None:
'''Loop for receiving messages from the inverter (server-side)'''
logger.info(f'[{self.node_id}:{self.conn_no}] '
f'Accept connection from {addr}')
self.inc_counter('Inverter_Cnt')
await self.loop()
self.dec_counter('Inverter_Cnt')
logger.info(f'[{self.node_id}:{self.conn_no}] Server loop stopped for'
f' r{self.r_addr}')
# if the server connection closes, we also have to disconnect
# the connection to te TSUN cloud
if self.remoteStream:
logger.info(f'[{self.node_id}:{self.conn_no}] disc client '
f'connection: [{self.remoteStream.node_id}:'
f'{self.remoteStream.conn_no}]')
await self.remoteStream.disc()
try:
await self._async_publ_mqtt_proxy_stat('proxy')
except Exception:
pass
async def client_loop(self, addr: str) -> None:
'''Loop for receiving messages from the TSUN cloud (client-side)'''
clientStream = await self.remoteStream.loop()
logger.info(f'[{clientStream.node_id}:{clientStream.conn_no}] '
'Client loop stopped for'
f' l{clientStream.l_addr}')
# if the client connection closes, we don't touch the server
# connection. Instead we erase the client connection stream,
# thus on the next received packet from the inverter, we can
# establish a new connection to the TSUN cloud
# erase backlink to inverter
clientStream.remoteStream = None
if self.remoteStream == clientStream:
# logging.debug(f'Client l{clientStream.l_addr} refs:'
# f' {gc.get_referrers(clientStream)}')
# than erase client connection
self.remoteStream = None
if self.timeout_cb:
return self.timeout_cb()
return 360
async def loop(self) -> Self:
"""Async loop handler for precessing all received messages"""
self.r_addr = self.writer.get_extra_info('peername')
self.l_addr = self.writer.get_extra_info('sockname')
self.proc_start = time.time()
while True:
try:
proc = time.time() - self.proc_start
if proc > self.proc_max:
self.proc_max = proc
self.proc_start = None
self.__calc_proc_time()
dead_conn_to = self.__timeout()
await asyncio.wait_for(self.__async_read(),
dead_conn_to)
if self.unique_id:
await self.async_write()
await self.__async_forward()
await self.__async_write()
await self.__async_forward()
if self.async_publ_mqtt:
await self.async_publ_mqtt()
except asyncio.TimeoutError:
@@ -112,7 +184,6 @@ class AsyncStream():
f'connection timeout ({dead_conn_to}s) '
f'for {self.l_addr}')
await self.disc()
self.close()
return self
except OSError as error:
@@ -120,55 +191,53 @@ class AsyncStream():
f'{error} for l{self.l_addr} | '
f'r{self.r_addr}')
await self.disc()
self.close()
return self
except RuntimeError as error:
logger.info(f'[{self.node_id}:{self.conn_no}] '
f'{error} for {self.l_addr}')
await self.disc()
self.close()
return self
except Exception:
self.inc_counter('SW_Exception')
Infos.inc_counter('SW_Exception')
logger.error(
f"Exception for {self.addr}:\n"
f"Exception for {self.r_addr}:\n"
f"{traceback.format_exc()}")
await asyncio.sleep(0) # be cooperative to other task
async def async_write(self, headline: str = 'Transmit to ') -> None:
"""Async write handler to transmit the send_buffer"""
if self._send_buffer:
hex_dump_memory(logging.INFO, f'{headline}{self.addr}:',
self._send_buffer, len(self._send_buffer))
self.writer.write(self._send_buffer)
await self.writer.drain()
self._send_buffer = bytearray(0) # self._send_buffer[sent:]
def __calc_proc_time(self):
if self.proc_start:
proc = time.time() - self.proc_start
if proc > self.proc_max:
self.proc_max = proc
self.proc_start = None
async def disc(self) -> None:
"""Async disc handler for graceful disconnect"""
if self.writer.is_closing():
if self._writer.is_closing():
return
logger.debug(f'AsyncStream.disc() l{self.l_addr} | r{self.r_addr}')
self.writer.close()
await self.writer.wait_closed()
self._writer.close()
await self._writer.wait_closed()
def close(self) -> None:
logging.debug(f'AsyncStream.close() l{self.l_addr} | r{self.r_addr}')
"""close handler for a no waiting disconnect
hint: must be called before releasing the connection instance
"""
self.reader.feed_eof() # abort awaited read
if self.writer.is_closing():
super().close()
self._reader.feed_eof() # abort awaited read
if self._writer.is_closing():
return
logger.debug(f'AsyncStream.close() l{self.l_addr} | r{self.r_addr}')
self.writer.close()
self._writer.close()
def healthy(self) -> bool:
elapsed = 0
if self.proc_start is not None:
elapsed = time.time() - self.proc_start
if self.state == State.closed or elapsed > self.MAX_PROC_TIME:
if elapsed > self.MAX_PROC_TIME:
logging.debug(f'[{self.node_id}:{self.conn_no}:'
f'{type(self).__name__}]'
f' act:{round(1000*elapsed)}ms'
@@ -181,61 +250,148 @@ class AsyncStream():
'''
async def __async_read(self) -> None:
"""Async read handler to read received data from TCP stream"""
data = await self.reader.read(4096)
data = await self._reader.read(4096)
if data:
self.proc_start = time.time()
self._recv_buffer += data
wait = self.read() # call read in parent class
if wait > 0:
self.rx_fifo += data
wait = self.rx_fifo() # call read in parent class
if wait and wait > 0:
await asyncio.sleep(wait)
else:
raise RuntimeError("Peer closed.")
async def __async_write(self, headline: str = 'Transmit to ') -> None:
"""Async write handler to transmit the send_buffer"""
if len(self.tx_fifo) > 0:
self.tx_fifo.logging(logging.INFO, f'{headline}{self.r_addr}:')
self._writer.write(self.tx_fifo.get())
await self._writer.drain()
async def __async_forward(self) -> None:
"""forward handler transmits data over the remote connection"""
if not self._forward_buffer:
if len(self.fwd_fifo) == 0:
return
try:
if not self.remoteStream:
await self.async_create_remote()
if self.remoteStream:
if self.remoteStream._init_new_client_conn():
await self.remoteStream.async_write()
if self.remoteStream:
self.remoteStream._update_header(self._forward_buffer)
hex_dump_memory(logging.INFO,
f'Forward to {self.remoteStream.addr}:',
self._forward_buffer,
len(self._forward_buffer))
self.remoteStream.writer.write(self._forward_buffer)
await self.remoteStream.writer.drain()
self._forward_buffer = bytearray(0)
await self._async_forward()
except OSError as error:
if self.remoteStream:
rmt = self.remoteStream
self.remoteStream = None
logger.error(f'[{rmt.node_id}:{rmt.conn_no}] Fwd: {error} for '
f'l{rmt.l_addr} | r{rmt.r_addr}')
await rmt.disc()
rmt.close()
if self.remote.stream:
rmt = self.remote
logger.error(f'[{rmt.stream.node_id}:{rmt.stream.conn_no}] '
f'Fwd: {error} for '
f'l{rmt.ifc.l_addr} | r{rmt.ifc.r_addr}')
await rmt.ifc.disc()
if rmt.ifc.close_cb:
rmt.ifc.close_cb()
except RuntimeError as error:
if self.remoteStream:
rmt = self.remoteStream
self.remoteStream = None
logger.info(f'[{rmt.node_id}:{rmt.conn_no}] '
f'Fwd: {error} for {rmt.l_addr}')
await rmt.disc()
rmt.close()
if self.remote.stream:
rmt = self.remote
logger.info(f'[{rmt.stream.node_id}:{rmt.stream.conn_no}] '
f'Fwd: {error} for {rmt.ifc.l_addr}')
await rmt.ifc.disc()
if rmt.ifc.close_cb:
rmt.ifc.close_cb()
except Exception:
self.inc_counter('SW_Exception')
Infos.inc_counter('SW_Exception')
logger.error(
f"Fwd Exception for {self.addr}:\n"
f"Fwd Exception for {self.r_addr}:\n"
f"{traceback.format_exc()}")
def __del__(self):
logger.debug(
f"AsyncStream.__del__ l{self.l_addr} | r{self.r_addr}")
async def publish_outstanding_mqtt(self):
'''Publish all outstanding MQTT topics'''
try:
await self.async_publ_mqtt()
await Proxy._async_publ_mqtt_proxy_stat('proxy')
except Exception:
pass
class AsyncStreamServer(AsyncStream):
def __init__(self, reader: StreamReader, writer: StreamWriter,
async_publ_mqtt, create_remote,
rstream: "StreamPtr") -> None:
AsyncStream.__init__(self, reader, writer, rstream)
self.create_remote = create_remote
self.async_publ_mqtt = async_publ_mqtt
def close(self) -> None:
logging.debug('AsyncStreamServer.close()')
self.create_remote = None
self.async_publ_mqtt = None
super().close()
async def server_loop(self) -> None:
'''Loop for receiving messages from the inverter (server-side)'''
logger.info(f'[{self.node_id}:{self.conn_no}] '
f'Accept connection from {self.r_addr}')
Infos.inc_counter('Inverter_Cnt')
await self.publish_outstanding_mqtt()
await self.loop()
Infos.dec_counter('Inverter_Cnt')
await self.publish_outstanding_mqtt()
logger.info(f'[{self.node_id}:{self.conn_no}] Server loop stopped for'
f' r{self.r_addr}')
# if the server connection closes, we also have to disconnect
# the connection to the TSUN cloud
if self.remote and self.remote.stream:
logger.info(f'[{self.node_id}:{self.conn_no}] disc client '
f'connection: [{self.remote.ifc.node_id}:'
f'{self.remote.ifc.conn_no}]')
await self.remote.ifc.disc()
async def _async_forward(self) -> None:
"""forward handler transmits data over the remote connection"""
if not self.remote.stream:
await self.create_remote()
if self.remote.stream and \
self.remote.ifc.init_new_client_conn_cb():
await self.remote.ifc._AsyncStream__async_write()
if self.remote.stream:
self.remote.ifc.update_header_cb(self.fwd_fifo.peek())
self.fwd_fifo.logging(logging.INFO, 'Forward to '
f'{self.remote.ifc.r_addr}:')
self.remote.ifc._writer.write(self.fwd_fifo.get())
await self.remote.ifc._writer.drain()
class AsyncStreamClient(AsyncStream):
def __init__(self, reader: StreamReader, writer: StreamWriter,
rstream: "StreamPtr", close_cb) -> None:
AsyncStream.__init__(self, reader, writer, rstream)
self.close_cb = close_cb
async def disc(self) -> None:
logging.debug('AsyncStreamClient.disc()')
self.remote = None
await super().disc()
def close(self) -> None:
logging.debug('AsyncStreamClient.close()')
self.close_cb = None
super().close()
async def client_loop(self, _: str) -> None:
'''Loop for receiving messages from the TSUN cloud (client-side)'''
Infos.inc_counter('Cloud_Conn_Cnt')
await self.publish_outstanding_mqtt()
await self.loop()
Infos.dec_counter('Cloud_Conn_Cnt')
await self.publish_outstanding_mqtt()
logger.info(f'[{self.node_id}:{self.conn_no}] '
'Client loop stopped for'
f' l{self.l_addr}')
if self.close_cb:
self.close_cb()
async def _async_forward(self) -> None:
"""forward handler transmits data over the remote connection"""
if self.remote.stream:
self.remote.ifc.update_header_cb(self.fwd_fifo.peek())
self.fwd_fifo.logging(logging.INFO, 'Forward to '
f'{self.remote.ifc.r_addr}:')
self.remote.ifc._writer.write(self.fwd_fifo.get())
await self.remote.ifc._writer.drain()
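
To show how the refactored pieces fit together, here is a stripped-down wiring sketch, not the proxy's real Connection classes: an asyncio server wraps every accepted socket in an AsyncStreamServer whose StreamPtr would later hold the client-side link to the TSUN cloud. It assumes the module is importable as async_stream, the MQTT and remote-creation callbacks are stubbed, and Infos.static_init() is expected to have run during proxy start-up so the statistic counters exist.

import asyncio
from async_stream import AsyncStreamServer, StreamPtr


async def handle_inverter(reader, writer):
    remote = StreamPtr(None)        # later: client-side link to the TSUN cloud

    async def publ_mqtt():          # stub, nothing to publish in this sketch
        pass

    async def create_remote():      # stub, no cloud connection is opened here
        pass

    stream = AsyncStreamServer(reader, writer, publ_mqtt, create_remote, remote)
    try:
        await stream.server_loop()  # runs until the inverter disconnects
    finally:
        stream.close()


async def main():
    srv = await asyncio.start_server(handle_inverter, '0.0.0.0', 5005)
    async with srv:
        await srv.serve_forever()


if __name__ == '__main__':
    asyncio.run(main())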

52
app/src/byte_fifo.py Normal file
View File

@@ -0,0 +1,52 @@
from messages import hex_dump_str, hex_dump_memory
class ByteFifo:
""" a byte FIFO buffer with trigger callback """
__slots__ = ('__buf', '__trigger_cb')
def __init__(self):
self.__buf = bytearray()
self.__trigger_cb = None
def reg_trigger(self, cb) -> None:
self.__trigger_cb = cb
def __iadd__(self, data):
self.__buf.extend(data)
return self
def __call__(self):
'''triggers the observer'''
if callable(self.__trigger_cb):
return self.__trigger_cb()
return None
def get(self, size: int = None) -> bytearray:
'''remove size bytes and return them'''
if not size:
data = self.__buf
self.clear()
else:
data = self.__buf[:size]
# The fast delete syntax
self.__buf[:size] = b''
return data
def peek(self, size: int = None) -> bytearray:
'''return size bytes without removing them'''
if not size:
return self.__buf
return self.__buf[:size]
def clear(self):
self.__buf = bytearray()
def __len__(self) -> int:
return len(self.__buf)
def __str__(self) -> str:
return hex_dump_str(self.__buf, self.__len__())
def logging(self, level, info):
hex_dump_memory(level, info, self.__buf, self.__len__())
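
A short usage sketch of the ByteFifo above (it assumes the proxy's messages module with the hex dump helpers is importable): the consumer registers a trigger callback, the producer appends bytes with += and fires the trigger by calling the FIFO object, which drains the buffer.

from byte_fifo import ByteFifo

rx = ByteFifo()


def on_data():
    '''trigger callback: consume everything currently buffered'''
    frame = rx.get()                 # remove and return all buffered bytes
    print('received', bytes(frame).hex())


rx.reg_trigger(on_data)

rx += b'\x68\x00\x10'                # producer side: append received bytes
rx += b'\x32\x41'
print(len(rx), 'bytes pending')      # -> 5 bytes pending
print(bytes(rx.peek(3)).hex())       # inspect the first 3 bytes only
rx()                                 # fire the trigger -> on_data() runs
assert len(rx) == 0                  # the queue is drained afterwards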

249
app/src/cnf/config.py Normal file
View File

@@ -0,0 +1,249 @@
'''Config module handles the proxy configuration'''
import shutil
import logging
from abc import ABC, abstractmethod
from schema import Schema, And, Or, Use, Optional
class ConfigIfc(ABC):
'''Abstract basis class for config readers'''
def __init__(self):
Config.add(self)
@abstractmethod
def get_config(self) -> dict: # pragma: no cover
'''get the unverified config from the reader'''
pass
@abstractmethod
def descr(self) -> str: # pragma: no cover
'''return a description of the source, e.g. the file name'''
pass
def _extend_key(self, conf, key, val):
'''split a dotted dict key into a hierarchical dict tree '''
lst = key.split('.')
d = conf
for i, idx in enumerate(lst, 1): # pragma: no branch
if i == len(lst):
d[idx] = val
break
if idx not in d:
d[idx] = {}
d = d[idx]
class Config():
'''Static class Config builds and sanitizes the internal config dictionary.
Using config readers, partial configurations are added to the config.
Config readers are derivations of the abstract ConfigIfc reader.
When a config reader is instantiated, its `get_config` method is
called automatically and the returned config is merged afterwards.
'''
conf_schema = Schema({
'tsun': {
'enabled': Use(bool),
'host': Use(str),
'port': And(Use(int), lambda n: 1024 <= n <= 65535)
},
'solarman': {
'enabled': Use(bool),
'host': Use(str),
'port': And(Use(int), lambda n: 1024 <= n <= 65535)
},
'mqtt': {
'host': Use(str),
'port': And(Use(int), lambda n: 1024 <= n <= 65535),
'user': Or(None, And(Use(str),
Use(lambda s: s if len(s) > 0 else None))),
'passwd': Or(None, And(Use(str),
Use(lambda s: s if len(s) > 0 else None)))
},
'ha': {
'auto_conf_prefix': Use(str),
'discovery_prefix': Use(str),
'entity_prefix': Use(str),
'proxy_node_id': Use(str),
'proxy_unique_id': Use(str)
},
'gen3plus': {
'at_acl': {
Or('mqtt', 'tsun'): {
'allow': [str],
Optional('block', default=[]): [str]
}
}
},
'inverters': {
'allow_all': Use(bool),
And(Use(str), lambda s: len(s) == 16): {
Optional('monitor_sn', default=0): Use(int),
Optional('node_id', default=""): And(Use(str),
Use(lambda s: s + '/'
if len(s) > 0
and s[-1] != '/'
else s)),
Optional('client_mode'): {
'host': Use(str),
Optional('port', default=8899):
And(Use(int), lambda n: 1024 <= n <= 65535),
Optional('forward', default=False): Use(bool),
},
Optional('modbus_polling', default=True): Use(bool),
Optional('modbus_scanning'): {
'start': Use(int),
Optional('step', default=0x400): Use(int),
Optional('bytes', default=0x10): Use(int),
},
Optional('suggested_area', default=""): Use(str),
Optional('sensor_list', default=0): Use(int),
Optional('pv1'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv2'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv3'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv4'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv5'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv6'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
}
}
},
'batteries': {
And(Use(str), lambda s: len(s) == 16): {
Optional('monitor_sn', default=0): Use(int),
Optional('node_id', default=""): And(Use(str),
Use(lambda s: s + '/'
if len(s) > 0
and s[-1] != '/'
else s)),
Optional('client_mode'): {
'host': Use(str),
Optional('port', default=8899):
And(Use(int), lambda n: 1024 <= n <= 65535),
Optional('forward', default=False): Use(bool),
},
Optional('modbus_polling', default=True): Use(bool),
Optional('modbus_scanning'): {
'start': Use(int),
Optional('step', default=0x400): Use(int),
Optional('bytes', default=0x10): Use(int),
},
Optional('suggested_area', default=""): Use(str),
Optional('sensor_list', default=0): Use(int),
Optional('pv1'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv2'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
}
}
}
}, ignore_extra_keys=True
)
@classmethod
def init(cls, def_reader: ConfigIfc) -> None | str:
'''Initialise the Proxy-Config
Copy the internal default config file into the config directory
and initialise the Config with the default configuration '''
cls.err = None
cls.def_config = {}
try:
# make the default config transparent by copying it
# in the config.example file
logging.debug('Copy Default Config to config.example.toml')
shutil.copy2("default_config.toml",
"config/config.example.toml")
except Exception:
pass
# read example config file as default configuration
try:
def_config = def_reader.get_config()
cls.def_config = cls.conf_schema.validate(def_config)
logging.info(f'Read from {def_reader.descr()} => ok')
except Exception as error:
cls.err = f'Config.read: {error}'
logging.error(
f"Can't read from {def_reader.descr()} => error\n {error}")
cls.act_config = cls.def_config.copy()
@classmethod
def add(cls, reader: ConfigIfc):
'''Merge the config from the given Config Reader into the config.
Checks whether a default config exists. If it does not, the Config.init
method has not been called yet. This is normal for the very first Config
Reader, which provides the default config and must be ignored here; the
default config reader is handled in the Config.init method.'''
if hasattr(cls, 'def_config'):
cls.__parse(reader)
@classmethod
def get_error(cls) -> None | str:
'''return the last error as a string or None if there is no error'''
return cls.err
@classmethod
def __parse(cls, reader) -> None | str:
'''Read config from the reader, merge it with the default config
and sanitize the result'''
res = 'ok'
try:
rd_config = reader.get_config()
config = cls.act_config.copy()
for key in ['tsun', 'solarman', 'mqtt', 'ha', 'inverters',
'gen3plus', 'batteries']:
if key in rd_config:
config[key] = config[key] | rd_config[key]
cls.act_config = cls.conf_schema.validate(config)
except FileNotFoundError:
res = 'n/a'
except Exception as error:
cls.err = f'error: {error}'
logging.error(
f"Can't read from {reader.descr()} => error\n {error}")
return cls.err
logging.info(f'Read from {reader.descr()} => {res}')
return cls.err
@classmethod
def get(cls, member: str = None):
'''Get a named attribute from the proxy config. If member ==
None it returns the complete config dict'''
if member:
return cls.act_config.get(member, {})
else:
return cls.act_config
@classmethod
def is_default(cls, member: str) -> bool:
'''Check if the member is the default value'''
return cls.act_config.get(member) == cls.def_config.get(member)
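
A sketch of how the reader-based Config is driven. It assumes the TOML reader below is importable as cnf.config_read_toml and that the script runs from the directory that contains default_config.toml; the ConfigReadDict class is hypothetical and only illustrates the ConfigIfc contract (get_config/descr plus the automatic registration in __init__).

from cnf.config import Config, ConfigIfc
from cnf.config_read_toml import ConfigReadToml


class ConfigReadDict(ConfigIfc):
    '''hypothetical reader that serves a plain dict, e.g. in unit tests'''
    def __init__(self, data: dict):
        self.data = data
        super().__init__()         # registers the reader; Config merges it

    def get_config(self) -> dict:
        return self.data

    def descr(self) -> str:
        return 'dict'


# 1. load and validate the shipped defaults
Config.init(ConfigReadToml('default_config.toml'))
# 2. overlay a partial user config; only the given top-level keys are merged
ConfigReadDict({'mqtt': {'host': 'broker.local', 'port': 1883,
                         'user': '', 'passwd': ''}})
print(Config.get_error())          # None if both steps validated cleanly
print(Config.get('mqtt'))          # merged and sanitized mqtt settings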

View File

@@ -0,0 +1,25 @@
'''Config Reader module which handles config values from the environment'''
import os
from cnf.config import ConfigIfc
class ConfigReadEnv(ConfigIfc):
'''Reader for environment values of the configuration'''
def get_config(self) -> dict:
conf = {}
data = [
('mqtt.host', 'MQTT_HOST'),
('mqtt.port', 'MQTT_PORT'),
('mqtt.user', 'MQTT_USER'),
('mqtt.passwd', 'MQTT_PASSWORD'),
]
for key, env_var in data:
val = os.getenv(env_var)
if val:
self._extend_key(conf, key, val)
return conf
def descr(self):
return "environment"

View File

@@ -0,0 +1,47 @@
'''Config Reader module which handles *.json config files'''
import json
from cnf.config import ConfigIfc
class ConfigReadJson(ConfigIfc):
'''Reader for json config files'''
def __init__(self, cnf_file='/data/options.json'):
'''Read a json file and add the settings to the config'''
if not isinstance(cnf_file, str):
return
self.cnf_file = cnf_file
super().__init__()
def convert_inv(self, conf, inv):
if 'serial' in inv:
snr = inv['serial']
del inv['serial']
conf[snr] = {}
for key, val in inv.items():
self._extend_key(conf[snr], key, val)
def convert_inv_arr(self, conf, key, val: list):
if key not in conf:
conf[key] = {}
for elm in val:
self.convert_inv(conf[key], elm)
def convert_to_obj(self, data):
conf = {}
for key, val in data.items():
if (key == 'inverters' or key == 'batteries') and \
isinstance(val, list):
self.convert_inv_arr(conf, key, val)
else:
self._extend_key(conf, key, val)
return conf
def get_config(self) -> dict:
with open(self.cnf_file) as f:
data = json.load(f)
return self.convert_to_obj(data)
def descr(self):
return self.cnf_file
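
The interesting part of this reader is the conversion from the Home Assistant add-on options format (inverters as a list of objects with a 'serial' field, plus dotted keys) into the nested dict that the schema expects. A small sketch, assuming the module path cnf.config_read_json; passing cnf_file=None skips the file handling and the automatic registration, so only the converter is exercised.

from cnf.config_read_json import ConfigReadJson

rd = ConfigReadJson(cnf_file=None)   # no file: only use the converter
data = {
    'mqtt.host': 'broker.local',
    'inverters': [
        {'serial': 'R170000000000001',
         'node_id': 'garage',
         'modbus_polling': False},
    ],
}
print(rd.convert_to_obj(data))
# -> {'mqtt': {'host': 'broker.local'},
#     'inverters': {'R170000000000001': {'node_id': 'garage',
#                                        'modbus_polling': False}}}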

View File

@@ -0,0 +1,21 @@
'''Config Reader module which handles *.toml config files'''
import tomllib
from cnf.config import ConfigIfc
class ConfigReadToml(ConfigIfc):
'''Reader for toml config files'''
def __init__(self, cnf_file):
'''Read a toml file and add the settings to the config'''
if not isinstance(cnf_file, str):
return
self.cnf_file = cnf_file
super().__init__()
def get_config(self) -> dict:
with open(self.cnf_file, "rb") as f:
return tomllib.load(f)
def descr(self):
return self.cnf_file

View File

@@ -0,0 +1,204 @@
##########################################################################################
###
### T S U N - G E N 3 - P R O X Y
###
### from Stefan Allius
###
##########################################################################################
###
### The readme will give you an overview of the project:
### https://s-allius.github.io/tsun-gen3-proxy/
###
### The proxy supports different operation modes. Select the proper mode
### which depends on your inverter type and your inverter firmware.
### Please read:
### https://github.com/s-allius/tsun-gen3-proxy/wiki/Operation-Modes-Overview
###
### Here you will find a description of all configuration options:
### https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml
###
### The configuration uses the TOML format, which aims to be easy to read due to
### obvious semantics. You can find more details here: https://toml.io/en/v1.0.0
###
##########################################################################################
##########################################################################################
##
## MQTT broker configuration
##
## In this block, you must configure the connection to your MQTT broker and specify the
## required credentials. As the proxy does not currently support an encrypted connection
## to the MQTT broker, it is strongly recommended that you do not use a public broker.
##
## https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml#mqtt-broker-account
##
mqtt.host = 'mqtt' # URL or IP address of the mqtt broker
mqtt.port = 1883
mqtt.user = ''
mqtt.passwd = ''
##########################################################################################
##
## HOME ASSISTANT
##
## The proxy supports the MQTT autoconfiguration of Home Assistant (HA). The default
## values match the HA default configuration. If you need to change these or want to use
## a different MQTT client, you can adjust the prefixes of the MQTT topics below.
##
## https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml#home-assistant
##
ha.auto_conf_prefix = 'homeassistant' # MQTT prefix for subscribing for homeassistant status updates
ha.discovery_prefix = 'homeassistant' # MQTT prefix for discovery topic
ha.entity_prefix = 'tsun' # MQTT topic prefix for publishing inverter values
ha.proxy_node_id = 'proxy' # MQTT node id, for the proxy_node_id
ha.proxy_unique_id = 'P170000000000001' # MQTT unique id, to identify a proxy instance
##########################################################################################
##
## GEN3 Proxy Mode Configuration
##
## In this block, you can configure an optional connection to the TSUN cloud for GEN3
## inverters. This connection is only required if you want to send data to the TSUN cloud
## to use the TSUN APPs or receive firmware updates.
##
## https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml#tsun-cloud-for-gen3-inverter-only
##
tsun.enabled = true # false: disables connecting to the tsun cloud, and avoids updates
tsun.host = 'logger.talent-monitoring.com'
tsun.port = 5005
##########################################################################################
##
## GEN3PLUS Proxy Mode Configuration
##
## In this block, you can configure an optional connection to the TSUN cloud for GEN3PLUS
## inverters. This connection is only required if you want to send data to the TSUN cloud
## to use the TSUN APPs or receive firmware updates.
##
## https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-toml#solarman-cloud-for-gen3plus-inverter-only
##
solarman.enabled = true # false: disables connecting to the tsun cloud, and avoids updates
solarman.host = 'iot.talent-monitoring.com'
solarman.port = 10000
##########################################################################################
###
### Inverter Definitions
###
### The proxy supports the simultaneous operation of several inverters, even of different
### types. A configuration block must be defined for each inverter, in which all necessary
### parameters must be specified. These depend on the operation mode used and also differ
### slightly depending on the inverter type.
###
### In addition, the PV modules at the individual inputs can be defined for documentation
### purposes; they are then displayed in Home Assistant.
###
### The proxy only accepts connections from known inverters. This can be switched off for
### test purposes so that unknown serial numbers are also accepted.
###
inverters.allow_all = false # only allow known inverters
##########################################################################################
##
## For each GEN3 inverter, the serial number of the inverter must be mapped to an MQTT
## definition. To do this, the corresponding configuration block is started with
## `[inverters.“<16-digit serial number>”]` so that all subsequent parameters are assigned
## to this inverter. Further inverter-specific parameters (e.g. polling mode) can be set
## in the configuration block
##
## The serial numbers of all GEN3 inverters start with `R17`!
##
[inverters."R170000000000001"]
node_id = ''             # MQTT replacement for the inverter's serial number
suggested_area = '' # suggested installation area for home-assistant
modbus_polling = false # Disable optional MODBUS polling
pv1 = {type = 'RSM40-8-395M', manufacturer = 'Risen'} # Optional, PV module descr
pv2 = {type = 'RSM40-8-395M', manufacturer = 'Risen'} # Optional, PV module descr
##########################################################################################
##
## For each GEN3PLUS inverter, the serial number of the inverter must be mapped to an MQTT
## definition. To do this, the corresponding configuration block is started with
## `[inverters.“<16-digit serial number>”]` so that all subsequent parameters are assigned
## to this inverter. Further inverter-specific parameters (e.g. polling mode, client mode)
## can be set in the configuration block
##
## The serial numbers of all GEN3PLUS inverters start with `Y17` or Y47! Each GEN3PLUS
## inverter is supplied with a “Monitoring SN:”. This can be found on a sticker enclosed
## with the inverter.
##
[inverters."Y170000000000001"]
monitor_sn = 2000000000 # The GEN3PLUS "Monitoring SN:"
node_id = ''             # MQTT replacement for the inverter's serial number
suggested_area = '' # suggested installation place for home-assistant
modbus_polling = true # Enable optional MODBUS polling
# if your inverter supports SSL connections you must use the client_mode. Please uncomment
# the next line and configure the fixed IP of your inverter
#client_mode = {host = '192.168.0.1', port = 8899, forward = true}
pv1 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
pv2 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
pv3 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
pv4 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
##########################################################################################
##
## For each GEN3PLUS energy storage system, the serial number must be mapped to an MQTT
## definition. To do this, the corresponding configuration block is started with
## `[batteries.“<16-digit serial number>”]` so that all subsequent parameters are assigned
## to this energy storage system. Further device-specific parameters (e.g. polling mode,
## client mode) can be set in the configuration block
##
## The serial numbers of all GEN3PLUS energy storage systems/batteries start with `410`!
## Each GEN3PLUS device is supplied with a “Monitoring SN:”. This can be found on a
## sticker enclosed with the inverter.
##
[batteries."4100000000000001"]
monitor_sn = 3000000000 # The GEN3PLUS "Monitoring SN:"
node_id = ''             # MQTT replacement for the device's serial number
suggested_area = '' # suggested installation place for home-assistant
modbus_polling = true # Enable optional MODBUS polling
# if your inverter supports SSL connections you must use the client_mode. Please uncomment
# the next line and configure the fixed IP of your inverter
#client_mode = {host = '192.168.0.1', port = 8899, forward = true}
pv1 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
pv2 = {type = 'RSM40-8-410M', manufacturer = 'Risen'} # Optional, PV module descr
##########################################################################################
###
### If the proxy mode is configured, commands from TSUN can be sent to the inverter via
### this connection or parameters (e.g. network credentials) can be queried. Filters can
### then be configured for the AT+ commands from the TSUN Cloud so that only certain
### accesses are permitted.
###
### An overview of all known AT+ commands can be found here:
### https://github.com/s-allius/tsun-gen3-proxy/wiki/AT--commands
###
[gen3plus.at_acl]
# filter for received commands from the internet
tsun.allow = ['AT+Z', 'AT+UPURL', 'AT+SUPDATE']
tsun.block = []
# filter for received commands from the MQTT broker
mqtt.allow = ['AT+']
mqtt.block = []
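
The at_acl lists are plain command prefixes. The proxy's own filter code is not part of this diff, so the following check is only a hypothetical illustration of how the allow/block lists above could be evaluated once the config has been loaded as sketched earlier.

from cnf.config import Config


def at_cmd_allowed(cmd: str, source: str) -> bool:
    '''source is 'tsun' or 'mqtt'; a block match wins, then allow is checked'''
    acl = Config.get('gen3plus').get('at_acl', {}).get(source, {})
    if any(cmd.startswith(p) for p in acl.get('block', [])):
        return False
    return any(cmd.startswith(p) for p in acl.get('allow', []))


# with the default lists above:
#   at_cmd_allowed('AT+UPURL=www.example.com', 'tsun')  -> True
#   at_cmd_allowed('AT+WEBU', 'tsun')                   -> False (not allowed)
#   at_cmd_allowed('AT+WEBU', 'mqtt')                   -> True ('AT+' prefix)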

View File

@@ -1,172 +0,0 @@
'''Config module handles the proxy configuration in the config.toml file'''
import shutil
import tomllib
import logging
from schema import Schema, And, Or, Use, Optional
class Config():
'''Static class Config reads and sanitizes the config.
Read the config.toml file and sanitize it with read().
Get named parts of the config with get()'''
config = {}
def_config = {}
conf_schema = Schema({
'tsun': {
'enabled': Use(bool),
'host': Use(str),
'port': And(Use(int), lambda n: 1024 <= n <= 65535)
},
'solarman': {
'enabled': Use(bool),
'host': Use(str),
'port': And(Use(int), lambda n: 1024 <= n <= 65535)
},
'mqtt': {
'host': Use(str),
'port': And(Use(int), lambda n: 1024 <= n <= 65535),
'user': And(Use(str), Use(lambda s: s if len(s) > 0 else None)),
'passwd': And(Use(str), Use(lambda s: s if len(s) > 0 else None))
},
'ha': {
'auto_conf_prefix': Use(str),
'discovery_prefix': Use(str),
'entity_prefix': Use(str),
'proxy_node_id': Use(str),
'proxy_unique_id': Use(str)
},
'gen3plus': {
'at_acl': {
Or('mqtt', 'tsun'): {
'allow': [str],
Optional('block', default=[]): [str]
}
}
},
'inverters': {
'allow_all': Use(bool), And(Use(str), lambda s: len(s) == 16): {
Optional('monitor_sn', default=0): Use(int),
Optional('node_id', default=""): And(Use(str),
Use(lambda s: s + '/'
if len(s) > 0 and
s[-1] != '/' else s)),
Optional('suggested_area', default=""): Use(str),
Optional('pv1'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv2'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv3'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv4'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv5'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
},
Optional('pv6'): {
Optional('type'): Use(str),
Optional('manufacturer'): Use(str),
}
}}
}, ignore_extra_keys=True
)
@classmethod
def class_init(cls) -> None | str: # pragma: no cover
try:
# make the default config transparent by copying it
# in the config.example file
logging.debug('Copy Default Config to config.example.toml')
shutil.copy2("default_config.toml",
"config/config.example.toml")
except Exception:
pass
err_str = cls.read()
del cls.conf_schema
return err_str
@classmethod
def _read_config_file(cls) -> dict: # pragma: no cover
usr_config = {}
try:
with open("config/config.toml", "rb") as f:
usr_config = tomllib.load(f)
except Exception as error:
err = f'Config.read: {error}'
logging.error(err)
logging.info(
'\n To create the missing config.toml file, '
'you can rename the template config.example.toml\n'
' and customize it for your scenario.\n')
return usr_config
@classmethod
def read(cls, path='') -> None | str:
'''Read config file, merge it with the default config
and sanitize the result'''
err = None
config = {}
logger = logging.getLogger('data')
try:
# read example config file as default configuration
cls.def_config = {}
with open(f"{path}default_config.toml", "rb") as f:
def_config = tomllib.load(f)
cls.def_config = cls.conf_schema.validate(def_config)
# overwrite the default values, with values from
# the config.toml file
usr_config = cls._read_config_file()
# merge the default and the user config
config = def_config.copy()
for key in ['tsun', 'solarman', 'mqtt', 'ha', 'inverters',
'gen3plus']:
if key in usr_config:
config[key] |= usr_config[key]
try:
cls.config = cls.conf_schema.validate(config)
except Exception as error:
err = f'Config.read: {error}'
logging.error(err)
# logging.debug(f'Read config: "{cls.config}" ')
except Exception as error:
err = f'Config.read: {error}'
logger.error(err)
cls.config = {}
return err
@classmethod
def get(cls, member: str = None):
'''Get a named attribute from the proxy config. If member ==
None it returns the complete config dict'''
if member:
return cls.config.get(member, {})
else:
return cls.config
@classmethod
def is_default(cls, member: str) -> bool:
'''Check if the member is the default value'''
return cls.config.get(member) == cls.def_config.get(member)
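
Config.read() builds the effective configuration in two steps: it validates the shipped default_config.toml against the schema, then overlays the user's config.toml section by section with a shallow dict merge before validating the result again. A minimal, self-contained sketch of the merge step (the section values are made up):

    # shallow, per-section merge as in Config.read(); user values win, missing keys keep defaults
    def_config = {'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}}
    usr_config = {'mqtt': {'host': '192.168.0.2'}}

    config = def_config.copy()
    for key in ['tsun', 'solarman', 'mqtt', 'ha', 'inverters', 'gen3plus']:
        if key in usr_config:
            config[key] |= usr_config[key]

    # config['mqtt'] -> {'host': '192.168.0.2', 'port': 1883, 'user': None, 'passwd': None}
    # the second schema pass then normalizes values, e.g. a non-empty node_id gets a trailing '/'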


@@ -1,42 +0,0 @@
import logging
# import gc
from asyncio import StreamReader, StreamWriter
from async_stream import AsyncStream
from gen3.talent import Talent
logger = logging.getLogger('conn')
class ConnectionG3(AsyncStream, Talent):
def __init__(self, reader: StreamReader, writer: StreamWriter,
addr, remote_stream: 'ConnectionG3', server_side: bool,
id_str=b'') -> None:
AsyncStream.__init__(self, reader, writer, addr)
Talent.__init__(self, server_side, id_str)
self.remoteStream: 'ConnectionG3' = remote_stream
'''
Our public methods
'''
def close(self):
AsyncStream.close(self)
Talent.close(self)
# logger.info(f'AsyncStream refs: {gc.get_referrers(self)}')
async def async_create_remote(self) -> None:
pass
async def async_publ_mqtt(self) -> None:
pass
def healthy(self) -> bool:
logger.debug('ConnectionG3 healthy()')
return AsyncStream.healthy(self)
'''
Our private methods
'''
def __del__(self):
super().__del__()


@@ -2,91 +2,181 @@
import struct
import logging
from typing import Generator
from itertools import chain
if __name__ == "app.src.gen3.infos_g3":
from app.src.infos import Infos, Register
else: # pragma: no cover
from infos import Infos, Register
from infos import Infos, Register
class RegisterMap:
__slots__ = ()
map = {
0x00092ba8: Register.COLLECTOR_FW_VERSION,
0x000927c0: Register.CHIP_TYPE,
0x00092f90: Register.CHIP_MODEL,
0x00095a88: Register.TRACE_URL,
0x00095aec: Register.LOGGER_URL,
0x0000000a: Register.PRODUCT_NAME,
0x00000014: Register.MANUFACTURER,
0x0000001e: Register.VERSION,
0x00000028: Register.SERIAL_NUMBER,
0x00000032: Register.EQUIPMENT_MODEL,
0x00013880: Register.NO_INPUTS,
0xffffff00: Register.INVERTER_CNT,
0xffffff01: Register.UNKNOWN_SNR,
0xffffff02: Register.UNKNOWN_MSG,
0xffffff03: Register.INVALID_DATA_TYPE,
0xffffff04: Register.INTERNAL_ERROR,
0xffffff05: Register.UNKNOWN_CTRL,
0xffffff06: Register.OTA_START_MSG,
0xffffff07: Register.SW_EXCEPTION,
0xffffff08: Register.MAX_DESIGNED_POWER,
0xfffffffe: Register.TEST_REG1,
0xffffffff: Register.TEST_REG2,
0x00000640: Register.OUTPUT_POWER,
0x000005dc: Register.RATED_POWER,
0x00000514: Register.INVERTER_TEMP,
0x000006a4: Register.PV1_VOLTAGE,
0x00000708: Register.PV1_CURRENT,
0x0000076c: Register.PV1_POWER,
0x000007d0: Register.PV2_VOLTAGE,
0x00000834: Register.PV2_CURRENT,
0x00000898: Register.PV2_POWER,
0x000008fc: Register.PV3_VOLTAGE,
0x00000960: Register.PV3_CURRENT,
0x000009c4: Register.PV3_POWER,
0x00000a28: Register.PV4_VOLTAGE,
0x00000a8c: Register.PV4_CURRENT,
0x00000af0: Register.PV4_POWER,
0x00000c1c: Register.PV1_DAILY_GENERATION,
0x00000c80: Register.PV1_TOTAL_GENERATION,
0x00000ce4: Register.PV2_DAILY_GENERATION,
0x00000d48: Register.PV2_TOTAL_GENERATION,
0x00000dac: Register.PV3_DAILY_GENERATION,
0x00000e10: Register.PV3_TOTAL_GENERATION,
0x00000e74: Register.PV4_DAILY_GENERATION,
0x00000ed8: Register.PV4_TOTAL_GENERATION,
0x00000b54: Register.DAILY_GENERATION,
0x00000bb8: Register.TOTAL_GENERATION,
0x000003e8: Register.GRID_VOLTAGE,
0x0000044c: Register.GRID_CURRENT,
0x000004b0: Register.GRID_FREQUENCY,
0x000cfc38: Register.CONNECT_COUNT,
0x000c3500: Register.SIGNAL_STRENGTH,
0x000c96a8: Register.POWER_ON_TIME,
0x000d0020: Register.COLLECT_INTERVAL,
0x000cf850: Register.DATA_UP_INTERVAL,
0x000c7f38: Register.COMMUNICATION_TYPE,
0x00000191: Register.EVENT_401,
0x00000192: Register.EVENT_402,
0x00000193: Register.EVENT_403,
0x00000194: Register.EVENT_404,
0x00000195: Register.EVENT_405,
0x00000196: Register.EVENT_406,
0x00000197: Register.EVENT_407,
0x00000198: Register.EVENT_408,
0x00000199: Register.EVENT_409,
0x0000019a: Register.EVENT_410,
0x0000019b: Register.EVENT_411,
0x0000019c: Register.EVENT_412,
0x0000019d: Register.EVENT_413,
0x0000019e: Register.EVENT_414,
0x0000019f: Register.EVENT_415,
0x000001a0: Register.EVENT_416,
0xffffff00: {'reg': Register.INVERTER_CNT},
0xffffff01: {'reg': Register.UNKNOWN_SNR},
0xffffff02: {'reg': Register.UNKNOWN_MSG},
0xffffff03: {'reg': Register.INVALID_DATA_TYPE},
0xffffff04: {'reg': Register.INTERNAL_ERROR},
0xffffff05: {'reg': Register.UNKNOWN_CTRL},
0xffffff06: {'reg': Register.OTA_START_MSG},
0xffffff07: {'reg': Register.SW_EXCEPTION},
0xffffff08: {'reg': Register.POLLING_INTERVAL},
0xfffffffe: {'reg': Register.TEST_REG1},
0xffffffff: {'reg': Register.TEST_REG2},
}
map_0e100000 = {
0x00092ba8: {'reg': Register.COLLECTOR_FW_VERSION},
0x000927c0: {'reg': Register.CHIP_TYPE},
0x00092f90: {'reg': Register.CHIP_MODEL},
0x00094ae8: {'reg': Register.MAC_ADDR},
0x00095a88: {'reg': Register.TRACE_URL},
0x00095aec: {'reg': Register.LOGGER_URL},
0x000cfc38: {'reg': Register.CONNECT_COUNT},
0x000c3500: {'reg': Register.SIGNAL_STRENGTH},
0x000c96a8: {'reg': Register.POWER_ON_TIME},
0x000d0020: {'reg': Register.COLLECT_INTERVAL},
0x000cf850: {'reg': Register.DATA_UP_INTERVAL},
0x000c7f38: {'reg': Register.COMMUNICATION_TYPE},
}
map_01900000 = {
0x0000000a: {'reg': Register.PRODUCT_NAME},
0x00000014: {'reg': Register.MANUFACTURER},
0x0000001e: {'reg': Register.VERSION},
0x00000046: {'reg': Register.SERIAL_NUMBER},
0x0000005A: {'reg': Register.EQUIPMENT_MODEL},
0x00000064: {'reg': Register.INVERTER_STATUS},
0x00000190: {'reg': Register.EVENT_ALARM},
0x000001f4: {'reg': Register.EVENT_FAULT},
0x00000258: {'reg': Register.EVENT_BF1},
0x000002bc: {'reg': Register.EVENT_BF2},
0x00000320: {'reg': Register.TEST_IVAL_1},
0x000003e8: {'reg': Register.TEST_VAL_0},
0x0000044c: {'reg': Register.TEST_VAL_1}, # DC 1 Input Voltage *10
0x000004b0: {'reg': Register.TEST_VAL_2},
0x00000514: {'reg': Register.GRID_VOLTAGE}, # Grid Voltage
0x00000578: {'reg': Register.GRID_CURRENT}, # Grid Current
0x000005dc: {'reg': Register.TEST_VAL_3},
0x00000640: {'reg': Register.GRID_FREQUENCY},
0x000006a4: {'reg': Register.TEST_IVAL_2},
0x00000708: {'reg': Register.TEST_IVAL_3},
0x0000076c: {'reg': Register.TEST_IVAL_4},
0x000007d0: {'reg': Register.TEST_VAL_4}, # DC 2 Input Voltage *10
0x00000834: {'reg': Register.MAX_DESIGNED_POWER},
0x00000898: {'reg': Register.OUTPUT_POWER}, # Grid Power
0x000008fc: {'reg': Register.DAILY_GENERATION}, # Daily Generation
0x00000960: {'reg': Register.TOTAL_GENERATION}, # Total Generation
0x000009c4: {'reg': Register.TEST_IVAL_5},
0x00000a28: {'reg': Register.TEST_VAL_10}, # Insulation impedance Rx
0x00000a8c: {'reg': Register.TEST_VAL_11}, # Insulation impedance Ry
0x00000af0: {'reg': Register.TEST_IVAL_6},
0x000001324: {'reg': Register.PV1_VOLTAGE}, # PV1 Voltage
0x000001388: {'reg': Register.PV1_CURRENT}, # PV1 Current
0x0000013ec: {'reg': Register.PV1_POWER}, # PV1 Power
0x000001450: {'reg': Register.TEST_VAL_5},
0x0000015e0: {'reg': Register.PV2_VOLTAGE}, # PV2 Voltage
0x000001644: {'reg': Register.PV2_CURRENT}, # PV2 Current
0x0000016a8: {'reg': Register.PV2_POWER}, # PV2 Power
0x00000170c: {'reg': Register.TEST_VAL_6},
0x00000189c: {'reg': Register.PV3_VOLTAGE},
0x000001900: {'reg': Register.PV3_CURRENT},
0x000001964: {'reg': Register.PV3_POWER},
0x0000019c8: {'reg': Register.TEST_VAL_7},
0x000001c20: {'reg': Register.TEST_VAL_14},
0x000001c84: {'reg': Register.TEST_VAL_15},
0x000001ce8: {'reg': Register.TEST_VAL_16}, # DC 1 Voltage
0x000001d4c: {'reg': Register.TEST_VAL_17},
0x000001db0: {'reg': Register.TEST_VAL_18},
0x000001e14: {'reg': Register.TEST_IVAL_8},
0x000001e78: {'reg': Register.PV4_VOLTAGE},
0x000001edc: {'reg': Register.PV4_CURRENT},
0x000001f40: {'reg': Register.PV4_POWER},
0x000001fa4: {'reg': Register.TEST_VAL_8},
0x0000020c9: {'reg': Register.TEST_IVAL_9},
0x0000020db: {'reg': Register.TEST_IVAL_10},
0x000002134: {'reg': Register.PV5_VOLTAGE},
0x000002198: {'reg': Register.PV5_CURRENT},
0x0000021fc: {'reg': Register.PV5_POWER},
# 0x000002260: {'reg': Register.TEST_VAL_13},
0x0000023f0: {'reg': Register.PV6_VOLTAGE},
0x000002454: {'reg': Register.PV6_CURRENT},
0x0000024b8: {'reg': Register.PV6_POWER},
# 0x00000251c: {'reg': Register.TEST_VAL_14},
0x000002774: {'reg': Register.TEST_VAL_24},
0x0000027d8: {'reg': Register.TEST_VAL_25},
0x00000283c: {'reg': Register.TEST_VAL_26}, # DC 2 Voltage
0x0000028a0: {'reg': Register.TEST_VAL_27},
0x000002904: {'reg': Register.TEST_VAL_28},
0x000002968: {'reg': Register.TEST_IVAL_11},
0x0000029cc: {'reg': Register.TEST_IVAL_12},
}
map_01900001 = {
0x0000000a: {'reg': Register.PRODUCT_NAME},
0x00000014: {'reg': Register.MANUFACTURER},
0x0000001e: {'reg': Register.VERSION},
0x00000028: {'reg': Register.SERIAL_NUMBER},
0x00000032: {'reg': Register.EQUIPMENT_MODEL},
0x00013880: {'reg': Register.NO_INPUTS},
0x00000640: {'reg': Register.OUTPUT_POWER},
0x000005dc: {'reg': Register.RATED_POWER},
0x00000514: {'reg': Register.INVERTER_TEMP},
0x000006a4: {'reg': Register.PV1_VOLTAGE},
0x00000708: {'reg': Register.PV1_CURRENT},
0x0000076c: {'reg': Register.PV1_POWER},
0x000007d0: {'reg': Register.PV2_VOLTAGE},
0x00000834: {'reg': Register.PV2_CURRENT},
0x00000898: {'reg': Register.PV2_POWER},
0x000008fc: {'reg': Register.PV3_VOLTAGE},
0x00000960: {'reg': Register.PV3_CURRENT},
0x000009c4: {'reg': Register.PV3_POWER},
0x00000a28: {'reg': Register.PV4_VOLTAGE},
0x00000a8c: {'reg': Register.PV4_CURRENT},
0x00000af0: {'reg': Register.PV4_POWER},
0x00000c1c: {'reg': Register.PV1_DAILY_GENERATION},
0x00000c80: {'reg': Register.PV1_TOTAL_GENERATION},
0x00000ce4: {'reg': Register.PV2_DAILY_GENERATION},
0x00000d48: {'reg': Register.PV2_TOTAL_GENERATION},
0x00000dac: {'reg': Register.PV3_DAILY_GENERATION},
0x00000e10: {'reg': Register.PV3_TOTAL_GENERATION},
0x00000e74: {'reg': Register.PV4_DAILY_GENERATION},
0x00000ed8: {'reg': Register.PV4_TOTAL_GENERATION},
0x00000b54: {'reg': Register.DAILY_GENERATION},
0x00000bb8: {'reg': Register.TOTAL_GENERATION},
0x000003e8: {'reg': Register.GRID_VOLTAGE},
0x0000044c: {'reg': Register.GRID_CURRENT},
0x000004b0: {'reg': Register.GRID_FREQUENCY},
0x00000190: {'reg': Register.EVENT_ALARM},
0x000001f4: {'reg': Register.EVENT_FAULT},
0x00000258: {'reg': Register.EVENT_BF1},
0x000002bc: {'reg': Register.EVENT_BF2},
0x00000064: {'reg': Register.INVERTER_STATUS},
0x00000fa0: {'reg': Register.BOOT_STATUS},
0x00001004: {'reg': Register.DSP_STATUS},
0x000010cc: {'reg': Register.WORK_MODE},
0x000011f8: {'reg': Register.OUTPUT_SHUTDOWN},
0x0000125c: {'reg': Register.MAX_DESIGNED_POWER},
0x000012c0: {'reg': Register.RATED_LEVEL},
0x00001324: {'reg': Register.INPUT_COEFFICIENT, 'ratio': 100/1024},
0x00001388: {'reg': Register.GRID_VOLT_CAL_COEF},
0x00002710: {'reg': Register.PROD_COMPL_TYPE},
0x00003200: {'reg': Register.OUTPUT_COEFFICIENT, 'ratio': 100/1024},
}
class RegisterSel:
__sensor_map = {
0x0e100000: RegisterMap.map_0e100000,
0x01900000: RegisterMap.map_01900000,
0x01900001: RegisterMap.map_01900001,
}
@classmethod
def get(cls, sensor: int):
return cls.__sensor_map.get(sensor, RegisterMap.map)
class InfosG3(Infos):
__slots__ = ()
def ha_confs(self, ha_prfx: str, node_id: str, snr: str,
sug_area: str = '') \
@@ -100,17 +190,27 @@ class InfosG3(Infos):
entity strings
sug_area:str ==> suggested area string from the config file'''
# iterate over RegisterMap.map and get the register values
for reg in RegisterMap.map.values():
sensor = self.get_db_value(Register.SENSOR_LIST)
if "01900000" == sensor:
items = RegisterMap.map_01900000.items()
elif "01900001" == sensor:
items = RegisterMap.map_01900001.items()
else:
items = {}
for _, row in chain(RegisterMap.map_0e100000.items(), items):
reg = row['reg']
res = self.ha_conf(reg, ha_prfx, node_id, snr, False, sug_area) # noqa: E501
if res:
yield res
def parse(self, buf, ind=0, node_id: str = '') -> \
def parse(self, buf, ind=0, sensor: int = 0, node_id: str = '') -> \
Generator[tuple[str, bool], None, None]:
'''parse a data sequence received from the inverter and
stores the values in Infos.db
buf: buffer of the sequence to parse'''
reg_map = RegisterSel.get(sensor)
result = struct.unpack_from('!l', buf, ind)
elms = result[0]
i = 0
@@ -118,10 +218,12 @@ class InfosG3(Infos):
while i < elms:
result = struct.unpack_from('!lB', buf, ind)
addr = result[0]
if addr not in RegisterMap.map:
if addr not in reg_map:
row = None
info_id = -1
else:
info_id = RegisterMap.map[addr]
row = reg_map[addr]
info_id = row['reg']
data_type = result[1]
ind += 5
@@ -136,7 +238,6 @@ class InfosG3(Infos):
i = elms # abort the loop
elif data_type == 0x41: # 'A' -> Nop ??
# result = struct.unpack_from('!l', buf, ind)[0]
ind += 0
i += 1
continue
@@ -168,17 +269,27 @@ class InfosG3(Infos):
" not supported")
return
keys, level, unit, must_incr = self._key_obj(info_id)
if keys:
name, update = self.update_db(keys, must_incr, result)
yield keys[0], update
else:
update = False
name = str(f'info-id.0x{addr:x}')
if update:
self.tracer.log(level, f'[{node_id}] GEN3: {name} :'
f' {result}{unit}')
result = self.__modify_val(row, result)
yield from self.__store_result(addr, result, info_id, node_id)
i += 1
def __modify_val(self, row, result):
if row and 'ratio' in row:
result = round(result * row['ratio'], 2)
return result
def __store_result(self, addr, result, info_id, node_id):
keys, level, unit, must_incr = self._key_obj(info_id)
if keys:
name, update = self.update_db(keys, must_incr, result)
yield keys[0], update
else:
update = False
name = str(f'info-id.0x{addr:x}')
if update:
self.tracer.log(level, f'[{node_id}] GEN3: {name} :'
f' {result}{unit}')
logging.log(level, f'[{node_id}] GEN3: {name} :'
f' {result}{unit}')
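
With this change the GEN3 parser no longer uses a single register table: the sensor-list id reported in the data message selects one of the maps via RegisterSel.get(), and each row can carry an optional 'ratio' that is applied in __modify_val(). A minimal sketch of the lookup and scaling (the raw value is made up):

    sensor = 0x01900001                  # sensor-list id reported by the inverter
    reg_map = RegisterSel.get(sensor)    # unknown ids fall back to RegisterMap.map

    row = reg_map[0x00001324]            # INPUT_COEFFICIENT in map_01900001
    raw = 512
    value = round(raw * row['ratio'], 2) if 'ratio' in row else raw   # 512 * 100/1024 -> 50.0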


@@ -1,132 +1,9 @@
import logging
import traceback
import json
import asyncio
from asyncio import StreamReader, StreamWriter
from config import Config
from inverter import Inverter
from gen3.connection_g3 import ConnectionG3
from aiomqtt import MqttCodeError
from infos import Infos
# import gc
# logger = logging.getLogger('conn')
logger_mqtt = logging.getLogger('mqtt')
from inverter_base import InverterBase
from gen3.talent import Talent
class InverterG3(Inverter, ConnectionG3):
'''class Inverter is a derivation of an AsyncStream
The class has some class methods for managing common resources like the
connection to the MQTT broker or the proxy error counters, which are common
to all inverter connections
Instances of the class are connections to an inverter and can have an
optional link to a remote connection to the TSUN cloud. A remote
connection dies with the inverter connection.
class methods:
class_init(): initialize the common resources of the proxy (MQTT
broker, Proxy DB, etc). Must be called before the
first inverter instance can be created
class_close(): release the common resources of the proxy. Should not
be called before all instances of the class have been
destroyed
methods:
server_loop(addr): Async loop method for receiving messages from the
inverter (server-side)
client_loop(addr): Async loop method for receiving messages from the
TSUN cloud (client-side)
async_create_remote(): Establish a client connection to the TSUN cloud
async_publ_mqtt(): Publish data to MQTT broker
close(): Release method which must be called before an instance can be
destroyed
'''
def __init__(self, reader: StreamReader, writer: StreamWriter, addr):
super().__init__(reader, writer, addr, None, True)
self.__ha_restarts = -1
async def async_create_remote(self) -> None:
'''Establish a client connection to the TSUN cloud'''
tsun = Config.get('tsun')
host = tsun['host']
port = tsun['port']
addr = (host, port)
try:
logging.info(f'[{self.node_id}] Connect to {addr}')
connect = asyncio.open_connection(host, port)
reader, writer = await connect
self.remoteStream = ConnectionG3(reader, writer, addr, self,
False, self.id_str)
logging.info(f'[{self.remoteStream.node_id}:'
f'{self.remoteStream.conn_no}] '
f'Connected to {addr}')
asyncio.create_task(self.client_loop(addr))
except (ConnectionRefusedError, TimeoutError) as error:
logging.info(f'{error}')
except Exception:
self.inc_counter('SW_Exception')
logging.error(
f"Inverter: Exception for {addr}:\n"
f"{traceback.format_exc()}")
async def async_publ_mqtt(self) -> None:
'''publish data to MQTT broker'''
# check if new inverter or collector infos are available or when the
# home assistant has changed the status back to online
try:
if (('inverter' in self.new_data and self.new_data['inverter'])
or ('collector' in self.new_data and
self.new_data['collector'])
or self.mqtt.ha_restarts != self.__ha_restarts):
await self._register_proxy_stat_home_assistant()
await self.__register_home_assistant()
self.__ha_restarts = self.mqtt.ha_restarts
for key in self.new_data:
await self.__async_publ_mqtt_packet(key)
for key in Infos.new_stat_data:
await self._async_publ_mqtt_proxy_stat(key)
except MqttCodeError as error:
logging.error(f'Mqtt except: {error}')
except Exception:
self.inc_counter('SW_Exception')
logging.error(
f"Inverter: Exception:\n"
f"{traceback.format_exc()}")
async def __async_publ_mqtt_packet(self, key):
db = self.db.db
if key in db and self.new_data[key]:
data_json = json.dumps(db[key])
node_id = self.node_id
logger_mqtt.debug(f'{key}: {data_json}')
await self.mqtt.publish(f'{self.entity_prfx}{node_id}{key}', data_json) # noqa: E501
self.new_data[key] = False
async def __register_home_assistant(self) -> None:
'''register all our topics at home assistant'''
for data_json, component, node_id, id in self.db.ha_confs(
self.entity_prfx, self.node_id, self.unique_id,
self.sug_area):
logger_mqtt.debug(f"MQTT Register: cmp:'{component}'"
f" node_id:'{node_id}' {data_json}")
await self.mqtt.publish(f"{self.discovery_prfx}{component}"
f"/{node_id}{id}/config", data_json)
self.db.reg_clr_at_midnight(f'{self.entity_prfx}{self.node_id}')
def close(self) -> None:
logging.debug(f'InverterG3.close() l{self.l_addr} | r{self.r_addr}')
super().close() # call close handler in the parent class
# logging.info(f'Inverter refs: {gc.get_referrers(self)}')
def __del__(self):
logging.debug("InverterG3.__del__")
super().__del__()
class InverterG3(InverterBase):
def __init__(self, reader: StreamReader, writer: StreamWriter):
super().__init__(reader, writer, 'tsun', Talent)


@@ -1,20 +1,15 @@
import struct
import logging
import time
from zoneinfo import ZoneInfo
from datetime import datetime
from tzlocal import get_localzone
if __name__ == "app.src.gen3.talent":
from app.src.messages import hex_dump_memory, Message, State
from app.src.modbus import Modbus
from app.src.my_timer import Timer
from app.src.config import Config
from app.src.gen3.infos_g3 import InfosG3
else: # pragma: no cover
from messages import hex_dump_memory, Message, State
from modbus import Modbus
from my_timer import Timer
from config import Config
from gen3.infos_g3 import InfosG3
from async_ifc import AsyncIfc
from messages import Message, State
from modbus import Modbus
from cnf.config import Config
from gen3.infos_g3 import InfosG3
from infos import Register
logger = logging.getLogger('msg')
@@ -37,11 +32,20 @@ class Control:
class Talent(Message):
MB_START_TIMEOUT = 40
MB_REGULAR_TIMEOUT = 60
TXT_UNKNOWN_CTRL = 'Unknown Ctrl'
def __init__(self, server_side: bool, id_str=b''):
super().__init__(server_side, self.send_modbus_cb, mb_timeout=11)
def __init__(self, inverter, addr, ifc: "AsyncIfc", server_side: bool,
client_mode: bool = False, id_str=b''):
super().__init__('G3', ifc, server_side, self.send_modbus_cb,
mb_timeout=15)
_ = inverter
ifc.rx_set_cb(self.read)
ifc.prot_set_timeout_cb(self._timeout)
ifc.prot_set_init_new_client_conn_cb(self._init_new_client_conn)
ifc.prot_set_update_header_cb(self._update_header)
self.addr = addr
self.conn_no = ifc.get_conn_no()
self.await_conn_resp_cnt = 0
self.id_str = id_str
self.contact_name = b''
@@ -52,25 +56,27 @@ class Talent(Message):
0x00: self.msg_contact_info,
0x13: self.msg_ota_update,
0x22: self.msg_get_time,
0x99: self.msg_heartbeat,
0x71: self.msg_collector_data,
# 0x76:
0x77: self.msg_modbus,
# 0x78:
0x87: self.msg_modbus2,
0x04: self.msg_inverter_data,
}
self.log_lvl = {
0x00: logging.INFO,
0x13: logging.INFO,
0x22: logging.INFO,
0x99: logging.INFO,
0x71: logging.INFO,
# 0x76:
0x77: self.get_modbus_log_lvl,
# 0x78:
0x87: self.get_modbus_log_lvl,
0x04: logging.INFO,
}
self.modbus_elms = 0 # for unit tests
self.node_id = 'G3' # will be overwritten in __set_serial_no
self.mb_timer = Timer(self.mb_timout_cb, self.node_id)
self.sensor_list = 0
'''
Our public methods
@@ -82,8 +88,6 @@ class Talent(Message):
# deallocated by the garbage collector ==> we get a memory leak
self.switch.clear()
self.log_lvl.clear()
self.state = State.closed
self.mb_timer.close()
super().close()
def __set_serial_no(self, serial_no: str):
@@ -96,10 +100,9 @@ class Talent(Message):
if serial_no in inverters:
inv = inverters[serial_no]
self.node_id = inv['node_id']
self.sug_area = inv['suggested_area']
logger.debug(f'SerialNo {serial_no} allowed! area:{self.sug_area}') # noqa: E501
self._set_config_parms(inv)
self.db.set_pv_module_details(inv)
logger.debug(f'SerialNo {serial_no} allowed! area:{self.sug_area}') # noqa: E501
else:
self.node_id = ''
self.sug_area = ''
@@ -111,43 +114,47 @@ class Talent(Message):
logger.debug(f'SerialNo {serial_no} not known but accepted!')
self.unique_id = serial_no
self.db.set_db_def_value(Register.COLLECTOR_SNR, serial_no)
def read(self) -> float:
'''process all received messages in the _recv_buffer'''
self._read()
while True:
if not self.header_valid:
self.__parse_header(self.ifc.rx_peek(), self.ifc.rx_len())
if not self.header_valid:
self.__parse_header(self._recv_buffer, len(self._recv_buffer))
if self.header_valid and \
self.ifc.rx_len() >= (self.header_len + self.data_len):
if self.state == State.init:
self.state = State.received # received 1st package
if self.header_valid and len(self._recv_buffer) >= (self.header_len +
self.data_len):
if self.state == State.init:
self.state = State.received # received 1st package
log_lvl = self.log_lvl.get(self.msg_id, logging.WARNING)
if callable(log_lvl):
log_lvl = log_lvl()
log_lvl = self.log_lvl.get(self.msg_id, logging.WARNING)
if callable(log_lvl):
log_lvl = log_lvl()
self.ifc.rx_log(log_lvl, f'Received from {self.addr}:'
f' BufLen: {self.ifc.rx_len()}'
f' HdrLen: {self.header_len}'
f' DtaLen: {self.data_len}')
hex_dump_memory(log_lvl, f'Received from {self.addr}:',
self._recv_buffer, self.header_len+self.data_len)
self.__set_serial_no(self.id_str.decode("utf-8"))
self.__dispatch_msg()
self.__flush_recv_msg()
else:
return 0 # do not wait before sending a response
self.__set_serial_no(self.id_str.decode("utf-8"))
self.__dispatch_msg()
self.__flush_recv_msg()
return 0.5 # wait 500ms before sending a response
def forward(self, buffer, buflen) -> None:
def forward(self) -> None:
'''add the actual receive msg to the forwarding queue'''
tsun = Config.get('tsun')
if tsun['enabled']:
self._forward_buffer = buffer[:buflen]
hex_dump_memory(logging.DEBUG, 'Store for forwarding:',
buffer, buflen)
buflen = self.header_len+self.data_len
buffer = self.ifc.rx_peek(buflen)
self.ifc.fwd_add(buffer)
self.ifc.fwd_log(logging.DEBUG, 'Store for forwarding:')
self.__parse_header(self._forward_buffer,
len(self._forward_buffer))
fnc = self.switch.get(self.msg_id, self.msg_unknown)
logger.info(self.__flow_str(self.server_side, 'forwrd') +
f' Ctl: {int(self.ctrl):#02x} Msg: {fnc.__name__!r}')
return
def send_modbus_cb(self, modbus_pdu: bytearray, log_lvl: int, state: str):
if self.state != State.up:
@@ -156,34 +163,27 @@ class Talent(Message):
return
self.__build_header(0x70, 0x77)
self._send_buffer += b'\x00\x01\xa3\x28' # fixme
self._send_buffer += struct.pack('!B', len(modbus_pdu))
self._send_buffer += modbus_pdu
self.ifc.tx_add(b'\x00\x01\xa3\x28') # magic ?
self.ifc.tx_add(struct.pack('!B', len(modbus_pdu)))
self.ifc.tx_add(modbus_pdu)
self.__finish_send_msg()
hex_dump_memory(log_lvl, f'Send Modbus {state}:{self.addr}:',
self._send_buffer, len(self._send_buffer))
self.writer.write(self._send_buffer)
self._send_buffer = bytearray(0) # self._send_buffer[sent:]
def _send_modbus_cmd(self, func, addr, val, log_lvl) -> None:
if self.state != State.up:
logger.log(log_lvl, f'[{self.node_id}] ignore MODBUS cmd,'
' as the state is not UP')
return
self.mb.build_msg(Modbus.INV_ADDR, func, addr, val, log_lvl)
async def send_modbus_cmd(self, func, addr, val, log_lvl) -> None:
self._send_modbus_cmd(func, addr, val, log_lvl)
self.ifc.tx_log(log_lvl, f'Send Modbus {state}:{self.addr}:')
self.ifc.tx_flush()
def mb_timout_cb(self, exp_cnt):
self.mb_timer.start(self.MB_REGULAR_TIMEOUT)
self.mb_timer.start(self.mb_timeout)
if self.mb_scan:
self._send_modbus_scan()
return
if 0 == (exp_cnt % 30):
if 2 == (exp_cnt % 30):
# logging.info("Regular Modbus Status request")
self._send_modbus_cmd(Modbus.READ_REGS, 0x2007, 2, logging.DEBUG)
self._send_modbus_cmd(Modbus.INV_ADDR, Modbus.READ_REGS, 0x2000,
96, logging.DEBUG)
else:
self._send_modbus_cmd(Modbus.READ_REGS, 0x3008, 21, logging.DEBUG)
self._send_modbus_cmd(Modbus.INV_ADDR, Modbus.READ_REGS, 0x3000,
48, logging.DEBUG)
def _init_new_client_conn(self) -> bool:
contact_name = self.contact_name
@@ -192,9 +192,9 @@ class Talent(Message):
self.msg_id = 0
self.await_conn_resp_cnt += 1
self.__build_header(0x91)
self._send_buffer += struct.pack(f'!{len(contact_name)+1}p'
f'{len(contact_mail)+1}p',
contact_name, contact_mail)
self.ifc.tx_add(struct.pack(f'!{len(contact_name)+1}p'
f'{len(contact_mail)+1}p',
contact_name, contact_mail))
self.__finish_send_msg()
return True
@@ -218,31 +218,43 @@ class Talent(Message):
return switch.get(type, '???')
def _timestamp(self): # pragma: no cover
if False:
# utc as epoche
ts = time.time()
else:
# convert localtime in epoche
ts = (datetime.now() - datetime(1970, 1, 1)).total_seconds()
'''returns the timestamp of the inverter as localtime
since 1.1.1970 in msec'''
# convert localtime to epoch
ts = (datetime.now() - datetime(1970, 1, 1)).total_seconds()
return round(ts*1000)
def _utcfromts(self, ts: float):
'''converts inverter timestamp into unix time (epoch)'''
dt = datetime.fromtimestamp(ts/1000, tz=ZoneInfo("UTC")). \
replace(tzinfo=get_localzone())
return dt.timestamp()
def _utc(self): # pragma: no cover
'''returns unix time (epoch)'''
return datetime.now().timestamp()
def _update_header(self, _forward_buffer):
'''update header for message before forwarding,
add time offset to timestamp'''
_len = len(_forward_buffer)
result = struct.unpack_from('!lB', _forward_buffer, 0)
id_len = result[1] # len of variable id string
if _len < 2*id_len + 21:
return
ofs = 0
while ofs < _len:
result = struct.unpack_from('!lB', _forward_buffer, 0)
msg_len = 4 + result[0]
id_len = result[1] # len of variable id string
if _len < 2*id_len + 21:
return
result = struct.unpack_from('!B', _forward_buffer, id_len+6)
msg_code = result[0]
if msg_code == 0x71 or msg_code == 0x04:
result = struct.unpack_from('!q', _forward_buffer, 13+2*id_len)
ts = result[0] + self.ts_offset
logger.debug(f'offset: {self.ts_offset:08x}'
f' proxy-time: {ts:08x}')
struct.pack_into('!q', _forward_buffer, 13+2*id_len, ts)
result = struct.unpack_from('!B', _forward_buffer, id_len+6)
msg_code = result[0]
if msg_code == 0x71 or msg_code == 0x04:
result = struct.unpack_from('!q', _forward_buffer, 13+2*id_len)
ts = result[0] + self.ts_offset
logger.debug(f'offset: {self.ts_offset:08x}'
f' proxy-time: {ts:08x}')
struct.pack_into('!q', _forward_buffer, 13+2*id_len, ts)
ofs += msg_len
# check if there is a complete header in the buffer, parse it
# and set
@@ -259,8 +271,15 @@ class Talent(Message):
if (buf_len < 5): # enough bytes to read len and id_len?
return
result = struct.unpack_from('!lB', buf, 0)
len = result[0] # len of complete message
msg_len = result[0] # len of complete message
id_len = result[1] # len of variable id string
if id_len > 17:
logger.warning(f'len of ID string must == 16 but is {id_len}')
self.inc_counter('Invalid_Msg_Format')
# erase broken recv buffer
self.ifc.rx_clear()
return
hdr_len = 5+id_len+2
@@ -273,24 +292,24 @@ class Talent(Message):
self.id_str = result[0]
self.ctrl = Control(result[1])
self.msg_id = result[2]
self.data_len = len-id_len-3
self.data_len = msg_len-id_len-3
self.header_len = hdr_len
self.header_valid = True
return
def __build_header(self, ctrl, msg_id=None) -> None:
if not msg_id:
msg_id = self.msg_id
self.send_msg_ofs = len(self._send_buffer)
self._send_buffer += struct.pack(f'!l{len(self.id_str)+1}pBB',
0, self.id_str, ctrl, msg_id)
self.send_msg_ofs = self.ifc.tx_len()
self.ifc.tx_add(struct.pack(f'!l{len(self.id_str)+1}pBB',
0, self.id_str, ctrl, msg_id))
fnc = self.switch.get(msg_id, self.msg_unknown)
logger.info(self.__flow_str(self.server_side, 'tx') +
f' Ctl: {int(ctrl):#02x} Msg: {fnc.__name__!r}')
def __finish_send_msg(self) -> None:
_len = len(self._send_buffer) - self.send_msg_ofs
struct.pack_into('!l', self._send_buffer, self.send_msg_ofs, _len-4)
_len = self.ifc.tx_len() - self.send_msg_ofs
struct.pack_into('!l', self.ifc.tx_peek(), self.send_msg_ofs,
_len-4)
def __dispatch_msg(self) -> None:
fnc = self.switch.get(self.msg_id, self.msg_unknown)
@@ -304,7 +323,7 @@ class Talent(Message):
f' Ctl: {int(self.ctrl):#02x} Msg: {fnc.__name__!r}')
def __flush_recv_msg(self) -> None:
self._recv_buffer = self._recv_buffer[(self.header_len+self.data_len):]
self.ifc.rx_get(self.header_len+self.data_len)
self.header_valid = False
'''
@@ -314,65 +333,103 @@ class Talent(Message):
if self.ctrl.is_ind():
if self.server_side and self.__process_contact_info():
self.__build_header(0x91)
self._send_buffer += b'\x01'
self.ifc.tx_add(b'\x01')
self.__finish_send_msg()
# don't forward this contact info here, we will build one
# when the remote connection is established
elif self.await_conn_resp_cnt > 0:
self.await_conn_resp_cnt -= 1
else:
self.forward(self._recv_buffer, self.header_len+self.data_len)
return
self.forward()
else:
logger.warning('Unknown Ctrl')
logger.warning(self.TXT_UNKNOWN_CTRL)
self.inc_counter('Unknown_Ctrl')
self.forward(self._recv_buffer, self.header_len+self.data_len)
self.forward()
def __process_contact_info(self) -> bool:
result = struct.unpack_from('!B', self._recv_buffer, self.header_len)
buf = self.ifc.rx_peek()
result = struct.unpack_from('!B', buf, self.header_len)
name_len = result[0]
if self.data_len < name_len+2:
if self.data_len == 1: # this is a response with one status byte
return False
result = struct.unpack_from(f'!{name_len+1}pB', self._recv_buffer,
self.header_len)
self.contact_name = result[0]
mail_len = result[1]
logger.info(f'name: {self.contact_name}')
if self.data_len >= name_len+2:
result = struct.unpack_from(f'!{name_len+1}pB', buf,
self.header_len)
self.contact_name = result[0]
mail_len = result[1]
logger.info(f'name: {self.contact_name}')
result = struct.unpack_from(f'!{mail_len+1}p', self._recv_buffer,
self.header_len+name_len+1)
self.contact_mail = result[0]
result = struct.unpack_from(f'!{mail_len+1}p', buf,
self.header_len+name_len+1)
self.contact_mail = result[0]
logger.info(f'mail: {self.contact_mail}')
return True
def msg_get_time(self):
if self.ctrl.is_ind():
if self.data_len == 0:
self.state = State.pend # block MODBUS cmds
self.mb_timer.start(self.MB_START_TIMEOUT)
if self.state == State.up:
self.state = State.pend # block MODBUS cmds
ts = self._timestamp()
logger.debug(f'time: {ts:08x}')
self.__build_header(0x91)
self._send_buffer += struct.pack('!q', ts)
self.ifc.tx_add(struct.pack('!q', ts))
self.__finish_send_msg()
elif self.data_len >= 8:
ts = self._timestamp()
result = struct.unpack_from('!q', self._recv_buffer,
result = struct.unpack_from('!q', self.ifc.rx_peek(),
self.header_len)
self.ts_offset = result[0]-ts
if self.ifc.remote.stream:
self.ifc.remote.stream.ts_offset = self.ts_offset
logger.debug(f'tsun-time: {int(result[0]):08x}'
f' proxy-time: {ts:08x}'
f' offset: {self.ts_offset}')
return # ignore received response
else:
logger.warning('Unknown Ctrl')
logger.warning(self.TXT_UNKNOWN_CTRL)
self.inc_counter('Unknown_Ctrl')
self.forward(self._recv_buffer, self.header_len+self.data_len)
self.forward()
def msg_heartbeat(self):
if self.ctrl.is_ind():
if self.data_len == 9:
self.state = State.up # allow MODBUS cmds
if (self.modbus_polling):
self.mb_timer.start(self.mb_first_timeout)
self.db.set_db_def_value(Register.POLLING_INTERVAL,
self.mb_timeout)
self.__build_header(0x99)
self.ifc.tx_add(b'\x02')
self.__finish_send_msg()
result = struct.unpack_from('!Bq', self.ifc.rx_peek(),
self.header_len)
resp_code = result[0]
ts = result[1]+self.ts_offset
logger.debug(f'inv-time: {int(result[1]):08x}'
f' tsun-time: {ts:08x}'
f' offset: {self.ts_offset}')
struct.pack_into('!Bq', self.ifc.rx_peek(),
self.header_len, resp_code, ts)
elif self.ctrl.is_resp():
result = struct.unpack_from('!B', self.ifc.rx_peek(),
self.header_len)
resp_code = result[0]
logging.debug(f'Heartbeat-RespCode: {resp_code}')
return
else:
logger.warning(self.TXT_UNKNOWN_CTRL)
self.inc_counter('Unknown_Ctrl')
self.forward()
def parse_msg_header(self):
result = struct.unpack_from('!lB', self._recv_buffer, self.header_len)
result = struct.unpack_from('!lB', self.ifc.rx_peek(),
self.header_len)
data_id = result[0] # data id (sensor-list id)
id_len = result[1] # len of variable id string
@@ -380,92 +437,146 @@ class Talent(Message):
msg_hdr_len = 5+id_len+9
result = struct.unpack_from(f'!{id_len+1}pBq', self._recv_buffer,
result = struct.unpack_from(f'!{id_len+1}pBq', self.ifc.rx_peek(),
self.header_len + 4)
timestamp = result[2]
logger.debug(f'ID: {result[0]} B: {result[1]}')
logger.debug(f'time: {result[2]:08x}')
logger.debug(f'time: {timestamp:08x}')
# logger.info(f'time: {datetime.utcfromtimestamp(result[2]).strftime(
# "%Y-%m-%d %H:%M:%S")}')
return msg_hdr_len
return msg_hdr_len, data_id, timestamp
def msg_collector_data(self):
if self.ctrl.is_ind():
self.__build_header(0x99)
self._send_buffer += b'\x01'
self.ifc.tx_add(b'\x01')
self.__finish_send_msg()
self.__process_data()
self.__process_data(False)
elif self.ctrl.is_resp():
return # ignore received response
else:
logger.warning('Unknown Ctrl')
logger.warning(self.TXT_UNKNOWN_CTRL)
self.inc_counter('Unknown_Ctrl')
self.forward(self._recv_buffer, self.header_len+self.data_len)
self.forward()
def msg_inverter_data(self):
if self.ctrl.is_ind():
self.__build_header(0x99)
self._send_buffer += b'\x01'
self.ifc.tx_add(b'\x01')
self.__finish_send_msg()
self.__process_data()
self.__process_data(True)
self.state = State.up # allow MODBUS cmds
if (self.modbus_polling):
self.mb_timer.start(self.mb_first_timeout)
self.db.set_db_def_value(Register.POLLING_INTERVAL,
self.mb_timeout)
elif self.ctrl.is_resp():
return # ignore received response
else:
logger.warning('Unknown Ctrl')
logger.warning(self.TXT_UNKNOWN_CTRL)
self.inc_counter('Unknown_Ctrl')
self.forward(self._recv_buffer, self.header_len+self.data_len)
self.forward()
def __process_data(self):
msg_hdr_len = self.parse_msg_header()
def __build_model_name(self):
db = self.db
model = db.get_db_value(Register.EQUIPMENT_MODEL, None)
if model:
return
max_pow = db.get_db_value(Register.MAX_DESIGNED_POWER, 0)
if max_pow == 3000:
model = f'TSOL-MS{max_pow}'
self.db.set_db_def_value(Register.EQUIPMENT_MODEL, model)
self.db.set_db_def_value(Register.MANUFACTURER, 'TSUN')
self.db.set_db_def_value(Register.NO_INPUTS, 4)
for key, update in self.db.parse(self._recv_buffer, self.header_len
+ msg_hdr_len, self.node_id):
def __process_data(self, inv_data: bool):
msg_hdr_len, data_id, ts = self.parse_msg_header()
if inv_data:
# handle register mapping
if 0 == self.sensor_list:
self.sensor_list = data_id
self.db.set_db_def_value(Register.SENSOR_LIST,
f"{self.sensor_list:08x}")
logging.debug(f"Use sensor-list: {self.sensor_list:#08x}"
f" for '{self.unique_id}'")
if data_id != self.sensor_list:
logging.warning(f'Unexpected Sensor-List:{data_id:08x}'
f' (!={self.sensor_list:08x})')
# ignore replays for inverter data
age = self._utc() - self._utcfromts(ts)
age = age/(3600*24)
logger.debug(f"Age: {age} days")
if age > 1: # is a replay?
return
inv_update = False
for key, update in self.db.parse(self.ifc.rx_peek(), self.header_len
+ msg_hdr_len, data_id, self.node_id):
if update:
if key == 'inverter':
inv_update = True
self._set_mqtt_timestamp(key, self._utcfromts(ts))
self.new_data[key] = True
if inv_update:
self.__build_model_name()
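
The timestamps in Talent messages are the inverter's local wall-clock time in milliseconds, so _utcfromts() reinterprets them as local time and converts them back to unix time before the replay check above compares them with _utc(). A minimal sketch of that conversion with a fixed time zone instead of tzlocal's get_localzone() (the timestamp value and zone are made-up examples):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    ts_ms = 1_700_000_000_000                        # inverter local time, msec "since 1970"
    dt = datetime.fromtimestamp(ts_ms / 1000, tz=ZoneInfo("UTC")) \
            .replace(tzinfo=ZoneInfo("Europe/Berlin"))   # relabel the wall clock as local time
    epoch = dt.timestamp()                           # real unix time, here ts_ms/1000 - 3600 (CET)
    age_days = (datetime.now().timestamp() - epoch) / (3600 * 24)
    # __process_data() drops the inverter data when age_days > 1 (treated as a replay)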
def msg_ota_update(self):
if self.ctrl.is_req():
self.inc_counter('OTA_Start_Msg')
elif self.ctrl.is_ind():
pass
pass # Ok, nothing to do
else:
logger.warning('Unknown Ctrl')
logger.warning(self.TXT_UNKNOWN_CTRL)
self.inc_counter('Unknown_Ctrl')
self.forward(self._recv_buffer, self.header_len+self.data_len)
self.forward()
def parse_modbus_header(self):
msg_hdr_len = 5
result = struct.unpack_from('!lBB', self._recv_buffer,
result = struct.unpack_from('!lBB', self.ifc.rx_peek(),
self.header_len)
modbus_len = result[1]
# logger.debug(f'Ref: {result[0]}')
# logger.debug(f'Modbus MsgLen: {modbus_len} Func:{result[2]}')
return msg_hdr_len, modbus_len
def parse_modbus_header2(self):
msg_hdr_len = 6
result = struct.unpack_from('!lBBB', self.ifc.rx_peek(),
self.header_len)
modbus_len = result[2]
return msg_hdr_len, modbus_len
def get_modbus_log_lvl(self) -> int:
if self.ctrl.is_req():
return logging.INFO
elif self.ctrl.is_ind():
if self.server_side:
return self.mb.last_log_lvl
elif self.ctrl.is_ind() and self.server_side:
return self.mb.last_log_lvl
return logging.WARNING
def msg_modbus(self):
hdr_len, modbus_len = self.parse_modbus_header()
data = self._recv_buffer[self.header_len:
self.header_len+self.data_len]
hdr_len, _ = self.parse_modbus_header()
self.__msg_modbus(hdr_len)
def msg_modbus2(self):
hdr_len, _ = self.parse_modbus_header2()
self.__msg_modbus(hdr_len)
def __msg_modbus(self, hdr_len):
data = self.ifc.rx_peek()[self.header_len:
self.header_len+self.data_len]
if self.ctrl.is_req():
if self.remoteStream.mb.recv_req(data[hdr_len:],
self.remoteStream.
msg_forward):
rstream = self.ifc.remote.stream
if rstream.mb.recv_req(data[hdr_len:], rstream.msg_forward):
self.inc_counter('Modbus_Command')
else:
self.inc_counter('Invalid_Msg_Format')
@@ -476,22 +587,25 @@ class Talent(Message):
logger.warning('Unknown Message')
self.inc_counter('Unknown_Msg')
return
if (self.mb_scan):
modbus_msg_len = self.data_len - hdr_len
self._dump_modbus_scan(data, hdr_len, modbus_msg_len)
for key, update, _ in self.mb.recv_resp(self.db, data[
hdr_len:],
self.node_id):
hdr_len:]):
if update:
self._set_mqtt_timestamp(key, self._utc())
self.new_data[key] = True
self.modbus_elms += 1 # count for unit tests
else:
logger.warning('Unknown Ctrl')
logger.warning(self.TXT_UNKNOWN_CTRL)
self.inc_counter('Unknown_Ctrl')
self.forward(self._recv_buffer, self.header_len+self.data_len)
self.forward()
def msg_forward(self):
self.forward(self._recv_buffer, self.header_len+self.data_len)
self.forward()
def msg_unknown(self):
logger.warning(f"Unknow Msg: ID:{self.msg_id}")
self.inc_counter('Unknown_Msg')
self.forward(self._recv_buffer, self.header_len+self.data_len)
self.forward()
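
The struct format strings in __build_header(), __finish_send_msg() and __parse_header() define the Talent frame layout: a 4-byte big-endian length (excluding the length field itself), a Pascal-style id string, a control byte, a message id and the payload. A minimal round-trip sketch of that framing (the id string and timestamp are made-up example values):

    import struct

    id_str = b'R170000000000001'                 # 16-byte device id, example value
    ctrl, msg_id = 0x91, 0x22                    # e.g. a get-time response

    # __build_header(): length placeholder, Pascal-style id string, ctrl, msg id
    frame = bytearray(struct.pack(f'!l{len(id_str)+1}pBB', 0, id_str, ctrl, msg_id))
    frame += struct.pack('!q', 1_700_000_000_000)        # 8-byte timestamp payload

    # __finish_send_msg(): patch the length field (total size minus the 4 length bytes)
    struct.pack_into('!l', frame, 0, len(frame) - 4)

    # __parse_header(): read length and id length, then id string, ctrl and msg id
    msg_len, id_len = struct.unpack_from('!lB', frame, 0)
    rcv_id, rcv_ctrl, rcv_msg = struct.unpack_from(f'!{id_len+1}pBB', frame, 4)
    data_len = msg_len - id_len - 3                      # 8 bytes of payload here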


@@ -1,42 +0,0 @@
import logging
# import gc
from asyncio import StreamReader, StreamWriter
from async_stream import AsyncStream
from gen3plus.solarman_v5 import SolarmanV5
logger = logging.getLogger('conn')
class ConnectionG3P(AsyncStream, SolarmanV5):
def __init__(self, reader: StreamReader, writer: StreamWriter,
addr, remote_stream: 'ConnectionG3P',
server_side: bool) -> None:
AsyncStream.__init__(self, reader, writer, addr)
SolarmanV5.__init__(self, server_side)
self.remoteStream: 'ConnectionG3P' = remote_stream
'''
Our public methods
'''
def close(self):
AsyncStream.close(self)
SolarmanV5.close(self)
# logger.info(f'AsyncStream refs: {gc.get_referrers(self)}')
async def async_create_remote(self) -> None:
pass
async def async_publ_mqtt(self) -> None:
pass
def healthy(self) -> bool:
logger.debug('ConnectionG3P healthy()')
return AsyncStream.healthy(self)
'''
Our private methods
'''
def __del__(self):
super().__del__()


@@ -1,36 +1,87 @@
import struct
from typing import Generator
from itertools import chain
if __name__ == "app.src.gen3plus.infos_g3p":
from app.src.infos import Infos, Register
else: # pragma: no cover
from infos import Infos, Register
from infos import Infos, Register, ProxyMode, Fmt
class RegisterFunc:
@staticmethod
def prod_sum(info: Infos, arr: dict) -> None | int:
result = 0
for sum in arr:
prod = 1
for factor in sum:
val = info.get_db_value(factor)
if val is None:
return None
prod = prod * val
result += prod
return result
@staticmethod
def cmp_values(info: Infos, params: map) -> None | int:
try:
val = info.get_db_value(params['reg'])
if val < params['cmp_val']:
return params['res'][0]
if val == params['cmp_val']:
return params['res'][1]
return params['res'][2]
except Exception:
pass
return None
class RegisterMap:
# make the class read/only by using __slots__
__slots__ = ()
FMT_2_16BIT_VAL = '!HH'
FMT_3_16BIT_VAL = '!HHH'
FMT_4_16BIT_VAL = '!HHHH'
map = {
# 0x41020007: {'reg': Register.DEVICE_SNR, 'fmt': '<L'}, # noqa: E501
0x41020018: {'reg': Register.DATA_UP_INTERVAL, 'fmt': '<B', 'ratio': 60}, # noqa: E501
0x41020019: {'reg': Register.COLLECT_INTERVAL, 'fmt': '<B', 'eval': 'round(result/60)'}, # noqa: E501
0x41020018: {'reg': Register.DATA_UP_INTERVAL, 'fmt': '<B', 'ratio': 60, 'dep': ProxyMode.SERVER}, # noqa: E501
0x41020019: {'reg': Register.COLLECT_INTERVAL, 'fmt': '<B', 'quotient': 60, 'dep': ProxyMode.SERVER}, # noqa: E501
0x4102001a: {'reg': Register.HEARTBEAT_INTERVAL, 'fmt': '<B', 'ratio': 1}, # noqa: E501
0x4102001c: {'reg': Register.SIGNAL_STRENGTH, 'fmt': '<B', 'ratio': 1}, # noqa: E501
0x4102001b: {'reg': None, 'fmt': '<B', 'const': 1}, # noqa: E501 Max No Of Connected Devices
0x4102001c: {'reg': Register.SIGNAL_STRENGTH, 'fmt': '<B', 'ratio': 1, 'dep': ProxyMode.SERVER}, # noqa: E501
0x4102001d: {'reg': None, 'fmt': '<B', 'const': 1}, # noqa: E501
0x4102001e: {'reg': Register.CHIP_MODEL, 'fmt': '!40s'}, # noqa: E501
0x41020046: {'reg': Register.MAC_ADDR, 'fmt': '!6B', 'func': Fmt.mac}, # noqa: E501
0x4102004c: {'reg': Register.IP_ADDRESS, 'fmt': '!16s'}, # noqa: E501
0x4102005c: {'reg': None, 'fmt': '<B', 'const': 15}, # noqa: E501
0x4102005e: {'reg': None, 'fmt': '<B', 'const': 1}, # noqa: E501 No Of Sensors (ListLen)
0x4102005f: {'reg': Register.SENSOR_LIST, 'fmt': '<H', 'func': Fmt.hex4}, # noqa: E501
0x41020061: {'reg': None, 'fmt': '<HB', 'const': (15, 255)}, # noqa: E501
0x41020064: {'reg': Register.COLLECTOR_FW_VERSION, 'fmt': '!40s'}, # noqa: E501
0x4201001c: {'reg': Register.POWER_ON_TIME, 'fmt': '<H', 'ratio': 1}, # noqa: E501
0x4102008c: {'reg': None, 'fmt': '<BB', 'const': (254, 254)}, # noqa: E501
0x4102008e: {'reg': None, 'fmt': '<B'}, # noqa: E501 Encryption Certificate File Status
0x4102008f: {'reg': None, 'fmt': '!40s'}, # noqa: E501
0x410200b7: {'reg': Register.SSID, 'fmt': '!40s'}, # noqa: E501
}
map_02b0 = {
0x4201000c: {'reg': Register.SENSOR_LIST, 'fmt': '<H', 'func': Fmt.hex4}, # noqa: E501
0x4201001c: {'reg': Register.POWER_ON_TIME, 'fmt': '<H', 'ratio': 1, 'dep': ProxyMode.SERVER}, # noqa: E501, or packet number
0x42010020: {'reg': Register.SERIAL_NUMBER, 'fmt': '!16s'}, # noqa: E501
# Start MODBUS Block: 0x3000 (R/O Measurements)
0x420100c0: {'reg': Register.INVERTER_STATUS, 'fmt': '!H'}, # noqa: E501
0x420100d0: {'reg': Register.VERSION, 'fmt': '!H', 'eval': "f'V{(result>>12)}.{(result>>8)&0xf}.{(result>>4)&0xf}{result&0xf}'"}, # noqa: E501
0x420100c2: {'reg': Register.DETECT_STATUS_1, 'fmt': '!H'}, # noqa: E501
0x420100c4: {'reg': Register.DETECT_STATUS_2, 'fmt': '!H'}, # noqa: E501
0x420100c6: {'reg': Register.EVENT_ALARM, 'fmt': '!H'}, # noqa: E501
0x420100c8: {'reg': Register.EVENT_FAULT, 'fmt': '!H'}, # noqa: E501
0x420100ca: {'reg': Register.EVENT_BF1, 'fmt': '!H'}, # noqa: E501
0x420100cc: {'reg': Register.EVENT_BF2, 'fmt': '!H'}, # noqa: E501
# 0x420100ce
0x420100d0: {'reg': Register.VERSION, 'fmt': '!H', 'func': Fmt.version}, # noqa: E501
0x420100d2: {'reg': Register.GRID_VOLTAGE, 'fmt': '!H', 'ratio': 0.1}, # noqa: E501
0x420100d4: {'reg': Register.GRID_CURRENT, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501
0x420100d6: {'reg': Register.GRID_FREQUENCY, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501
0x420100d8: {'reg': Register.INVERTER_TEMP, 'fmt': '!H', 'eval': 'result-40'}, # noqa: E501
# 0x420100d8: {'reg': Register.INVERTER_TEMP, 'fmt': '!H'}, # noqa: E501
0x420100d8: {'reg': Register.INVERTER_TEMP, 'fmt': '!H', 'offset': -40}, # noqa: E501
# 0x420100da
0x420100dc: {'reg': Register.RATED_POWER, 'fmt': '!H', 'ratio': 1}, # noqa: E501
0x420100de: {'reg': Register.OUTPUT_POWER, 'fmt': '!H', 'ratio': 0.1}, # noqa: E501
0x420100e0: {'reg': Register.PV1_VOLTAGE, 'fmt': '!H', 'ratio': 0.1}, # noqa: E501
@@ -55,20 +106,126 @@ class RegisterMap:
0x4201010c: {'reg': Register.PV3_TOTAL_GENERATION, 'fmt': '!L', 'ratio': 0.01}, # noqa: E501
0x42010110: {'reg': Register.PV4_DAILY_GENERATION, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501
0x42010112: {'reg': Register.PV4_TOTAL_GENERATION, 'fmt': '!L', 'ratio': 0.01}, # noqa: E501
0x42010126: {'reg': Register.MAX_DESIGNED_POWER, 'fmt': '!H', 'ratio': 1}, # noqa: E501
0x42010170: {'reg': Register.NO_INPUTS, 'fmt': '!B'}, # noqa: E501
0x42010116: {'reg': Register.INV_UNKNOWN_1, 'fmt': '!H'}, # noqa: E501
# Start MODBUS Block: 0x2000 (R/W Config Parameters)
0x42010118: {'reg': Register.BOOT_STATUS, 'fmt': '!H'},
0x4201011a: {'reg': Register.DSP_STATUS, 'fmt': '!H'},
0x4201011c: {'reg': None, 'fmt': '!H', 'const': 1}, # noqa: E501
0x4201011e: {'reg': Register.WORK_MODE, 'fmt': '!H'},
0x42010124: {'reg': Register.OUTPUT_SHUTDOWN, 'fmt': '!H'},
0x42010126: {'reg': Register.MAX_DESIGNED_POWER, 'fmt': '!H'},
0x42010128: {'reg': Register.RATED_LEVEL, 'fmt': '!H'},
0x4201012a: {'reg': Register.INPUT_COEFFICIENT, 'fmt': '!H', 'ratio': 100/1024}, # noqa: E501
0x4201012c: {'reg': Register.GRID_VOLT_CAL_COEF, 'fmt': '!H'},
0x4201012e: {'reg': None, 'fmt': '!H', 'const': 1024}, # noqa: E501
0x42010130: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (1024, 1, 0xffff, 1)}, # noqa: E501
0x42010138: {'reg': Register.PROD_COMPL_TYPE, 'fmt': '!H'},
0x4201013a: {'reg': None, 'fmt': FMT_3_16BIT_VAL, 'const': (0x68, 0x68, 0x500)}, # noqa: E501
0x42010140: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0x9cd, 0x7b6, 0x139c, 0x1324)}, # noqa: E501
0x42010148: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (1, 0x7ae, 0x40f, 0x41)}, # noqa: E501
0x42010150: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0xf, 0xa64, 0xa64, 0x6)}, # noqa: E501
0x42010158: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0x6, 0x9f6, 0x128c, 0x128c)}, # noqa: E501
0x42010160: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0x10, 0x10, 0x1452, 0x1452)}, # noqa: E501
0x42010168: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0x10, 0x10, 0x151, 0x5)}, # noqa: E501
0x42010170: {'reg': Register.OUTPUT_COEFFICIENT, 'fmt': '!H', 'ratio': 100/1024}, # noqa: E501
0x42010172: {'reg': None, 'fmt': FMT_3_16BIT_VAL, 'const': (0x1, 0x139c, 0xfa0)}, # noqa: E501
0x42010178: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0x4e, 0x66, 0x3e8, 0x400)}, # noqa: E501
0x42010180: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0x9ce, 0x7a8, 0x139c, 0x1326)}, # noqa: E501
0x42010188: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0x0, 0x0, 0x0, 0)}, # noqa: E501
0x42010190: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0x0, 0x0, 1024, 1024)}, # noqa: E501
0x42010198: {'reg': None, 'fmt': FMT_4_16BIT_VAL, 'const': (0, 0, 0xffff, 0)}, # noqa: E501
0x420101a0: {'reg': None, 'fmt': FMT_2_16BIT_VAL, 'const': (0x0, 0x0)}, # noqa: E501
0xffffff02: {'reg': Register.POLLING_INTERVAL},
# 0x4281001c: {'reg': Register.POWER_ON_TIME, 'fmt': '<H', 'ratio': 1}, # noqa: E501
}
map_3026 = {
0x4201000c: {'reg': Register.SENSOR_LIST, 'fmt': '<H', 'func': Fmt.hex4}, # noqa: E501
0x4201001c: {'reg': Register.POWER_ON_TIME, 'fmt': '<H', 'ratio': 1, 'dep': ProxyMode.SERVER}, # noqa: E501, or packet number
0x42010020: {'reg': Register.SERIAL_NUMBER, 'fmt': '!16s'}, # noqa: E501
0x42010030: {'reg': Register.BATT_PV1_VOLT, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, DC Voltage PV1
0x42010032: {'reg': Register.BATT_PV1_CUR, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, DC Current PV1
0x42010034: {'reg': Register.BATT_PV2_VOLT, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, DC Voltage PV2
0x42010036: {'reg': Register.BATT_PV2_CUR, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, DC Current PV2
0x42010038: {'reg': Register.BATT_TOTAL_CHARG, 'fmt': '!L', 'ratio': 0.01}, # noqa: E501
0x4201003c: {'reg': Register.BATT_PV1_STATUS, 'fmt': '!H'}, # noqa: E501 MPPT-1 Operating Status: 0(Standby), 1(Work)
0x4201003e: {'reg': Register.BATT_PV2_STATUS, 'fmt': '!H'}, # noqa: E501 MPPT-2 Operating Status: 0(Standby), 1(Work)
0x42010040: {'reg': Register.BATT_VOLT, 'fmt': '!h', 'ratio': 0.01}, # noqa: E501
0x42010042: {'reg': Register.BATT_CUR, 'fmt': '!h', 'ratio': 0.01}, # noqa: E501 => Battery Status: <0 (Discharging), 0 (Static), >0 (Charging)
0x42010044: {'reg': Register.BATT_SOC, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, state of charge (SOC) in percent
0x42010046: {'reg': Register.BATT_CELL1_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010048: {'reg': Register.BATT_CELL2_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x4201004a: {'reg': Register.BATT_CELL3_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x4201004c: {'reg': Register.BATT_CELL4_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x4201004e: {'reg': Register.BATT_CELL5_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010050: {'reg': Register.BATT_CELL6_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010052: {'reg': Register.BATT_CELL7_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010054: {'reg': Register.BATT_CELL8_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010056: {'reg': Register.BATT_CELL9_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010058: {'reg': Register.BATT_CELL10_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x4201005a: {'reg': Register.BATT_CELL11_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x4201005c: {'reg': Register.BATT_CELL12_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x4201005e: {'reg': Register.BATT_CELL13_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010060: {'reg': Register.BATT_CELL14_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010062: {'reg': Register.BATT_CELL15_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010064: {'reg': Register.BATT_CELL16_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x42010066: {'reg': Register.BATT_TEMP_1, 'fmt': '!h'}, # noqa: E501 Cell Temperature 1
0x42010068: {'reg': Register.BATT_TEMP_2, 'fmt': '!h'}, # noqa: E501 Cell Temperature 2
0x4201006a: {'reg': Register.BATT_TEMP_3, 'fmt': '!h'}, # noqa: E501 Cell Temperature 3
0x4201006c: {'reg': Register.BATT_OUT_VOLT, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501 Output Voltage
0x4201006e: {'reg': Register.BATT_OUT_CUR, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501 Output Current
0x42010070: {'reg': Register.BATT_OUT_STATUS, 'fmt': '!H'}, # noqa: E501 Output Working Status: 0(Standby), 1(Work)
0x42010072: {'reg': Register.BATT_TEMP_4, 'fmt': '!h'}, # noqa: E501, Environment temp
0x42010074: {'reg': Register.BATT_ALARM, 'fmt': '!H'}, # noqa: E501 Warning Alarmcode 1, Bit 0..15
0x42010076: {'reg': Register.BATT_HW_VERS, 'fmt': '!h'}, # noqa: E501 hardware version
0x42010078: {'reg': Register.BATT_SW_VERS, 'fmt': '!h'}, # noqa: E501 software main version
'calc': {
1: {'reg': Register.BATT_PV_PWR, 'func': RegisterFunc.prod_sum, # noqa: E501 Generated Power
'params': [[Register.BATT_PV1_VOLT, Register.BATT_PV1_CUR],
[Register.BATT_PV2_VOLT, Register.BATT_PV2_CUR]]},
2: {'reg': Register.BATT_PWR, 'func': RegisterFunc.prod_sum, # noqa: E501
'params': [[Register.BATT_VOLT, Register.BATT_CUR]]},
3: {'reg': Register.BATT_OUT_PWR, 'func': RegisterFunc.prod_sum, # noqa: E501 Supply Power => Power Supply State: 0 (Idle), >0 (Power Supply)
'params': [[Register.BATT_OUT_VOLT, Register.BATT_OUT_CUR]]},
4: {'reg': Register.BATT_PWR_SUPL_STATE, 'func': RegisterFunc.cmp_values, # noqa: E501
'params': {'reg': Register.BATT_OUT_PWR, 'cmp_val': 0, 'res': [0, 0, 1]}}, # noqa: E501
5: {'reg': Register.BATT_STATUS, 'func': RegisterFunc.cmp_values, # noqa: E501
'params': {'reg': Register.BATT_CUR, 'cmp_val': 0.0, 'res': [0, 1, 2]}} # noqa: E501
}
}
class RegisterSel:
__sensor_map = {
0x02b0: RegisterMap.map_02b0,
0x3026: RegisterMap.map_3026,
}
@classmethod
def get(cls, sensor: int):
return cls.__sensor_map.get(sensor, RegisterMap.map)
class InfosG3P(Infos):
def __init__(self):
__slots__ = ('client_mode', )
def __init__(self, client_mode: bool):
super().__init__()
self.client_mode = client_mode
self.set_db_def_value(Register.MANUFACTURER, 'TSUN')
self.set_db_def_value(Register.EQUIPMENT_MODEL, 'TSOL-MSxx00')
self.set_db_def_value(Register.CHIP_TYPE, 'IGEN TECH')
self.set_db_def_value(Register.NO_INPUTS, 4)
def __hide_topic(self, row: dict) -> bool:
if 'dep' in row:
mode = row['dep']
if self.client_mode:
return mode != ProxyMode.CLIENT
else:
return mode != ProxyMode.SERVER
return False
def ha_confs(self, ha_prfx: str, node_id: str, snr: str,
sug_area: str = '') \
@@ -82,19 +239,41 @@ class InfosG3P(Infos):
entity strings
sug_area:str ==> suggested area string from the config file'''
# iterate over RegisterMap.map and get the register values
for row in RegisterMap.map.values():
sensor = self.get_db_value(Register.SENSOR_LIST)
if "3026" == sensor:
reg_map = RegisterMap.map_3026
elif "02b0" == sensor:
reg_map = RegisterMap.map_02b0
else:
reg_map = {}
items = reg_map.items()
if 'calc' in reg_map:
virt = reg_map['calc'].items()
else:
virt = {}
for idx, row in chain(RegisterMap.map.items(), items, virt):
if 'calc' == idx:
continue
info_id = row['reg']
res = self.ha_conf(info_id, ha_prfx, node_id, snr, False, sug_area) # noqa: E501
if self.__hide_topic(row):
res = self.ha_remove(info_id, node_id, snr) # noqa: E501
else:
res = self.ha_conf(info_id, ha_prfx, node_id, snr, False, sug_area) # noqa: E501
if res:
yield res
def parse(self, buf, msg_type: int, rcv_ftype: int, node_id: str = '') \
def parse(self, buf, msg_type: int, rcv_ftype: int,
sensor: int = 0, node_id: str = '') \
-> Generator[tuple[str, bool], None, None]:
'''parse a data sequence received from the inverter and
stores the values in Infos.db
buf: buffer of the sequence to parse'''
for idx, row in RegisterMap.map.items():
reg_map = RegisterSel.get(sensor)
for idx, row in reg_map.items():
if 'calc' == idx:
continue
addr = idx & 0xffff
ftype = (idx >> 16) & 0xff
mtype = (idx >> 24) & 0xff
@@ -103,25 +282,50 @@ class InfosG3P(Infos):
if not isinstance(row, dict):
continue
info_id = row['reg']
fmt = row['fmt']
res = struct.unpack_from(fmt, buf, addr)
result = res[0]
if isinstance(result, (bytearray, bytes)):
result = result.decode().split('\x00')[0]
if 'eval' in row:
result = eval(row['eval'])
if 'ratio' in row:
result = round(result * row['ratio'], 2)
result = Fmt.get_value(buf, addr, row)
yield from self.__update_val(node_id, "GEN3PLUS", info_id, result)
yield from self.calc(sensor, node_id)
keys, level, unit, must_incr = self._key_obj(info_id)
def calc(self, sensor: int = 0, node_id: str = '') \
-> Generator[tuple[str, bool], None, None]:
'''calculate meta values from the
stored values in Infos.db
if keys:
name, update = self.update_db(keys, must_incr, result)
yield keys[0], update
else:
name = str(f'info-id.0x{addr:x}')
update = False
sensor: sensor_list number
node_id: id-string for the node'''
reg_map = RegisterSel.get(sensor)
if 'calc' in reg_map:
for row in reg_map['calc'].values():
info_id = row['reg']
result = row['func'](self, row['params'])
yield from self.__update_val(node_id, "CALC", info_id, result)
def __update_val(self, node_id, source: str, info_id, result):
keys, level, unit, must_incr = self._key_obj(info_id)
if keys:
name, update = self.update_db(keys, must_incr, result)
yield keys[0], update
if update:
self.tracer.log(level, f'[{node_id}] GEN3PLUS: {name}'
self.tracer.log(level, f'[{node_id}] {source}: {name}'
f' : {result}{unit}')
def build(self, len, msg_type: int, rcv_ftype: int, sensor: int = 0):
buf = bytearray(len)
for idx, row in RegisterSel.get(sensor).items():
addr = idx & 0xffff
ftype = (idx >> 16) & 0xff
mtype = (idx >> 24) & 0xff
if ftype != rcv_ftype or mtype != msg_type:
continue
if not isinstance(row, dict):
continue
if 'const' in row:
val = row['const']
else:
info_id = row['reg']
val = self.get_db_value(info_id)
if not val:
continue
Fmt.set_value(buf, addr, row, val)
return buf
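parse() and build() operate as rough inverses over the same register map: parse() decodes a received frame into Infos.db, build() re-encodes the stored values into a frame of the requested length. A hypothetical round trip, assuming a 420-byte real-time data frame (msg_type 0x42, ftype 1) and the 0x02b0 sensor list; the printed values are illustrative only.

# Hypothetical usage sketch, not a test from the repository.
db = InfosG3P(client_mode=False)
raw = bytes(420)                       # frame body as received from a device
for key, updated in db.parse(raw, 0x42, 1, sensor=0x02b0, node_id='inv_1/'):
    print(key, updated)                # e.g. ('batterie', True)
rebuilt = db.build(420, 0x42, 1, sensor=0x02b0)   # emu side re-encodes the db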


@@ -1,132 +1,24 @@
import logging
import traceback
import json
import asyncio
from asyncio import StreamReader, StreamWriter
from config import Config
from inverter import Inverter
from gen3plus.connection_g3p import ConnectionG3P
from aiomqtt import MqttCodeError
from infos import Infos
# import gc
# logger = logging.getLogger('conn')
logger_mqtt = logging.getLogger('mqtt')
from inverter_base import InverterBase
from gen3plus.solarman_v5 import SolarmanV5
from gen3plus.solarman_emu import SolarmanEmu
class InverterG3P(Inverter, ConnectionG3P):
'''class Inverter is a derivation of an Async_Stream
class InverterG3P(InverterBase):
def __init__(self, reader: StreamReader, writer: StreamWriter,
client_mode: bool = False):
# shared value between both inverter connections
self.forward_at_cmd_resp = False
'''Flag if the response to the last AT command must be sent to the cloud.
The class has some class methods for managing common resources like a
connection to the MQTT broker or proxy error counters, which are common
to all inverter connections
False: send the result only to the MQTT broker, because the AT+ command
came from there
True: send the response packet to the cloud, because the AT+ command
came from the cloud'''
Instances of the class are connections to an inverter and can have an
optional link to a remote connection to the TSUN cloud. A remote
connection dies with the inverter connection.
class methods:
class_init(): initialize the common resources of the proxy (MQTT
broker, Proxy DB, etc). Must be called before the
first inverter instance can be created
class_close(): release the common resources of the proxy. Should not
be called until all instances of the class have been
destroyed
methods:
server_loop(addr): Async loop method for receiving messages from the
inverter (server-side)
client_loop(addr): Async loop method for receiving messages from the
TSUN cloud (client-side)
async_create_remote(): Establish a client connection to the TSUN cloud
async_publ_mqtt(): Publish data to MQTT broker
close(): Release method which must be called before an instance can be
destroyed
'''
def __init__(self, reader: StreamReader, writer: StreamWriter, addr):
super().__init__(reader, writer, addr, None, True)
self.__ha_restarts = -1
async def async_create_remote(self) -> None:
'''Establish a client connection to the TSUN cloud'''
tsun = Config.get('solarman')
host = tsun['host']
port = tsun['port']
addr = (host, port)
try:
logging.info(f'[{self.node_id}] Connect to {addr}')
connect = asyncio.open_connection(host, port)
reader, writer = await connect
self.remoteStream = ConnectionG3P(reader, writer, addr, self,
False)
logging.info(f'[{self.remoteStream.node_id}:'
f'{self.remoteStream.conn_no}] '
f'Connected to {addr}')
asyncio.create_task(self.client_loop(addr))
except (ConnectionRefusedError, TimeoutError) as error:
logging.info(f'{error}')
except Exception:
self.inc_counter('SW_Exception')
logging.error(
f"Inverter: Exception for {addr}:\n"
f"{traceback.format_exc()}")
async def async_publ_mqtt(self) -> None:
'''publish data to MQTT broker'''
# check if new inverter or collector infos are available or when the
# home assistant has changed the status back to online
try:
if (('inverter' in self.new_data and self.new_data['inverter'])
or ('collector' in self.new_data and
self.new_data['collector'])
or self.mqtt.ha_restarts != self.__ha_restarts):
await self._register_proxy_stat_home_assistant()
await self.__register_home_assistant()
self.__ha_restarts = self.mqtt.ha_restarts
for key in self.new_data:
await self.__async_publ_mqtt_packet(key)
for key in Infos.new_stat_data:
await self._async_publ_mqtt_proxy_stat(key)
except MqttCodeError as error:
logging.error(f'Mqtt except: {error}')
except Exception:
self.inc_counter('SW_Exception')
logging.error(
f"Inverter: Exception:\n"
f"{traceback.format_exc()}")
async def __async_publ_mqtt_packet(self, key):
db = self.db.db
if key in db and self.new_data[key]:
data_json = json.dumps(db[key])
node_id = self.node_id
logger_mqtt.debug(f'{key}: {data_json}')
await self.mqtt.publish(f'{self.entity_prfx}{node_id}{key}', data_json) # noqa: E501
self.new_data[key] = False
async def __register_home_assistant(self) -> None:
'''register all our topics at home assistant'''
for data_json, component, node_id, id in self.db.ha_confs(
self.entity_prfx, self.node_id, self.unique_id,
self.sug_area):
logger_mqtt.debug(f"MQTT Register: cmp:'{component}'"
f" node_id:'{node_id}' {data_json}")
await self.mqtt.publish(f"{self.discovery_prfx}{component}"
f"/{node_id}{id}/config", data_json)
self.db.reg_clr_at_midnight(f'{self.entity_prfx}{self.node_id}')
def close(self) -> None:
logging.debug(f'InverterG3P.close() l{self.l_addr} | r{self.r_addr}')
super().close() # call close handler in the parent class
# logger.debug (f'Inverter refs: {gc.get_referrers(self)}')
def __del__(self):
logging.debug("InverterG3P.__del__")
super().__del__()
remote_prot = None
if client_mode:
remote_prot = SolarmanEmu
super().__init__(reader, writer, 'solarman',
SolarmanV5, client_mode, remote_prot)


@@ -0,0 +1,139 @@
import logging
import struct
from async_ifc import AsyncIfc
from gen3plus.solarman_v5 import SolarmanBase
from my_timer import Timer
from infos import Register
logger = logging.getLogger('msg')
class SolarmanEmu(SolarmanBase):
def __init__(self, inverter, addr, ifc: "AsyncIfc",
server_side: bool, client_mode: bool):
super().__init__(addr, ifc, server_side=False,
_send_modbus_cb=None,
mb_timeout=8)
_ = inverter
logging.debug('SolarmanEmu.init()')
self.db = ifc.remote.stream.db
self.snr = ifc.remote.stream.snr
self.hb_timeout = 60
'''actual heartbeat timeout from the last response message'''
self.data_up_inv = self.db.get_db_value(Register.DATA_UP_INTERVAL)
'''time interval for getting new MQTT data messages'''
self.hb_timer = Timer(self.send_heartbeat_cb, self.node_id)
self.data_timer = Timer(self.send_data_cb, self.node_id)
self.last_sync = self._emu_timestamp()
'''timestamp when we send the last sync message (4110)'''
self.pkt_cnt = 0
'''last sent packet number'''
self.switch = {
0x4210: 'msg_data_ind', # real time data
0x1210: self.msg_response, # at least every 5 minutes
0x4710: 'msg_hbeat_ind', # heartbeat
0x1710: self.msg_response, # every 2 minutes
0x4110: 'msg_dev_ind', # device data, sync start
0x1110: self.msg_response, # every 3 hours
}
self.log_lvl = {
0x4110: logging.INFO, # device data, sync start
0x1110: logging.INFO, # every 3 hours
0x4210: logging.INFO, # real time data
0x1210: logging.INFO, # at least every 5 minutes
0x4710: logging.DEBUG, # heartbeat
0x1710: logging.DEBUG, # every 2 minutes
}
'''
Our public methods
'''
def close(self) -> None:
logging.info('SolarmanEmu.close()')
# we have references to methods of this class in self.switch
# so we have to erase self.switch, otherwise this instance can't be
# deallocated by the garbage collector ==> we get a memory leak
self.switch.clear()
self.log_lvl.clear()
self.hb_timer.close()
self.data_timer.close()
self.db = None
super().close()
def _set_serial_no(self, snr: int):
logging.debug(f'SolarmanEmu._set_serial_no, snr: {snr}')
self.unique_id = str(snr)
def _init_new_client_conn(self) -> bool:
logging.debug('SolarmanEmu.init_new()')
self.data_timer.start(self.data_up_inv)
return False
def next_pkt_cnt(self):
'''get the next packet number'''
self.pkt_cnt = (self.pkt_cnt + 1) & 0xffffffff
return self.pkt_cnt
def seconds_since_last_sync(self):
'''get seconds since last 0x4110 message was sent'''
return self._emu_timestamp() - self.last_sync
def send_heartbeat_cb(self, exp_cnt):
'''send a heartbeat to the TSUN cloud'''
self._build_header(0x4710)
self.ifc.tx_add(struct.pack('<B', 0))
self._finish_send_msg()
log_lvl = self.log_lvl.get(0x4710, logging.WARNING)
self.ifc.tx_log(log_lvl, 'Send heartbeat:')
self.ifc.tx_flush()
def send_data_cb(self, exp_cnt):
'''send an inverter data message to the TSUN cloud'''
self.hb_timer.start(self.hb_timeout)
self.data_timer.start(self.data_up_inv)
_len = 420
ftype = 1
build_msg = self.db.build(_len, 0x42, ftype, 0x02b0)
self._build_header(0x4210)
self.ifc.tx_add(
struct.pack(
'<BHLLLHL', ftype, 0x02b0,
self._emu_timestamp(),
self.seconds_since_last_sync(),
self.time_ofs,
1, # offset 0x1a
self.next_pkt_cnt()))
self.ifc.tx_add(build_msg[0x20:])
self._finish_send_msg()
log_lvl = self.log_lvl.get(0x4210, logging.WARNING)
self.ifc.tx_log(log_lvl, 'Send inv-data:')
self.ifc.tx_flush()
'''
Message handler methods
'''
def msg_response(self):
'''handle a received response from the TSUN cloud'''
logger.debug("EMU received rsp:")
_, _, ts, hb = super().msg_response()
logger.debug(f"EMU ts:{ts} hb:{hb}")
self.hb_timeout = hb
self.time_ofs = ts - self._emu_timestamp()
self.hb_timer.start(self.hb_timeout)
def msg_unknown(self):
'''counts an unknown or unexpected message from the TSUN cloud'''
logger.warning(f"EMU Unknown Msg: ID:{int(self.control):#04x}")
self.inc_counter('Unknown_Msg')
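send_data_cb() packs a small little-endian header in front of the register image built from the database. A sketch of that '<BHLLLHL' layout with the field meanings implied by the code above; the concrete values are placeholders.

import struct

hdr = struct.pack(
    '<BHLLLHL',
    1,            # ftype
    0x02b0,       # sensor list id
    1712345678,   # emulator timestamp
    120,          # seconds since the last 0x4110 sync message
    3600,         # time offset learned from the cloud response
    1,            # constant at offset 0x1a
    42)           # running packet counter
assert len(hdr) == 21   # 1+2+4+4+4+2+4 bytes, little-endian, no padding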

File diff suppressed because it is too large

File diff suppressed because it is too large

app/src/inverter_base.py (new file, 195 lines)

@@ -0,0 +1,195 @@
import weakref
import asyncio
import logging
import traceback
import json
import gc
from aiomqtt import MqttCodeError
from asyncio import StreamReader, StreamWriter
from ipaddress import ip_address
from inverter_ifc import InverterIfc
from proxy import Proxy
from async_stream import StreamPtr
from async_stream import AsyncStreamClient
from async_stream import AsyncStreamServer
from cnf.config import Config
from infos import Infos
logger_mqtt = logging.getLogger('mqtt')
class InverterBase(InverterIfc, Proxy):
def __init__(self, reader: StreamReader, writer: StreamWriter,
config_id: str, prot_class,
client_mode: bool = False,
remote_prot_class=None):
Proxy.__init__(self)
self._registry.append(weakref.ref(self))
self.addr = writer.get_extra_info('peername')
self.config_id = config_id
if remote_prot_class:
self.prot_class = remote_prot_class
else:
self.prot_class = prot_class
self.__ha_restarts = -1
self.remote = StreamPtr(None)
ifc = AsyncStreamServer(reader, writer,
self.async_publ_mqtt,
self.create_remote,
self.remote)
self.local = StreamPtr(
prot_class(self, self.addr, ifc, True, client_mode), ifc
)
def __enter__(self):
return self
def __exit__(self, exc_type, exc, tb) -> None:
logging.debug(f'InverterBase.__exit__() {self.addr}')
self.__del_remote()
self.local.stream.close()
self.local.stream = None
self.local.ifc.close()
self.local.ifc = None
# now explicitly call garbage collector to release unreachable objects
unreachable_obj = gc.collect()
logging.debug(
f'InverterBase.__exit: freed unreachable obj: {unreachable_obj}')
def __del_remote(self):
if self.remote.stream:
self.remote.stream.close()
self.remote.stream = None
if self.remote.ifc:
self.remote.ifc.close()
self.remote.ifc = None
async def disc(self, shutdown_started=False) -> None:
if self.remote.stream:
self.remote.stream.shutdown_started = shutdown_started
if self.remote.ifc:
await self.remote.ifc.disc()
if self.local.stream:
self.local.stream.shutdown_started = shutdown_started
if self.local.ifc:
await self.local.ifc.disc()
def healthy(self) -> bool:
logging.debug('InverterBase healthy()')
if self.local.ifc and not self.local.ifc.healthy():
return False
if self.remote.ifc and not self.remote.ifc.healthy():
return False
return True
async def create_remote(self) -> None:
'''Establish a client connection to the TSUN cloud'''
tsun = Config.get(self.config_id)
host = tsun['host']
port = tsun['port']
addr = (host, port)
stream = self.local.stream
try:
logging.info(f'[{stream.node_id}] Connect to {addr}')
connect = asyncio.open_connection(host, port)
reader, writer = await connect
r_addr = writer.get_extra_info('peername')
if r_addr is not None:
(ip, _) = r_addr
if ip_address(ip).is_private:
logging.error(
f"""resolve {host} to {ip}, which is a private IP!
\u001B[31m Check your DNS settings and use a public DNS resolver!
To prevent a possible loop, forwarding to local IP addresses is
not supported and is deactivated for subsequent connections
\u001B[0m
""")
Config.act_config[self.config_id]['enabled'] = False
ifc = AsyncStreamClient(
reader, writer, self.local, self.__del_remote)
self.remote.ifc = ifc
if hasattr(stream, 'id_str'):
self.remote.stream = self.prot_class(
self, addr, ifc, server_side=False,
client_mode=False, id_str=stream.id_str)
else:
self.remote.stream = self.prot_class(
self, addr, ifc, server_side=False,
client_mode=False)
logging.info(f'[{self.remote.stream.node_id}:'
f'{self.remote.stream.conn_no}] '
f'Connected to {addr}')
asyncio.create_task(self.remote.ifc.client_loop(addr))
except (ConnectionRefusedError, TimeoutError) as error:
logging.info(f'{error}')
except Exception:
Infos.inc_counter('SW_Exception')
logging.error(
f"Inverter: Exception for {addr}:\n"
f"{traceback.format_exc()}")
async def async_publ_mqtt(self) -> None:
'''publish data to MQTT broker'''
stream = self.local.stream
if not stream or not stream.unique_id:
return
# check if new inverter or collector infos are available or when the
# home assistant has changed the status back to online
try:
if (('inverter' in stream.new_data and stream.new_data['inverter'])
or ('batterie' in stream.new_data and
stream.new_data['batterie'])
or ('collector' in stream.new_data and
stream.new_data['collector'])
or self.mqtt.ha_restarts != self.__ha_restarts):
await self._register_proxy_stat_home_assistant()
await self.__register_home_assistant(stream)
self.__ha_restarts = self.mqtt.ha_restarts
for key in stream.new_data:
await self.__async_publ_mqtt_packet(stream, key)
for key in Infos.new_stat_data:
await Proxy._async_publ_mqtt_proxy_stat(key)
except MqttCodeError as error:
logging.error(f'Mqtt except: {error}')
except Exception:
Infos.inc_counter('SW_Exception')
logging.error(
f"Inverter: Exception:\n"
f"{traceback.format_exc()}")
async def __async_publ_mqtt_packet(self, stream, key):
db = stream.db.db
if key in db and stream.new_data[key]:
data_json = json.dumps(db[key])
node_id = stream.node_id
logger_mqtt.debug(f'{key}: {data_json}')
await self.mqtt.publish(f'{self.entity_prfx}{node_id}{key}', data_json) # noqa: E501
stream.new_data[key] = False
async def __register_home_assistant(self, stream) -> None:
'''register all our topics at home assistant'''
for data_json, component, node_id, id in stream.db.ha_confs(
self.entity_prfx, stream.node_id, stream.unique_id,
stream.sug_area):
logger_mqtt.debug(f"MQTT Register: cmp:'{component}'"
f" node_id:'{node_id}' {data_json}")
await self.mqtt.publish(f"{self.discovery_prfx}{component}"
f"/{node_id}{id}/config", data_json)
stream.db.reg_clr_at_midnight(f'{self.entity_prfx}{stream.node_id}')
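InverterBase is designed to be used as a context manager: __enter__ hands back the instance, __exit__ tears down the local and remote stream ends and forces a garbage-collection pass. The intended call pattern mirrors handle_client() in server.py further below.

async def handle(reader, writer):
    with InverterG3P(reader, writer) as inv:   # builds the local StreamPtr + ifc
        await inv.local.ifc.server_loop()      # runs until the peer disconnects
    # __exit__ has closed both streams and triggered gc.collect()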

app/src/inverter_ifc.py (new file, 37 lines)

@@ -0,0 +1,37 @@
from abc import abstractmethod
import logging
from asyncio import StreamReader, StreamWriter
from iter_registry import AbstractIterMeta
logger_mqtt = logging.getLogger('mqtt')
class InverterIfc(metaclass=AbstractIterMeta):
_registry = []
@abstractmethod
def __init__(self, reader: StreamReader, writer: StreamWriter,
config_id: str, prot_class,
client_mode: bool):
pass # pragma: no cover
@abstractmethod
def __enter__(self):
pass # pragma: no cover
@abstractmethod
def __exit__(self, exc_type, exc, tb):
pass # pragma: no cover
@abstractmethod
def healthy(self) -> bool:
pass # pragma: no cover
@abstractmethod
async def disc(self, shutdown_started=False) -> None:
pass # pragma: no cover
@abstractmethod
async def create_remote(self) -> None:
pass # pragma: no cover

app/src/iter_registry.py (new file, 9 lines)

@@ -0,0 +1,9 @@
from abc import ABCMeta
class AbstractIterMeta(ABCMeta):
def __iter__(cls):
for ref in cls._registry:
obj = ref()
if obj is not None:
yield obj
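AbstractIterMeta turns any class that keeps a _registry of weak references into an iterable over its still-living instances; dead references are skipped silently. A minimal sketch, assuming the instances register themselves in __init__ the same way InverterBase and Message do.

import weakref

class Tracked(metaclass=AbstractIterMeta):
    _registry = []

    def __init__(self, name):
        self._registry.append(weakref.ref(self))
        self.name = name

a = Tracked('a')
b = Tracked('b')
del b                                  # weakref now resolves to None
print([t.name for t in Tracked])       # only live instances remain, e.g. ['a']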


@@ -58,19 +58,19 @@ formatter=console_formatter
class=handlers.TimedRotatingFileHandler
level=INFO
formatter=file_formatter
args=('log/proxy.log', when:='midnight')
args=(handlers.log_path + 'proxy.log', when:='midnight', backupCount:=handlers.log_backups)
[handler_file_handler_name2]
class=handlers.TimedRotatingFileHandler
level=NOTSET
formatter=file_formatter
args=('log/trace.log', when:='midnight')
args=(handlers.log_path + 'trace.log', when:='midnight', backupCount:=handlers.log_backups)
[formatter_console_formatter]
format=%(asctime)s %(levelname)5s | %(name)4s | %(message)s'
datefmt='%Y-%m-%d %H:%M:%S
format=%(asctime)s %(levelname)5s | %(name)4s | %(message)s
datefmt=%Y-%m-%d %H:%M:%S
[formatter_file_formatter]
format=%(asctime)s %(levelname)5s | %(name)4s | %(message)s'
datefmt='%Y-%m-%d %H:%M:%S
format=%(asctime)s %(levelname)5s | %(name)4s | %(message)s
datefmt=%Y-%m-%d %H:%M:%S
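The args lines above reference handlers.log_path and handlers.log_backups, which are not standard attributes of logging.handlers; main() in server.py injects them with setattr() before logging.config.fileConfig() evaluates the ini file, so the expressions resolve at config time. A minimal sketch of that coupling:

import logging.config
import logging.handlers

# server.py injects these attributes so the ini args expressions resolve.
setattr(logging.handlers, "log_path", "./log/")
setattr(logging.handlers, "log_backups", 7)
logging.config.fileConfig("logging.ini")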


@@ -1,58 +1,69 @@
import logging
import weakref
from typing import Callable, Generator
from typing import Callable
from enum import Enum
if __name__ == "app.src.messages":
from app.src.infos import Infos
from app.src.modbus import Modbus
else: # pragma: no cover
from infos import Infos
from modbus import Modbus
from async_ifc import AsyncIfc
from protocol_ifc import ProtocolIfc
from infos import Infos, Register
from modbus import Modbus
from my_timer import Timer
logger = logging.getLogger('msg')
def hex_dump_memory(level, info, data, num):
def __hex_val(n, data, data_len):
line = ''
for j in range(n-16, n):
if j >= data_len:
break
line += '%02x ' % abs(data[j])
return line
def __asc_val(n, data, data_len):
line = ''
for j in range(n-16, n):
if j >= data_len:
break
c = data[j] if not (data[j] < 0x20 or data[j] > 0x7e) else '.'
line += '%c' % c
return line
def hex_dump(data, data_len) -> list:
n = 0
lines = []
for i in range(0, data_len, 16):
line = ' '
line += '%04x | ' % (i)
n += 16
line += __hex_val(n, data, data_len)
line += ' ' * (3 * 16 + 9 - len(line)) + ' | '
line += __asc_val(n, data, data_len)
lines.append(line)
return lines
def hex_dump_str(data, data_len):
lines = hex_dump(data, data_len)
return '\n'.join(lines)
def hex_dump_memory(level, info, data, data_len):
lines = []
lines.append(info)
tracer = logging.getLogger('tracer')
if not tracer.isEnabledFor(level):
return
for i in range(0, num, 16):
line = ' '
line += '%04x | ' % (i)
n += 16
for j in range(n-16, n):
if j >= len(data):
break
line += '%02x ' % abs(data[j])
line += ' ' * (3 * 16 + 9 - len(line)) + ' | '
for j in range(n-16, n):
if j >= len(data):
break
c = data[j] if not (data[j] < 0x20 or data[j] > 0x7e) else '.'
line += '%c' % c
lines.append(line)
lines += hex_dump(data, data_len)
tracer.log(level, '\n'.join(lines))
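hex_dump() renders 16 bytes per line as a hex column next to a printable-ASCII column, and hex_dump_memory() only pays for the formatting when the tracer is enabled for the given level. The output looks roughly like this; exact spacing follows the padding math above.

# e.g. hex_dump_str(b'SolarmanV5 frame', 16) renders roughly:
#  0000 | 53 6f 6c 61 72 6d 61 6e 56 35 20 66 72 61 6d 65   | SolarmanV5 frame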
class IterRegistry(type):
def __iter__(cls) -> Generator['Message', None, None]:
for ref in cls._registry:
obj = ref()
if obj is not None:
yield obj
class State(Enum):
'''state of the logical connection'''
init = 0
@@ -67,30 +78,59 @@ class State(Enum):
'''connection closed'''
class Message(metaclass=IterRegistry):
_registry = []
class Message(ProtocolIfc):
MAX_START_TIME = 400
'''maximum time without a received msg in sec'''
MAX_INV_IDLE_TIME = 120
'''maximum time without a received msg from the inverter in sec'''
MAX_DEF_IDLE_TIME = 360
'''maximum default time without a received msg in sec'''
MB_START_TIMEOUT = 40
'''start delay for Modbus polling in server mode'''
MB_REGULAR_TIMEOUT = 60
'''regular Modbus polling time in server mode'''
def __init__(self, server_side: bool, send_modbus_cb:
Callable[[bytes, int, str], None], mb_timeout: int):
def __init__(self, node_id, ifc: "AsyncIfc", server_side: bool,
send_modbus_cb: Callable[[bytes, int, str], None],
mb_timeout: int):
self._registry.append(weakref.ref(self))
self.server_side = server_side
self.ifc = ifc
self.node_id = node_id
if server_side:
self.mb = Modbus(send_modbus_cb, mb_timeout)
self.mb_timer = Timer(self.mb_timout_cb, self.node_id)
else:
self.mb = None
self.mb_timer = None
self.header_valid = False
self.header_len = 0
self.data_len = 0
self.unique_id = 0
self.node_id = '' # will be overwritten in the child class's __init__
self.sug_area = ''
self._recv_buffer = bytearray(0)
self._send_buffer = bytearray(0)
self._forward_buffer = bytearray(0)
self.new_data = {}
self.state = State.init
self.shutdown_started = False
self.modbus_elms = 0 # for unit tests
self.mb_timeout = self.MB_REGULAR_TIMEOUT
self.mb_first_timeout = self.MB_START_TIMEOUT
'''timer value for next Modbus polling request'''
self.modbus_polling = False
self.mb_start_reg = 0
self.mb_step = 0
self.mb_bytes = 0
self.mb_inv_no = 1
self.mb_scan = False
@property
def node_id(self):
return self._node_id
@node_id.setter
def node_id(self, value):
self._node_id = value
self.ifc.set_node_id(value)
'''
Empty methods that have to be implemented in any child class which
@@ -100,18 +140,107 @@ class Message(metaclass=IterRegistry):
# to our _recv_buffer
return # pragma: no cover
def _update_header(self, _forward_buffer):
'''callback for updating the header of the forward buffer'''
return # pragma: no cover
def _set_config_parms(self, inv: dict):
'''init connection with params from the configuration'''
self.node_id = inv['node_id']
self.sug_area = inv['suggested_area']
self.modbus_polling = inv['modbus_polling']
if 'modbus_scanning' in inv:
scan = inv['modbus_scanning']
self.mb_scan = True
self.mb_start_reg = scan['start']
self.mb_step = scan['step']
self.mb_bytes = scan['bytes']
if 'client_mode' in inv:
self.mb_start_reg = scan['start']
else:
self.mb_start_reg = scan['start'] - scan['step']
self.mb_start_reg &= 0xffff
if self.mb:
self.mb.set_node_id(self.node_id)
def _set_mqtt_timestamp(self, key, ts: float | None):
if key not in self.new_data or \
not self.new_data[key]:
if key == 'grid':
info_id = Register.TS_GRID
elif key == 'input':
info_id = Register.TS_INPUT
elif key == 'total':
info_id = Register.TS_TOTAL
else:
return
# tstr = time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(ts))
# logger.info(f'update: key: {key} ts:{tstr}'
self.db.set_db_def_value(info_id, round(ts))
def _timeout(self) -> int:
if self.state == State.init or self.state == State.received:
to = self.MAX_START_TIME
elif self.state == State.up and \
self.server_side and self.modbus_polling:
to = self.MAX_INV_IDLE_TIME
else:
to = self.MAX_DEF_IDLE_TIME
return to
def _send_modbus_cmd(self, dev_id, func, addr, val, log_lvl) -> None:
if self.state != State.up:
logger.log(log_lvl, f'[{self.node_id}] ignore MODBUS cmd,'
' as the state is not UP')
return
self.mb.build_msg(dev_id, func, addr, val, log_lvl)
async def send_modbus_cmd(self, func, addr, val, log_lvl) -> None:
self._send_modbus_cmd(Modbus.INV_ADDR, func, addr, val, log_lvl)
def _send_modbus_scan(self):
self.mb_start_reg += self.mb_step
if self.mb_start_reg > 0xffff:
self.mb_start_reg = self.mb_start_reg & 0xffff
self.mb_inv_no += 1
logging.info(f"Next Round: inv:{self.mb_inv_no}"
f" reg:{self.mb_start_reg:04x}")
if (self.mb_start_reg & 0xfffc) % 0x80 == 0:
logging.info(f"[{self.node_id}] Scan info: "
f"inv:{self.mb_inv_no}"
f" reg:{self.mb_start_reg:04x}")
self._send_modbus_cmd(self.mb_inv_no, Modbus.READ_REGS,
self.mb_start_reg, self.mb_bytes,
logging.INFO)
def _dump_modbus_scan(self, data, hdr_len, modbus_msg_len):
if (data[hdr_len] == self.mb_inv_no and
data[hdr_len+1] == Modbus.READ_REGS):
logging.info(f'[{self.node_id}] Valid MODBUS data '
f'(reg: 0x{self.mb.last_reg:04x}):')
hex_dump_memory(logging.INFO, 'Valid MODBUS data '
f'(reg: 0x{self.mb.last_reg:04x}):',
data[hdr_len:], modbus_msg_len)
'''
Our public methods
'''
def close(self) -> None:
if self.server_side:
# set inverter state to offline, if output power is very low
logging.debug('close power: '
f'{self.db.get_db_value(Register.OUTPUT_POWER, -1)}')
if self.db.get_db_value(Register.OUTPUT_POWER, 999) < 2:
self.db.set_db_def_value(Register.INVERTER_STATUS, 0)
self.new_data['env'] = True
self.mb_timer.close()
self.state = State.closed
self.ifc.rx_set_cb(None)
self.ifc.prot_set_timeout_cb(None)
self.ifc.prot_set_init_new_client_conn_cb(None)
self.ifc.prot_set_update_header_cb(None)
self.ifc = None
if self.mb:
self.mb.close()
self.mb = None
pass # pragma: no cover
# pragma: no cover
def inc_counter(self, counter: str) -> None:
self.db.inc_counter(counter)


@@ -16,10 +16,7 @@ import logging
import asyncio
from typing import Generator, Callable
if __name__ == "app.src.modbus":
from app.src.infos import Register
else: # pragma: no cover
from infos import Register
from infos import Register, Fmt
logger = logging.getLogger('data')
@@ -39,14 +36,69 @@ class Modbus():
'''Modbus function code: Write Single Register'''
__crc_tab = []
map = {
0x2007: {'reg': Register.MAX_DESIGNED_POWER, 'fmt': '!H', 'ratio': 1}, # noqa: E501
# 0x????: {'reg': Register.INVERTER_STATUS, 'fmt': '!H'}, # noqa: E501
0x3008: {'reg': Register.VERSION, 'fmt': '!H', 'eval': "f'V{(result>>12)}.{(result>>8)&0xf}.{(result>>4)&0xf}{result&0xf}'"}, # noqa: E501
mb_reg_mapping = {
0x0000: {'reg': Register.SERIAL_NUMBER, 'fmt': '!16s'}, # noqa: E501
0x0008: {'reg': Register.BATT_PV1_VOLT, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, PV1 voltage
0x0009: {'reg': Register.BATT_PV1_CUR, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, PV1 current
0x000a: {'reg': Register.BATT_PV2_VOLT, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, PV2 voltage
0x000b: {'reg': Register.BATT_PV2_CUR, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, PV2 current
0x000c: {'reg': Register.BATT_TOTAL_CHARG, 'fmt': '!L', 'ratio': 0.01}, # noqa: E501
0x000e: {'reg': Register.BATT_PV1_STATUS, 'fmt': '!H'}, # noqa: E501
0x000f: {'reg': Register.BATT_PV2_STATUS, 'fmt': '!H'}, # noqa: E501
0x0010: {'reg': Register.BATT_VOLT, 'fmt': '!h', 'ratio': 0.01}, # noqa: E501
0x0011: {'reg': Register.BATT_CUR, 'fmt': '!h', 'ratio': 0.01}, # noqa: E501
0x0012: {'reg': Register.BATT_SOC, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501, state of charge (SOC) in percent
0x0013: {'reg': Register.BATT_CELL1_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0014: {'reg': Register.BATT_CELL2_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0015: {'reg': Register.BATT_CELL3_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0016: {'reg': Register.BATT_CELL4_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0017: {'reg': Register.BATT_CELL5_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0018: {'reg': Register.BATT_CELL6_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0019: {'reg': Register.BATT_CELL7_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x001a: {'reg': Register.BATT_CELL8_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x001b: {'reg': Register.BATT_CELL9_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x001c: {'reg': Register.BATT_CELL10_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x001d: {'reg': Register.BATT_CELL11_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x001e: {'reg': Register.BATT_CELL12_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x001f: {'reg': Register.BATT_CELL13_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0020: {'reg': Register.BATT_CELL14_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0021: {'reg': Register.BATT_CELL15_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0022: {'reg': Register.BATT_CELL16_VOLT, 'fmt': '!H', 'ratio': 0.001}, # noqa: E501
0x0023: {'reg': Register.BATT_TEMP_1, 'fmt': '!h'}, # noqa: E501
0x0024: {'reg': Register.BATT_TEMP_2, 'fmt': '!h'}, # noqa: E501
0x0025: {'reg': Register.BATT_TEMP_3, 'fmt': '!h'}, # noqa: E501
0x0026: {'reg': Register.BATT_OUT_VOLT, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501
0x0027: {'reg': Register.BATT_OUT_CUR, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501
0x0028: {'reg': Register.BATT_OUT_STATUS, 'fmt': '!H'}, # noqa: E501
0x0029: {'reg': Register.BATT_TEMP_4, 'fmt': '!h'}, # noqa: E501
0x002a: {'reg': Register.BATT_ALARM, 'fmt': '!h'}, # noqa: E501
0x002b: {'reg': Register.BATT_HW_VERS, 'fmt': '!h'}, # noqa: E501
0x002c: {'reg': Register.BATT_SW_VERS, 'fmt': '!h'}, # noqa: E501
0x2000: {'reg': Register.BOOT_STATUS, 'fmt': '!H'}, # noqa: E501
0x2001: {'reg': Register.DSP_STATUS, 'fmt': '!H'}, # noqa: E501
0x2003: {'reg': Register.WORK_MODE, 'fmt': '!H'},
0x2006: {'reg': Register.OUTPUT_SHUTDOWN, 'fmt': '!H'},
0x2007: {'reg': Register.MAX_DESIGNED_POWER, 'fmt': '!H', 'ratio': 1}, # noqa: E501
0x2008: {'reg': Register.RATED_LEVEL, 'fmt': '!H'},
0x2009: {'reg': Register.INPUT_COEFFICIENT, 'fmt': '!H', 'ratio': 100/1024}, # noqa: E501
0x200a: {'reg': Register.GRID_VOLT_CAL_COEF, 'fmt': '!H'},
0x2010: {'reg': Register.PROD_COMPL_TYPE, 'fmt': '!H'},
0x202c: {'reg': Register.OUTPUT_COEFFICIENT, 'fmt': '!H', 'ratio': 100/1024}, # noqa: E501
0x3000: {'reg': Register.INVERTER_STATUS, 'fmt': '!H'}, # noqa: E501
0x3001: {'reg': Register.DETECT_STATUS_1, 'fmt': '!H'}, # noqa: E501
0x3002: {'reg': Register.DETECT_STATUS_2, 'fmt': '!H'}, # noqa: E501
0x3003: {'reg': Register.EVENT_ALARM, 'fmt': '!H'}, # noqa: E501
0x3004: {'reg': Register.EVENT_FAULT, 'fmt': '!H'}, # noqa: E501
0x3005: {'reg': Register.EVENT_BF1, 'fmt': '!H'}, # noqa: E501
0x3006: {'reg': Register.EVENT_BF2, 'fmt': '!H'}, # noqa: E501
0x3008: {'reg': Register.VERSION, 'fmt': '!H', 'func': Fmt.version}, # noqa: E501
0x3009: {'reg': Register.GRID_VOLTAGE, 'fmt': '!H', 'ratio': 0.1}, # noqa: E501
0x300a: {'reg': Register.GRID_CURRENT, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501
0x300b: {'reg': Register.GRID_FREQUENCY, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501
0x300c: {'reg': Register.INVERTER_TEMP, 'fmt': '!H', 'eval': 'result-40'}, # noqa: E501
0x300c: {'reg': Register.INVERTER_TEMP, 'fmt': '!H', 'offset': -40}, # noqa: E501
# 0x300d
0x300e: {'reg': Register.RATED_POWER, 'fmt': '!H', 'ratio': 1}, # noqa: E501
0x300f: {'reg': Register.OUTPUT_POWER, 'fmt': '!H', 'ratio': 0.1}, # noqa: E501
@@ -72,6 +124,7 @@ class Modbus():
0x3026: {'reg': Register.PV3_TOTAL_GENERATION, 'fmt': '!L', 'ratio': 0.01}, # noqa: E501
0x3028: {'reg': Register.PV4_DAILY_GENERATION, 'fmt': '!H', 'ratio': 0.01}, # noqa: E501
0x3029: {'reg': Register.PV4_TOTAL_GENERATION, 'fmt': '!L', 'ratio': 0.01}, # noqa: E501
# 0x302a
}
def __init__(self, snd_handler: Callable[[bytes, int, str], None],
@@ -104,6 +157,7 @@ class Modbus():
self.loop = asyncio.get_event_loop()
self.req_pend = False
self.tim = None
self.node_id = ''
def close(self):
"""free the queue and erase the callback handlers"""
@@ -111,12 +165,11 @@ class Modbus():
self.__stop_timer()
self.rsp_handler = None
self.snd_handler = None
while not self.que.empty:
while not self.que.empty():
self.que.get_nowait()
def __del__(self):
"""log statistics on the deleting of a MODBUS instance"""
logging.debug(f'Modbus __del__:\n {self.counter}')
def set_node_id(self, node_id: str):
self.node_id = node_id
def build_msg(self, addr: int, func: int, reg: int, val: int,
log_lvl=logging.DEBUG) -> None:
@@ -136,7 +189,7 @@ class Modbus():
if self.que.qsize() == 1:
self.__send_next_from_que()
def recv_req(self, buf: bytearray,
def recv_req(self, buf: bytes,
rsp_handler: Callable[[None], None] = None) -> bool:
"""Add the received Modbus RTU request to the tx queue
@@ -161,14 +214,13 @@ class Modbus():
return True
def recv_resp(self, info_db, buf: bytearray, node_id: str) -> \
def recv_resp(self, info_db, buf: bytes) -> \
Generator[tuple[str, bool, int | float | str], None, None]:
"""Generator which check and parse a received MODBUS response.
Keyword arguments:
info_db: database for info lookups
buf: received Modbus RTU response frame
node_id: string for logging which identifies the slave
Returns on error and set Self.err to:
1: CRC error
@@ -178,58 +230,19 @@ class Modbus():
5: No MODBUS request pending
"""
# logging.info(f'recv_resp: first byte modbus:{buf[0]} len:{len(buf)}')
if not self.req_pend:
self.err = 5
return
if not self.__check_crc(buf):
logger.error(f'[{node_id}] Modbus resp: CRC error')
self.err = 1
return
if buf[0] != self.last_addr:
logger.info(f'[{node_id}] Modbus resp: Wrong addr {buf[0]}')
self.err = 2
return
fcode = buf[1]
if fcode != self.last_fcode:
logger.info(f'[{node_id}] Modbus: Wrong fcode {fcode}'
f' != {self.last_fcode}')
self.err = 3
data_available = self.last_addr == self.INV_ADDR and \
(fcode == 3 or fcode == 4)
if self.__resp_error_check(buf, data_available):
return
if self.last_addr == self.INV_ADDR and \
(fcode == 3 or fcode == 4):
if data_available:
elmlen = buf[2] >> 1
if elmlen != self.last_len:
logger.info(f'[{node_id}] Modbus: len error {elmlen}'
f' != {self.last_len}')
self.err = 4
return
first_reg = self.last_reg # save last_reg before sending next pdu
self.__stop_timer() # stop timer and send next pdu
for i in range(0, elmlen):
addr = first_reg+i
if addr in self.map:
row = self.map[addr]
info_id = row['reg']
fmt = row['fmt']
val = struct.unpack_from(fmt, buf, 3+2*i)
result = val[0]
if 'eval' in row:
result = eval(row['eval'])
if 'ratio' in row:
result = round(result * row['ratio'], 2)
keys, level, unit, must_incr = info_db._key_obj(info_id)
if keys:
name, update = info_db.update_db(keys, must_incr,
result)
yield keys[0], update, result
if update:
info_db.tracer.log(level,
f'[{node_id}] MODBUS: {name}'
f' : {result}{unit}')
yield from self.__process_data(info_db, buf, first_reg, elmlen)
else:
self.__stop_timer()
@@ -238,6 +251,53 @@ class Modbus():
self.rsp_handler()
self.__send_next_from_que()
def __resp_error_check(self, buf: bytes, data_available: bool) -> bool:
'''Check the MODBUS response for errors, returns True if one occurs'''
if not self.req_pend:
self.err = 5
return True
if not self.__check_crc(buf):
logger.error(f'[{self.node_id}] Modbus resp: CRC error')
self.err = 1
return True
if buf[0] != self.last_addr:
logger.info(f'[{self.node_id}] Modbus resp: Wrong addr {buf[0]}')
self.err = 2
return True
fcode = buf[1]
if fcode != self.last_fcode:
logger.info(f'[{self.node_id}] Modbus: Wrong fcode {fcode}'
f' != {self.last_fcode}')
self.err = 3
return True
if data_available:
elmlen = buf[2] >> 1
if elmlen != self.last_len:
logger.info(f'[{self.node_id}] Modbus: len error {elmlen}'
f' != {self.last_len}')
self.err = 4
return True
return False
def __process_data(self, info_db, buf: bytes, first_reg, elmlen):
'''Generator over received registers, updates the db'''
for i in range(0, elmlen):
addr = first_reg+i
if addr in self.mb_reg_mapping:
row = self.mb_reg_mapping[addr]
info_id = row['reg']
keys, level, unit, must_incr = info_db._key_obj(info_id)
if keys:
result = Fmt.get_value(buf, 3+2*i, row)
name, update = info_db.update_db(keys, must_incr,
result)
yield keys[0], update, result
if update:
info_db.tracer.log(level,
f'[{self.node_id}] MODBUS: {name}'
f' : {result}{unit}')
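Both the Modbus map in this file and the Solarman register maps now delegate value decoding to Fmt.get_value(), which replaces the former unpack/eval/ratio block. A hedged sketch of what such a helper could do for the row shapes used above ('fmt' plus optional 'ratio' and 'offset'); the 'func' and 'eval' variants are omitted, and this is not the repository's actual Fmt implementation.

import struct

def get_value(buf: bytes, addr: int, row: dict):
    # Assumed behaviour, matching the row keys used in the register maps.
    res = struct.unpack_from(row['fmt'], buf, addr)[0]
    if isinstance(res, (bytes, bytearray)):
        return res.decode(errors='ignore').split('\x00')[0]
    if 'offset' in row:                 # e.g. INVERTER_TEMP: raw value - 40
        res += row['offset']
    if 'ratio' in row:                  # e.g. cell voltages: raw * 0.001
        res = round(res * row['ratio'], 2)
    return res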
'''
MODBUS response timer
'''
@@ -265,7 +325,10 @@ class Modbus():
self.__start_timer()
self.snd_handler(self.last_req, self.last_log_lvl, state='Retrans')
else:
logger.info(f'Modbus timeout {self}')
logger.info(f'[{self.node_id}] Modbus timeout '
f'(FCode: {self.last_fcode} '
f'Reg: 0x{self.last_reg:04x}, '
f'{self.last_len})')
self.counter['timeouts'] += 1
self.__send_next_from_que()
@@ -294,11 +357,11 @@ class Modbus():
'''
Helper function for CRC-16 handling
'''
def __check_crc(self, msg: bytearray) -> bool:
def __check_crc(self, msg: bytes) -> bool:
'''Check CRC-16 and returns True if valid'''
return 0 == self.__calc_crc(msg)
def __calc_crc(self, buffer: bytearray) -> int:
def __calc_crc(self, buffer: bytes) -> int:
'''Build CRC-16 for buffer and returns it'''
crc = CRC_INIT

app/src/modbus_tcp.py (new file, 90 lines)

@@ -0,0 +1,90 @@
import logging
import traceback
import asyncio
from itertools import chain
from cnf.config import Config
from gen3plus.inverter_g3p import InverterG3P
from infos import Infos
logger = logging.getLogger('conn')
class ModbusConn():
def __init__(self, host, port):
self.host = host
self.port = port
self.addr = (host, port)
self.inverter = None
async def __aenter__(self) -> 'InverterG3P':
'''Establish a client connection to the TSUN cloud'''
connection = asyncio.open_connection(self.host, self.port)
reader, writer = await connection
self.inverter = InverterG3P(reader, writer,
client_mode=True)
self.inverter.__enter__()
stream = self.inverter.local.stream
logging.info(f'[{stream.node_id}:{stream.conn_no}] '
f'Connected to {self.addr}')
Infos.inc_counter('Inverter_Cnt')
await self.inverter.local.ifc.publish_outstanding_mqtt()
return self.inverter
async def __aexit__(self, exc_type, exc, tb):
Infos.dec_counter('Inverter_Cnt')
await self.inverter.local.ifc.publish_outstanding_mqtt()
self.inverter.__exit__(exc_type, exc, tb)
class ModbusTcp():
def __init__(self, loop, tim_restart=10) -> None:
self.tim_restart = tim_restart
inverters = Config.get('inverters')
batteries = Config.get('batteries')
# logging.info(f'Inverters: {inverters}')
for _, inv in chain(inverters.items(), batteries.items()):
if (type(inv) is dict
and 'monitor_sn' in inv
and 'client_mode' in inv):
client = inv['client_mode']
logger.info(f"'client_mode' for Monitoring-SN: {inv['monitor_sn']} host: {client['host']}:{client['port']}, forward: {client['forward']}") # noqa: E501
loop.create_task(self.modbus_loop(client['host'],
client['port'],
inv['monitor_sn'],
client['forward']))
async def modbus_loop(self, host, port,
snr: int, forward: bool) -> None:
'''Loop for receiving messages from the TSUN cloud (client-side)'''
while True:
try:
async with ModbusConn(host, port) as inverter:
stream = inverter.local.stream
await stream.send_start_cmd(snr, host, forward)
await stream.ifc.loop()
logger.info(f'[{stream.node_id}:{stream.conn_no}] '
f'Connection closed - Shutdown: '
f'{stream.shutdown_started}')
if stream.shutdown_started:
return
del inverter # decrease ref counter after the with block
except (ConnectionRefusedError, TimeoutError) as error:
logging.debug(f'Inv-conn:{error}')
except OSError as error:
if error.errno == 113: # pragma: no cover
logging.debug(f'os-error:{error}')
else:
logging.info(f'os-error: {error}')
except Exception:
logging.error(
f"ModbusTcpCreate: Exception for {(host, port)}:\n"
f"{traceback.format_exc()}")
await asyncio.sleep(self.tim_restart)
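ModbusTcp only spawns a client loop for inverter or battery entries that carry both a monitor_sn and a client_mode block. A hypothetical sketch of the configuration shape the loop above expects after Config parsing; field names beyond those actually read in the code (monitor_sn, client_mode with host/port/forward, node_id, modbus_polling) are assumptions.

# Hypothetical shape of a Config entry picked up by ModbusTcp.
inverters = {
    'R17xxxxxxxxxxxx1': {
        'node_id': 'garage/',
        'modbus_polling': True,
        'monitor_sn': 2000000000,
        'client_mode': {'host': '192.168.1.50', 'port': 8899, 'forward': False},
    },
}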


@@ -2,9 +2,10 @@ import asyncio
import logging
import aiomqtt
import traceback
from modbus import Modbus
from messages import Message
from config import Config
from cnf.config import Config
from singleton import Singleton
logger_mqtt = logging.getLogger('mqtt')
@@ -12,16 +13,24 @@ logger_mqtt = logging.getLogger('mqtt')
class Mqtt(metaclass=Singleton):
__client = None
__cb_MqttIsUp = None
__cb_mqtt_is_up = None
def __init__(self, cb_MqttIsUp):
def __init__(self, cb_mqtt_is_up):
logger_mqtt.debug('MQTT: __init__')
if cb_MqttIsUp:
self.__cb_MqttIsUp = cb_MqttIsUp
if cb_mqtt_is_up:
self.__cb_mqtt_is_up = cb_mqtt_is_up
loop = asyncio.get_event_loop()
self.task = loop.create_task(self.__loop())
self.ha_restarts = 0
ha = Config.get('ha')
self.ha_status_topic = f"{ha['auto_conf_prefix']}/status"
self.mb_rated_topic = f"{ha['entity_prefix']}/+/rated_load"
self.mb_out_coeff_topic = f"{ha['entity_prefix']}/+/out_coeff"
self.mb_reads_topic = f"{ha['entity_prefix']}/+/modbus_read_regs"
self.mb_inputs_topic = f"{ha['entity_prefix']}/+/modbus_read_inputs"
self.mb_at_cmd_topic = f"{ha['entity_prefix']}/+/at_cmd"
@property
def ha_restarts(self):
return self._ha_restarts
@@ -30,15 +39,13 @@ class Mqtt(metaclass=Singleton):
def ha_restarts(self, value):
self._ha_restarts = value
def __del__(self):
logger_mqtt.debug('MQTT: __del__')
async def close(self) -> None:
logger_mqtt.debug('MQTT: close')
self.task.cancel()
try:
await self.task
except Exception as e:
except (asyncio.CancelledError, Exception) as e:
logging.debug(f"Mqtt.close: exception: {e} ...")
async def publish(self, topic: str, payload: str | bytes | bytearray
@@ -48,7 +55,6 @@ class Mqtt(metaclass=Singleton):
async def __loop(self) -> None:
mqtt = Config.get('mqtt')
ha = Config.get('ha')
logger_mqtt.info(f'start MQTT: host:{mqtt["host"]} port:'
f'{mqtt["port"]} '
f'user:{mqtt["user"]}')
@@ -58,51 +64,24 @@ class Mqtt(metaclass=Singleton):
password=mqtt['passwd'])
interval = 5 # Seconds
ha_status_topic = f"{ha['auto_conf_prefix']}/status"
mb_rated_topic = "tsun/+/rated_load" # fixme
mb_reads_topic = "tsun/+/modbus_read_regs" # fixme
mb_inputs_topic = "tsun/+/modbus_read_inputs" # fixme
mb_at_cmd_topic = "tsun/+/at_cmd" # fixme
while True:
try:
async with self.__client:
logger_mqtt.info('MQTT broker connection established')
if self.__cb_MqttIsUp:
await self.__cb_MqttIsUp()
if self.__cb_mqtt_is_up:
await self.__cb_mqtt_is_up()
# async with self.__client.messages() as messages:
await self.__client.subscribe(ha_status_topic)
await self.__client.subscribe(mb_rated_topic)
await self.__client.subscribe(mb_reads_topic)
await self.__client.subscribe(mb_inputs_topic)
await self.__client.subscribe(mb_at_cmd_topic)
await self.__client.subscribe(self.ha_status_topic)
await self.__client.subscribe(self.mb_rated_topic)
await self.__client.subscribe(self.mb_out_coeff_topic)
await self.__client.subscribe(self.mb_reads_topic)
await self.__client.subscribe(self.mb_inputs_topic)
await self.__client.subscribe(self.mb_at_cmd_topic)
async for message in self.__client.messages:
if message.topic.matches(ha_status_topic):
status = message.payload.decode("UTF-8")
logger_mqtt.info('Home-Assistant Status:'
f' {status}')
if status == 'online':
self.ha_restarts += 1
await self.__cb_MqttIsUp()
if message.topic.matches(mb_rated_topic):
await self.modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
1, 0x2008)
if message.topic.matches(mb_reads_topic):
await self.modbus_cmd(message,
Modbus.READ_REGS, 2)
if message.topic.matches(mb_inputs_topic):
await self.modbus_cmd(message,
Modbus.READ_INPUTS, 2)
if message.topic.matches(mb_at_cmd_topic):
await self.at_cmd(message)
await self.dispatch_msg(message)
except aiomqtt.MqttError:
if Config.is_default('mqtt'):
@@ -126,46 +105,76 @@ class Mqtt(metaclass=Singleton):
f"Exception:\n"
f"{traceback.format_exc()}")
async def dispatch_msg(self, message):
if message.topic.matches(self.ha_status_topic):
status = message.payload.decode("UTF-8")
logger_mqtt.info('Home-Assistant Status:'
f' {status}')
if status == 'online':
self.ha_restarts += 1
await self.__cb_mqtt_is_up()
if message.topic.matches(self.mb_rated_topic):
await self.modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
1, 0x2008)
if message.topic.matches(self.mb_out_coeff_topic):
payload = message.payload.decode("UTF-8")
try:
val = round(float(payload) * 1024/100)
if val < 0 or val > 1024:
logger_mqtt.error('out_coeff: value must be in'
' the range 0..100,'
f' got: {payload}')
else:
await self.modbus_cmd(message,
Modbus.WRITE_SINGLE_REG,
0, 0x202c, val)
except Exception:
pass
if message.topic.matches(self.mb_reads_topic):
await self.modbus_cmd(message,
Modbus.READ_REGS, 2)
if message.topic.matches(self.mb_inputs_topic):
await self.modbus_cmd(message,
Modbus.READ_INPUTS, 2)
if message.topic.matches(self.mb_at_cmd_topic):
await self.at_cmd(message)
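The out_coeff handler rescales a percentage payload into the inverter's 0..1024 register range before writing register 0x202c: a payload of "50" becomes round(50 * 1024 / 100) = 512, and "100" maps to the full-scale value 1024; anything outside 0..100 is rejected with an error log.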
def each_inverter(self, message, func_name: str):
topic = str(message.topic)
node_id = topic.split('/')[1] + '/'
found = False
for m in Message:
if m.server_side and (m.node_id == node_id):
found = True
logger_mqtt.debug(f'Found: {node_id}')
fnc = getattr(m, func_name, None)
if callable(fnc):
yield fnc
else:
logger_mqtt.warning(f'Cmd not supported by: {node_id}')
break
if not found:
else:
logger_mqtt.warning(f'Node_id: {node_id} not found')
async def modbus_cmd(self, message, func, params=0, addr=0, val=0):
topic = str(message.topic)
node_id = topic.split('/')[1] + '/'
# refactor into a loop over a table
payload = message.payload.decode("UTF-8")
logger_mqtt.info(f'MODBUS via MQTT: {topic} = {payload}')
for m in Message:
if m.server_side and (m.node_id == node_id):
logger_mqtt.debug(f'Found: {node_id}')
fnc = getattr(m, "send_modbus_cmd", None)
res = payload.split(',')
if params != len(res):
logger_mqtt.error(f'Parameter expected: {params}, '
f'got: {len(res)}')
return
if callable(fnc):
if params == 1:
val = int(payload)
elif params == 2:
addr = int(res[0], base=16)
val = int(res[1]) # length
await fnc(func, addr, val, logging.INFO)
for fnc in self.each_inverter(message, "send_modbus_cmd"):
res = payload.split(',')
if params > 0 and params != len(res):
logger_mqtt.error(f'Parameter expected: {params}, '
f'got: {len(res)}')
return
if params == 1:
val = int(payload)
elif params == 2:
addr = int(res[0], base=16)
val = int(res[1]) # length
await fnc(func, addr, val, logging.INFO)
async def at_cmd(self, message):
payload = message.payload.decode("UTF-8")

app/src/protocol_ifc.py (new file, 17 lines)

@@ -0,0 +1,17 @@
from abc import abstractmethod
from async_ifc import AsyncIfc
from iter_registry import AbstractIterMeta
class ProtocolIfc(metaclass=AbstractIterMeta):
_registry = []
@abstractmethod
def __init__(self, addr, ifc: "AsyncIfc", server_side: bool,
client_mode: bool = False, id_str=b''):
pass # pragma: no cover
@abstractmethod
def close(self):
pass # pragma: no cover


@@ -1,18 +1,41 @@
import asyncio
import logging
import json
from config import Config
from itertools import chain
from cnf.config import Config
from mqtt import Mqtt
from infos import Infos
# logger = logging.getLogger('conn')
logger_mqtt = logging.getLogger('mqtt')
class Inverter():
class Proxy():
'''class Proxy is a base class
The class has some class methods for managing common resources like a
connection to the MQTT broker or proxy error counters, which are common
to all inverter connections
Instances of the class are connections to an inverter and can have an
optional link to a remote connection to the TSUN cloud. A remote
connection dies with the inverter connection.
class methods:
class_init(): initialize the common resources of the proxy (MQTT
broker, Proxy DB, etc). Must be called before the
first inverter instance can be created
class_close(): release the common resources of the proxy. Should not
be called until all instances of the class have been
destroyed
methods:
create_remote(): Establish a client connection to the TSUN cloud
async_publ_mqtt(): Publish data to MQTT broker
'''
@classmethod
def class_init(cls) -> None:
logging.debug('Inverter.class_init')
logging.debug('Proxy.class_init')
# initialize the proxy statistics
Infos.static_init()
cls.db_stat = Infos()
@@ -34,8 +57,9 @@ class Inverter():
# reset at midnight when you restart the proxy just before
# midnight!
inverters = Config.get('inverters')
# logger.debug(f'Inverters: {inverters}')
for inv in inverters.values():
batteries = Config.get('batteries')
# logger.debug(f'Proxys: {inverters}')
for _, inv in chain(inverters.items(), batteries.items()):
if (type(inv) is dict):
node_id = inv['node_id']
cls.db_stat.reg_clr_at_midnight(f'{cls.entity_prfx}{node_id}',
@@ -72,8 +96,8 @@ class Inverter():
Infos.new_stat_data[key] = False
@classmethod
def class_close(cls, loop) -> None:
logging.debug('Inverter.class_close')
def class_close(cls, loop) -> None: # pragma: no cover
logging.debug('Proxy.class_close')
logging.info('Close MQTT Task')
loop.run_until_complete(cls.mqtt.close())
cls.mqtt = None


@@ -1,16 +1,22 @@
import logging
import asyncio
import logging.handlers
import signal
import os
import argparse
from asyncio import StreamReader, StreamWriter
from aiohttp import web
from logging import config # noqa F401
from messages import Message
from inverter import Inverter
from proxy import Proxy
from inverter_ifc import InverterIfc
from gen3.inverter_g3 import InverterG3
from gen3plus.inverter_g3p import InverterG3P
from scheduler import Schedule
from config import Config
from cnf.config import Config
from cnf.config_read_env import ConfigReadEnv
from cnf.config_read_toml import ConfigReadToml
from cnf.config_read_json import ConfigReadJson
from modbus_tcp import ModbusTcp
routes = web.RouteTableDef()
proxy_is_up = False
@@ -37,9 +43,9 @@ async def healthy(request):
if proxy_is_up:
# logging.info('web reqeust healthy()')
for stream in Message:
for inverter in InverterIfc:
try:
res = stream.healthy()
res = inverter.healthy()
if not res:
return web.Response(status=503, text="I have a problem")
except Exception as err:
@@ -69,21 +75,14 @@ async def webserver(addr, port):
logging.debug('HTTP cleanup done')
async def handle_client(reader: StreamReader, writer: StreamWriter):
async def handle_client(reader: StreamReader, writer: StreamWriter, inv_class):
'''Handles a new incoming connection and starts an async loop'''
addr = writer.get_extra_info('peername')
await InverterG3(reader, writer, addr).server_loop(addr)
with inv_class(reader, writer) as inv:
await inv.local.ifc.server_loop()
async def handle_client_v2(reader: StreamReader, writer: StreamWriter):
'''Handles a new incoming connection and starts an async loop'''
addr = writer.get_extra_info('peername')
await InverterG3P(reader, writer, addr).server_loop(addr)
async def handle_shutdown(web_task):
async def handle_shutdown(loop, web_task):
'''Close all TCP connections and stop the event loop'''
logging.info('Shutdown due to SIGTERM')
@@ -93,28 +92,24 @@ async def handle_shutdown(web_task):
#
# first, disc all open TCP connections gracefully
#
for stream in Message:
try:
await asyncio.wait_for(stream.disc(), 2)
except Exception:
pass
for inverter in InverterIfc:
await inverter.disc(True)
logging.info('Proxy disconnecting done')
#
# second, close all open TCP connections
#
for stream in Message:
stream.close()
await asyncio.sleep(0.1) # give time for closing
logging.info('Proxy closing done')
#
# third, cancel the web server
# second, cancel the web server
#
web_task.cancel()
await web_task
#
# now cancel all remaining (pending) tasks
#
pending = asyncio.all_tasks()
for task in pending:
task.cancel()
#
# at last, start a coro for stopping the loop
#
@@ -122,56 +117,103 @@ async def handle_shutdown(web_task):
loop.stop()
def get_log_level() -> int:
def get_log_level() -> int | None:
'''checks if LOG_LVL is set in the environment and returns the
corresponding logging.LOG_LEVEL'''
log_level = os.getenv('LOG_LVL', 'INFO')
if log_level == 'DEBUG':
log_level = logging.DEBUG
elif log_level == 'WARN':
log_level = logging.WARNING
else:
log_level = logging.INFO
return log_level
switch = {
'DEBUG': logging.DEBUG,
'WARN': logging.WARNING,
'INFO': logging.INFO,
'ERROR': logging.ERROR,
}
log_level = os.getenv('LOG_LVL', None)
logging.info(f"LOG_LVL : {log_level}")
return switch.get(log_level, None)
if __name__ == "__main__":
def main(): # pragma: no cover
parser = argparse.ArgumentParser()
parser.add_argument('-c', '--config_path', type=str,
default='./config/',
help='set path for the configuration files')
parser.add_argument('-j', '--json_config', type=str,
help='read user config from json-file')
parser.add_argument('-t', '--toml_config', type=str,
help='read user config from toml-file')
parser.add_argument('-l', '--log_path', type=str,
default='./log/',
help='set path for the logging files')
parser.add_argument('-b', '--log_backups', type=int,
default=0,
help='set max number of daily log-files')
args = parser.parse_args()
#
# Setup our daily, rotating logger
#
serv_name = os.getenv('SERVICE_NAME', 'proxy')
version = os.getenv('VERSION', 'unknown')
logging.config.fileConfig('logging.ini')
logging.info(f'Server "{serv_name} - {version}" will be started')
setattr(logging.handlers, "log_path", args.log_path)
setattr(logging.handlers, "log_backups", args.log_backups)
os.makedirs(args.log_path, exist_ok=True)
# set lowest-severity for 'root', 'msg', 'conn' and 'data' logger
src_dir = os.path.dirname(__file__) + '/'
logging.config.fileConfig(src_dir + 'logging.ini')
logging.info(f'Server "{serv_name} - {version}" will be started')
logging.info(f'current dir: {os.getcwd()}')
logging.info(f"config_path: {args.config_path}")
logging.info(f"json_config: {args.json_config}")
logging.info(f"toml_config: {args.toml_config}")
logging.info(f"log_path: {args.log_path}")
if args.log_backups == 0:
logging.info("log_backups: unlimited")
else:
logging.info(f"log_backups: {args.log_backups} days")
log_level = get_log_level()
logging.getLogger().setLevel(log_level)
logging.getLogger('msg').setLevel(log_level)
logging.getLogger('conn').setLevel(log_level)
logging.getLogger('data').setLevel(log_level)
logging.getLogger('tracer').setLevel(log_level)
logging.getLogger('asyncio').setLevel(log_level)
# logging.getLogger('mqtt').setLevel(log_level)
logging.info('******')
if log_level:
# set lowest-severity for 'root', 'msg', 'conn' and 'data' logger
logging.getLogger().setLevel(log_level)
logging.getLogger('msg').setLevel(log_level)
logging.getLogger('conn').setLevel(log_level)
logging.getLogger('data').setLevel(log_level)
logging.getLogger('tracer').setLevel(log_level)
logging.getLogger('asyncio').setLevel(log_level)
# logging.getLogger('mqtt').setLevel(log_level)
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
# read config file
ConfigErr = Config.class_init()
if ConfigErr is not None:
logging.info(f'ConfigErr: {ConfigErr}')
Inverter.class_init()
Config.init(ConfigReadToml(src_dir + "cnf/default_config.toml"))
ConfigReadEnv()
ConfigReadJson(args.config_path + "config.json")
ConfigReadToml(args.config_path + "config.toml")
ConfigReadJson(args.json_config)
ConfigReadToml(args.toml_config)
config_err = Config.get_error()
if config_err is not None:
logging.info(f'config_err: {config_err}')
return
logging.info('******')
Proxy.class_init()
Schedule.start()
ModbusTcp(loop)
#
# Create tasks for our listening servers. These must be tasks! If we call
# start_server directly out of our main task, the eventloop will be blocked
# and we can't receive and handle the UNIX signals!
#
loop.create_task(asyncio.start_server(handle_client, '0.0.0.0', 5005))
loop.create_task(asyncio.start_server(handle_client_v2, '0.0.0.0', 10000))
for inv_class, port in [(InverterG3, 5005), (InverterG3P, 10000)]:
logging.info(f'listen on port: {port} for inverters')
loop.create_task(asyncio.start_server(lambda r, w, i=inv_class:
handle_client(r, w, i),
'0.0.0.0', port))
web_task = loop.create_task(webserver('0.0.0.0', 8127))
#
@@ -181,18 +223,22 @@ if __name__ == "__main__":
for signame in ('SIGINT', 'SIGTERM'):
loop.add_signal_handler(getattr(signal, signame),
lambda loop=loop: asyncio.create_task(
handle_shutdown(web_task)))
handle_shutdown(loop, web_task)))
loop.set_debug(log_level == logging.DEBUG)
try:
if ConfigErr is None:
proxy_is_up = True
global proxy_is_up
proxy_is_up = True
loop.run_forever()
except KeyboardInterrupt:
pass
finally:
logging.info("Event loop is stopped")
Inverter.class_close(loop)
Proxy.class_close(loop)
logging.debug('Close event loop')
loop.close()
logging.info(f'Finally, exit Server "{serv_name}"')
if __name__ == "__main__": # pragma: no cover
main()
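The comment above the start_server() calls carries the key design point: the listening servers are wrapped in loop.create_task() instead of being awaited, so main() keeps running and the event loop stays free to dispatch the SIGINT/SIGTERM handlers registered with add_signal_handler(). A compact, self-contained sketch of that pattern (the port and handler below are placeholders, not the proxy's real ones):

import asyncio
import signal

async def handle(reader, writer):
    writer.close()                      # placeholder client handler

def main():
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    # start_server() is scheduled as a task, so main() is not blocked and
    # the loop can still run the signal handlers registered below
    loop.create_task(asyncio.start_server(handle, '0.0.0.0', 5005))
    for signame in ('SIGINT', 'SIGTERM'):
        loop.add_signal_handler(getattr(signal, signame), loop.stop)
    try:
        loop.run_forever()
    finally:
        loop.close()

if __name__ == '__main__':
    main()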

View File

@@ -1,9 +1,14 @@
from weakref import WeakValueDictionary
class Singleton(type):
_instances = {}
_instances = WeakValueDictionary()
def __call__(cls, *args, **kwargs):
# logger_mqtt.debug('singleton: __call__')
if cls not in cls._instances:
cls._instances[cls] = super(Singleton,
cls).__call__(*args, **kwargs)
instance = super(Singleton,
cls).__call__(*args, **kwargs)
cls._instances[cls] = instance
return cls._instances[cls]
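Replacing the plain dict with a WeakValueDictionary means the metaclass no longer keeps its instances alive by itself: once the last strong reference goes away, the cached entry disappears and the next call builds a fresh object. That is exactly what the `del ifc` plus garbage-collection checks in the tests further down rely on. A small self-contained illustration (the Conn class is hypothetical):

import gc
from weakref import WeakValueDictionary

class Singleton(type):
    _instances = WeakValueDictionary()

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            # the local name keeps a strong reference until we return,
            # otherwise the weak entry could vanish right away
            instance = super().__call__(*args, **kwargs)
            cls._instances[cls] = instance
        return cls._instances[cls]

class Conn(metaclass=Singleton):
    pass

a = Conn()
assert a is Conn()      # cached while a strong reference exists
del a
gc.collect()            # the weakly referenced entry is gone ...
b = Conn()              # ... so a brand-new instance is created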

View File

@@ -0,0 +1,583 @@
# test_with_pytest.py
import pytest
import asyncio
import gc
import time
from infos import Infos
from inverter_base import InverterBase
from async_stream import AsyncStreamServer, AsyncStreamClient, StreamPtr
from messages import Message
from test_modbus_tcp import FakeReader, FakeWriter
from test_inverter_base import config_conn, patch_open_connection
pytest_plugins = ('pytest_asyncio',)
# initialize the proxy statistics
Infos.static_init()
class FakeProto(Message):
def __init__(self, ifc, server_side):
super().__init__('G3F', ifc, server_side, None, 10)
self.conn_no = 0
def mb_timout_cb(self, exp_cnt):
pass # empty callback
def fake_reader_fwd():
reader = FakeReader()
reader.test = FakeReader.RD_TEST_13_BYTES
reader.on_recv.set()
return reader
def test_timeout_cb():
reader = FakeReader()
writer = FakeWriter()
def timeout():
return 13
ifc = AsyncStreamClient(reader, writer, None, None)
assert 360 == ifc._AsyncStream__timeout()
ifc.prot_set_timeout_cb(timeout)
assert 13 == ifc._AsyncStream__timeout()
ifc.prot_set_timeout_cb(None)
assert 360 == ifc._AsyncStream__timeout()
# call healthy outside the context manager (__exit__() was called)
assert ifc.healthy()
del ifc
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
def test_health():
reader = FakeReader()
writer = FakeWriter()
ifc = AsyncStreamClient(reader, writer, None, None)
ifc.proc_start = time.time()
assert ifc.healthy()
ifc.proc_start = time.time() - 10
assert not ifc.healthy()
ifc.proc_start = None
assert ifc.healthy()
del ifc
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_close_cb():
assert asyncio.get_running_loop()
reader = FakeReader()
writer = FakeWriter()
cnt = 0
def timeout():
return 0.1
def closed():
nonlocal cnt
# The callback will be called after the AsyncStreamServer
# constructor has finished and so ifc must be defined in the
# upper scope
assert "ifc" in locals()
ifc.close() # clears the closed callback
cnt += 1
cnt = 0
ifc = AsyncStreamClient(reader, writer, None, closed)
ifc.prot_set_timeout_cb(timeout)
await ifc.client_loop('')
assert cnt == 1
ifc.prot_set_timeout_cb(timeout)
await ifc.client_loop('')
assert cnt == 1 # check that the closed method would not be called
del ifc
cnt = 0
ifc = AsyncStreamClient(reader, writer, None, None)
ifc.prot_set_timeout_cb(timeout)
await ifc.client_loop('')
assert cnt == 0
del ifc
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_read():
assert asyncio.get_running_loop()
reader = FakeReader()
reader.test = FakeReader.RD_TEST_13_BYTES
reader.on_recv.set()
writer = FakeWriter()
cnt = 0
def timeout():
return 1
def closed():
nonlocal cnt
# The callback will be called after the AsyncStreamServer
# constructor has finished and so ifc must be defined in the
# upper scope
assert "ifc" in locals()
ifc.close() # clears the closed callback
cnt += 1
def app_read():
ifc.proc_start -= 3
return 0.01 # async wait of 0.01
cnt = 0
ifc = AsyncStreamClient(reader, writer, None, closed)
ifc.proc_max = 0
ifc.prot_set_timeout_cb(timeout)
ifc.rx_set_cb(app_read)
await ifc.client_loop('')
print('End loop')
assert ifc.proc_max >= 3
assert 13 == ifc.rx_len()
assert cnt == 1
del ifc
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_write():
assert asyncio.get_running_loop()
reader = FakeReader()
reader.test = FakeReader.RD_TEST_13_BYTES
reader.on_recv.set()
writer = FakeWriter()
cnt = 0
def timeout():
return 1
def closed():
nonlocal cnt
# The callback will be called after the AsyncStreamServer
# constructor has finished and so ifc must be defined in the
# upper scope
assert "ifc" in locals()
ifc.close() # clears the closed callback
cnt += 1
def app_read():
ifc.proc_start -= 3
return 0.01 # async wait of 0.01
cnt = 0
ifc = AsyncStreamClient(reader, writer, None, closed)
ifc.proc_max = 10
ifc.prot_set_timeout_cb(timeout)
ifc.rx_set_cb(app_read)
ifc.tx_add(b'test-data-resp')
assert 14 == ifc.tx_len()
await ifc.client_loop('')
print('End loop')
assert ifc.proc_max >= 3
assert 13 == ifc.rx_len()
assert 0 == ifc.tx_len()
assert cnt == 1
del ifc
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_publ_mqtt_cb():
assert asyncio.get_running_loop()
reader = FakeReader()
reader.test = FakeReader.RD_TEST_13_BYTES
reader.on_recv.set()
writer = FakeWriter()
cnt = 0
def timeout():
return 0.1
async def publ_mqtt():
nonlocal cnt
cnt += 1
cnt = 0
ifc = AsyncStreamServer(reader, writer, publ_mqtt, None, None)
assert ifc.async_publ_mqtt
ifc.prot_set_timeout_cb(timeout)
await ifc.server_loop()
assert cnt == 3 # 2 calls in server_loop() and 1 in loop()
assert ifc.async_publ_mqtt
ifc.close() # clears the closed callback
assert not ifc.async_publ_mqtt
del ifc
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_create_remote_cb():
assert asyncio.get_running_loop()
reader = FakeReader()
writer = FakeWriter()
cnt = 0
def timeout():
return 0.1
async def create_remote():
nonlocal cnt
# The callback will be called after the AsyncStreamServer
# constructor has finished and so ifc must be defined in the
# upper scope
assert "ifc" in locals()
ifc.close() # clears the closed callback
cnt += 1
cnt = 0
ifc = AsyncStreamServer(reader, writer, None, create_remote, None)
assert ifc.create_remote
await ifc.create_remote()
assert cnt == 1
ifc.prot_set_timeout_cb(timeout)
await ifc.server_loop()
assert not ifc.create_remote
del ifc
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_sw_exception():
assert asyncio.get_running_loop()
reader = FakeReader()
reader.test = FakeReader.RD_TEST_SW_EXCEPT
reader.on_recv.set()
writer = FakeWriter()
cnt = 0
def timeout():
return 1
def closed():
nonlocal cnt
# The callback will be called after the AsyncStreamServer
# constructor has finished and so ifc must be defined in the
# upper scope
assert "ifc" in locals()
ifc.close() # clears the closed callback
cnt += 1
cnt = 0
ifc = AsyncStreamClient(reader, writer, None, closed)
ifc.prot_set_timeout_cb(timeout)
await ifc.client_loop('')
print('End loop')
assert cnt == 1
del ifc
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_os_error():
assert asyncio.get_running_loop()
reader = FakeReader()
reader.test = FakeReader.RD_TEST_OS_ERROR
reader.on_recv.set()
writer = FakeWriter()
cnt = 0
def timeout():
return 1
def closed():
nonlocal cnt
cnt += 1
cnt = 0
ifc = AsyncStreamClient(reader, writer, None, closed)
ifc.prot_set_timeout_cb(timeout)
await ifc.client_loop('')
print('End loop')
assert cnt == 1
del ifc
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
class TestType():
FWD_NO_EXCPT = 1
FWD_SW_EXCPT = 2
FWD_TIMEOUT = 3
FWD_OS_ERROR = 4
FWD_OS_ERROR_NO_STREAM = 5
FWD_RUNTIME_ERROR = 6
FWD_RUNTIME_ERROR_NO_STREAM = 7
def create_remote(remote, test_type, with_close_hdr:bool = False):
def update_hdr(buf):
return
def callback():
if test_type == TestType.FWD_SW_EXCPT:
remote.unknown_var += 1
elif test_type == TestType.FWD_TIMEOUT:
raise TimeoutError
elif test_type == TestType.FWD_OS_ERROR:
raise ConnectionRefusedError
elif test_type == TestType.FWD_OS_ERROR_NO_STREAM:
remote.stream = None
raise ConnectionRefusedError
elif test_type == TestType.FWD_RUNTIME_ERROR:
raise RuntimeError("Peer closed")
elif test_type == TestType.FWD_RUNTIME_ERROR_NO_STREAM:
remote.stream = None
raise RuntimeError("Peer closed")
return True
def close():
return
if with_close_hdr:
close_hndl = close
else:
close_hndl = None
remote.ifc = AsyncStreamClient(
FakeReader(), FakeWriter(), StreamPtr(None), close_hndl)
remote.ifc.prot_set_update_header_cb(update_hdr)
remote.ifc.prot_set_init_new_client_conn_cb(callback)
remote.stream = FakeProto(remote.ifc, False)
@pytest.mark.asyncio
async def test_forward():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
create_remote(remote, TestType.FWD_NO_EXCPT)
# The callback will be called after the AsyncStreamServer
# constructor has finished and so ifc must be defined in the
# upper scope
assert "ifc" in locals()
ifc.fwd_add(b'test-forward_msg2 ')
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_with_conn():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
create_remote(remote, TestType.FWD_NO_EXCPT)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 0
del ifc
@pytest.mark.asyncio
async def test_forward_no_conn():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_sw_except():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
create_remote(remote, TestType.FWD_SW_EXCPT)
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_os_error():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
create_remote(remote, TestType.FWD_OS_ERROR)
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_os_error2():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
create_remote(remote, TestType.FWD_OS_ERROR, True)
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_os_error3():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
create_remote(remote, TestType.FWD_OS_ERROR_NO_STREAM)
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_runtime_error():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
create_remote(remote, TestType.FWD_RUNTIME_ERROR)
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_runtime_error2():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
create_remote(remote, TestType.FWD_RUNTIME_ERROR, True)
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_runtime_error3():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
async def _create_remote():
nonlocal cnt
create_remote(remote, TestType.FWD_RUNTIME_ERROR_NO_STREAM, True)
cnt += 1
cnt = 0
ifc = AsyncStreamServer(fake_reader_fwd(), FakeWriter(), None, _create_remote, remote)
ifc.fwd_add(b'test-forward_msg')
await ifc.server_loop()
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_resp():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
def _close_cb():
nonlocal cnt
cnt += 1
cnt = 0
ifc = AsyncStreamClient(fake_reader_fwd(), FakeWriter(), remote, _close_cb)
create_remote(remote, TestType.FWD_NO_EXCPT)
ifc.fwd_add(b'test-forward_msg')
await ifc.client_loop('')
assert cnt == 1
del ifc
@pytest.mark.asyncio
async def test_forward_resp2():
assert asyncio.get_running_loop()
remote = StreamPtr(None)
cnt = 0
def _close_cb():
nonlocal cnt
cnt += 1
cnt = 0
ifc = AsyncStreamClient(fake_reader_fwd(), FakeWriter(), None, _close_cb)
create_remote(remote, TestType.FWD_NO_EXCPT)
ifc.fwd_add(b'test-forward_msg')
await ifc.client_loop('')
assert cnt == 1
del ifc

View File

@@ -0,0 +1,43 @@
# test_with_pytest.py
from byte_fifo import ByteFifo
def test_fifo():
read = ByteFifo()
assert 0 == len(read)
read += b'12'
assert 2 == len(read)
read += bytearray("34", encoding='UTF8')
assert 4 == len(read)
assert b'12' == read.peek(2)
assert 4 == len(read)
assert b'1234' == read.peek()
assert 4 == len(read)
assert b'12' == read.get(2)
assert 2 == len(read)
assert b'34' == read.get()
assert 0 == len(read)
def test_fifo_fmt():
read = ByteFifo()
read += b'1234'
assert b'1234' == read.peek()
assert " 0000 | 31 32 33 34 | 1234" == f'{read}'
def test_fifo_observer():
read = ByteFifo()
def _read():
assert b'1234' == read.get(4)
read += b'12'
assert 2 == len(read)
read()
read.reg_trigger(_read)
read += b'34'
assert 4 == len(read)
read()
assert 0 == len(read)
assert b'' == read.peek(2)
assert b'' == read.get(2)
assert 0 == len(read)
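The assertions above pin down the ByteFifo interface: in-place `+=` to append, `peek()` to read without consuming, `get()` to read and consume, a registered trigger fired by calling the object, and a hex-dump string representation. A minimal sketch that satisfies exactly these assertions (illustrative only; the real ByteFifo in byte_fifo.py may differ):

class MiniFifo:
    def __init__(self):
        self._buf = bytearray()
        self._trigger = None

    def __iadd__(self, data):            # fifo += b'...'
        self._buf += data
        return self

    def __len__(self):
        return len(self._buf)

    def peek(self, size=None):           # read without consuming
        return bytes(self._buf if size is None else self._buf[:size])

    def get(self, size=None):            # read and consume
        if size is None:
            size = len(self._buf)
        data = bytes(self._buf[:size])
        del self._buf[:size]
        return data

    def reg_trigger(self, cb):           # register an observer for new data
        self._trigger = cb

    def __call__(self):                  # fire the registered observer
        if self._trigger:
            self._trigger()

    def __str__(self):                   # simple hex dump, 16 bytes per row
        out = ''
        for ofs in range(0, len(self._buf), 16):
            chunk = self._buf[ofs:ofs + 16]
            hexpart = ' '.join(f'{b:02x}' for b in chunk)
            text = ''.join(chr(b) if 32 <= b < 127 else '.' for b in chunk)
            out += f' {ofs:04x} | {hexpart} | {text}'
        return out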

View File

@@ -1,17 +1,57 @@
# test_with_pytest.py
import tomllib
import pytest
import json
from mock import patch
from schema import SchemaMissingKeyError
from app.src.config import Config
from cnf.config import Config, ConfigIfc
from cnf.config_read_toml import ConfigReadToml
class TstConfig(Config):
class FakeBuffer:
rd = str()
test_buffer = FakeBuffer
class FakeFile():
def __init__(self):
self.buf = test_buffer
def __enter__(self):
return self
def __exit__(self, exc_type, exc, tb):
pass
class FakeOptionsFile(FakeFile):
def __init__(self, OpenTextMode):
super().__init__()
self.bin_mode = 'b' in OpenTextMode
def read(self):
if self.bin_mode:
return bytearray(self.buf.rd.encode('utf-8')).copy()
else:
return self.buf.rd
def patch_open():
def new_open(file: str, OpenTextMode="rb"):
if file == "_no__file__no_":
raise FileNotFoundError
return FakeOptionsFile(OpenTextMode)
with patch('builtins.open', new_open) as conn:
yield conn
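All the config tests below fake file I/O through a one-shot generator: patch_open() swaps builtins.open for a stand-in that serves test_buffer.rd and yields once, so the patch is active for exactly one `for` body. A self-contained sketch of the same idea (names are illustrative, not the repo's):

from unittest.mock import patch   # equivalent to the tests' `from mock import patch`

def fake_open(data: bytes):
    class _FakeFile:
        def __enter__(self): return self
        def __exit__(self, *exc): pass
        def read(self): return data
    with patch('builtins.open', lambda *a, **kw: _FakeFile()):
        yield   # the caller's loop body runs while open() is patched

for _ in fake_open(b"solarman.port = 5000"):
    with open('config/config.toml', 'rb') as f:
        assert f.read() == b"solarman.port = 5000"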
class TstConfig(ConfigIfc):
@classmethod
def set(cls, cnf):
cls.config = cnf
def __init__(cls, cnf):
cls.act_config = cnf
@classmethod
def _read_config_file(cls) -> dict:
return cls.config
def add_config(cls) -> dict:
return cls.act_config
def test_empty_config():
@@ -20,123 +60,427 @@ def test_empty_config():
Config.conf_schema.validate(cnf)
assert False
except SchemaMissingKeyError:
assert True
pass
@pytest.fixture
def ConfigDefault():
return {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'},
'inverters': {
'allow_all': False,
'R170000000000001': {
'suggested_area': '',
'modbus_polling': False,
'monitor_sn': 0,
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'sensor_list': 0
},
'Y170000000000001': {
'modbus_polling': True,
'monitor_sn': 2000000000,
'suggested_area': '',
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv3': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv4': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'sensor_list': 0
}
},
'batteries': {
'4100000000000001': {
'modbus_polling': True,
'monitor_sn': 3000000000,
'suggested_area': '',
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'sensor_list': 0,
}
}
}
@pytest.fixture
def ConfigComplete():
return {
'gen3plus': {
'at_acl': {
'mqtt': {'allow': ['AT+'], 'block': ['AT+SUPDATE']},
'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'],
'block': ['AT+SUPDATE']}
}
},
'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com',
'port': 5005},
'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com',
'port': 10000},
'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None},
'ha': {'auto_conf_prefix': 'homeassistant',
'discovery_prefix': 'homeassistant',
'entity_prefix': 'tsun',
'proxy_node_id': 'proxy',
'proxy_unique_id': 'P170000000000001'},
'inverters': {
'allow_all': False,
'R170000000000001': {'node_id': 'PV-Garage/',
'modbus_polling': False,
'monitor_sn': 0,
'pv1': {'manufacturer': 'man1',
'type': 'type1'},
'pv2': {'manufacturer': 'man2',
'type': 'type2'},
'suggested_area': 'Garage',
'sensor_list': 688},
'Y170000000000001': {'modbus_polling': True,
'monitor_sn': 2000000000,
'node_id': 'PV-Garage2/',
'pv1': {'manufacturer': 'man1',
'type': 'type1'},
'pv2': {'manufacturer': 'man2',
'type': 'type2'},
'pv3': {'manufacturer': 'man3',
'type': 'type3'},
'pv4': {'manufacturer': 'man4',
'type': 'type4'},
'suggested_area': 'Garage2',
'sensor_list': 688},
'Y170000000000002': {'modbus_polling': False,
'modbus_scanning': {
'bytes': 16,
'start': 2048,
'step': 1024
},
'monitor_sn': 2000000001,
'node_id': 'PV-Garage3/',
'suggested_area': 'Garage3',
'sensor_list': 688}
},
'batteries': {
'4100000000000001': {
'modbus_polling': True,
'monitor_sn': 3000000000,
'suggested_area': 'Garage3',
'node_id': 'Bat-Garage3/',
'pv1': {'manufacturer': 'man5',
'type': 'type5'},
'pv2': {'manufacturer': 'man6',
'type': 'type6'},
'sensor_list': 12326}
}
}
def test_default_config():
with open("app/config/default_config.toml", "rb") as f:
cnf = tomllib.load(f)
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
validated = Config.def_config
assert validated == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'},
'batteries': {
'4100000000000001': {
'modbus_polling': True,
'monitor_sn': 3000000000,
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'sensor_list': 0,
'suggested_area': ''
}
},
'inverters': {
'allow_all': False,
'R170000000000001': {
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'modbus_polling': False,
'monitor_sn': 0,
'suggested_area': '',
'sensor_list': 0},
'Y170000000000001': {
'modbus_polling': True,
'monitor_sn': 2000000000,
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv3': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv4': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'suggested_area': '',
'sensor_list': 0}}}
try:
validated = Config.conf_schema.validate(cnf)
assert True
except:
assert False
assert validated == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'}, 'inverters': {'allow_all': True, 'R170000000000001': {'node_id': '', 'monitor_sn': 0, 'suggested_area': ''}, 'Y170000000000001': {'monitor_sn': 2000000000, 'node_id': '', 'suggested_area': ''}}}
def test_full_config():
def test_full_config(ConfigComplete):
cnf = {'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005},
'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []},
'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}},
'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': ['AT+SUPDATE']},
'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': ['AT+SUPDATE']}}},
'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000},
'mqtt': {'host': 'mqtt', 'port': 1883, 'user': '', 'passwd': ''},
'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'},
'inverters': {'allow_all': True,
'R170000000000001': {'node_id': '', 'suggested_area': '', 'pv1': {'type': 'type1', 'manufacturer': 'man1'}, 'pv2': {'type': 'type2', 'manufacturer': 'man2'}, 'pv3': {'type': 'type3', 'manufacturer': 'man3'}},
'Y170000000000001': {'monitor_sn': 2000000000, 'node_id': '', 'suggested_area': ''}}}
'batteries': {
'4100000000000001': {'modbus_polling': True, 'monitor_sn': 3000000000, 'node_id': 'Bat-Garage3/', 'sensor_list': 0x3026, 'suggested_area': 'Garage3', 'pv1': {'type': 'type5', 'manufacturer': 'man5'}, 'pv2': {'type': 'type6', 'manufacturer': 'man6'}}
},
'inverters': {'allow_all': False,
'R170000000000001': {'modbus_polling': False, 'node_id': 'PV-Garage/', 'sensor_list': 0x02B0, 'suggested_area': 'Garage', 'pv1': {'type': 'type1', 'manufacturer': 'man1'}, 'pv2': {'type': 'type2', 'manufacturer': 'man2'}},
'Y170000000000001': {'modbus_polling': True, 'monitor_sn': 2000000000, 'node_id': 'PV-Garage2/', 'sensor_list': 0x02B0, 'suggested_area': 'Garage2', 'pv1': {'type': 'type1', 'manufacturer': 'man1'}, 'pv2': {'type': 'type2', 'manufacturer': 'man2'}, 'pv3': {'type': 'type3', 'manufacturer': 'man3'}, 'pv4': {'type': 'type4', 'manufacturer': 'man4'}},
'Y170000000000002': {'modbus_polling': False, 'monitor_sn': 2000000001, 'node_id': 'PV-Garage3/', 'sensor_list': 0x02B0, 'suggested_area': 'Garage3', 'modbus_scanning': {'start': 2048, 'step': 1024, 'bytes': 16}}
}
}
try:
validated = Config.conf_schema.validate(cnf)
assert True
except:
except Exception:
assert False
assert validated == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'}, 'inverters': {'allow_all': True, 'R170000000000001': {'node_id': '', 'monitor_sn': 0, 'pv1': {'manufacturer': 'man1','type': 'type1'},'pv2': {'manufacturer': 'man2','type': 'type2'},'pv3': {'manufacturer': 'man3','type': 'type3'}, 'suggested_area': ''}, 'Y170000000000001': {'monitor_sn': 2000000000, 'node_id': '', 'suggested_area': ''}}}
assert validated == ConfigComplete
def test_mininum_config():
cnf = {'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005},
'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+']},
'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE']}}},
'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000},
'mqtt': {'host': 'mqtt', 'port': 1883, 'user': '', 'passwd': ''},
'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'},
'inverters': {'allow_all': True,
'R170000000000001': {}}
}
try:
validated = Config.conf_schema.validate(cnf)
assert True
except:
assert False
assert validated == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'}, 'inverters': {'allow_all': True, 'R170000000000001': {'node_id': '', 'monitor_sn': 0, 'suggested_area': ''}}}
def test_read_empty():
cnf = {}
TstConfig.set(cnf)
err = TstConfig.read('app/config/')
assert err == None
cnf = TstConfig.get()
assert cnf == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'}, 'inverters': {'allow_all': True, 'R170000000000001': {'suggested_area': '', 'monitor_sn': 0, 'node_id': ''}, 'Y170000000000001': {'monitor_sn': 2000000000, 'suggested_area': '', 'node_id': ''}}}
def test_read_empty(ConfigDefault):
test_buffer.rd = ""
defcnf = TstConfig.def_config.get('solarman')
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadToml("config/config.toml")
err = Config.get_error()
assert err == None
cnf = Config.get()
assert cnf == ConfigDefault
defcnf = Config.def_config.get('solarman')
assert defcnf == {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}
assert True == TstConfig.is_default('solarman')
assert True == Config.is_default('solarman')
def test_no_file():
cnf = {}
TstConfig.set(cnf)
err = TstConfig.read('')
Config.init(ConfigReadToml("default_config.toml"))
err = Config.get_error()
assert err == "Config.read: [Errno 2] No such file or directory: 'default_config.toml'"
cnf = TstConfig.get()
cnf = Config.get()
assert cnf == {}
defcnf = TstConfig.def_config.get('solarman')
defcnf = Config.def_config.get('solarman')
assert defcnf == None
def test_read_cnf1():
cnf = {'solarman' : {'enabled': False}}
TstConfig.set(cnf)
err = TstConfig.read('app/config/')
def test_no_file2():
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
assert Config.err == None
ConfigReadToml("_no__file__no_")
err = Config.get_error()
assert err == None
cnf = TstConfig.get()
assert cnf == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': False, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'}, 'inverters': {'allow_all': True, 'R170000000000001': {'suggested_area': '', 'monitor_sn': 0, 'node_id': ''}, 'Y170000000000001': {'monitor_sn': 2000000000, 'suggested_area': '', 'node_id': ''}}}
cnf = TstConfig.get('solarman')
def test_invalid_filename():
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
assert Config.err == None
ConfigReadToml(None)
err = Config.get_error()
assert err == None
def test_read_cnf1():
test_buffer.rd = "solarman.enabled = false"
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadToml("config/config.toml")
err = Config.get_error()
assert err == None
cnf = Config.get()
assert cnf == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': False, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'},
'batteries': {
'4100000000000001': {
'modbus_polling': True,
'monitor_sn': 3000000000,
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'sensor_list': 0,
'suggested_area': ''
}
},
'inverters': {
'allow_all': False,
'R170000000000001': {
'suggested_area': '',
'modbus_polling': False,
'monitor_sn': 0,
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'sensor_list': 0
},
'Y170000000000001': {
'modbus_polling': True,
'monitor_sn': 2000000000,
'suggested_area': '',
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv3': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv4': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'sensor_list': 0
}
}
}
cnf = Config.get('solarman')
assert cnf == {'enabled': False, 'host': 'iot.talent-monitoring.com', 'port': 10000}
defcnf = TstConfig.def_config.get('solarman')
defcnf = Config.def_config.get('solarman')
assert defcnf == {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}
assert False == TstConfig.is_default('solarman')
assert False == Config.is_default('solarman')
def test_read_cnf2():
cnf = {'solarman' : {'enabled': 'FALSE'}}
TstConfig.set(cnf)
err = TstConfig.read('app/config/')
assert err == None
cnf = TstConfig.get()
assert cnf == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'}, 'inverters': {'allow_all': True, 'R170000000000001': {'suggested_area': '', 'monitor_sn': 0, 'node_id': ''}, 'Y170000000000001': {'monitor_sn': 2000000000, 'suggested_area': '', 'node_id': ''}}}
assert True == TstConfig.is_default('solarman')
test_buffer.rd = "solarman.enabled = 'FALSE'"
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadToml("config/config.toml")
err = Config.get_error()
def test_read_cnf3():
cnf = {'solarman' : {'port': 'FALSE'}}
TstConfig.set(cnf)
err = TstConfig.read('app/config/')
assert err == 'Config.read: Key \'solarman\' error:\nKey \'port\' error:\nint(\'FALSE\') raised ValueError("invalid literal for int() with base 10: \'FALSE\'")'
cnf = TstConfig.get()
assert cnf == {'solarman': {'port': 'FALSE'}}
assert err == None
cnf = Config.get()
assert cnf == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 10000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'},
'batteries': {
'4100000000000001': {
'modbus_polling': True,
'monitor_sn': 3000000000,
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'sensor_list': 0,
'suggested_area': ''
}
},
'inverters': {
'allow_all': False,
'R170000000000001': {
'suggested_area': '',
'modbus_polling': False,
'monitor_sn': 0,
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'sensor_list': 0
},
'Y170000000000001': {
'modbus_polling': True,
'monitor_sn': 2000000000,
'suggested_area': '',
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv3': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv4': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'sensor_list': 0
}
}
}
assert True == Config.is_default('solarman')
def test_read_cnf3(ConfigDefault):
test_buffer.rd = "solarman.port = 'FALSE'"
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadToml("config/config.toml")
err = Config.get_error()
assert err == 'error: Key \'solarman\' error:\nKey \'port\' error:\nint(\'FALSE\') raised ValueError("invalid literal for int() with base 10: \'FALSE\'")'
cnf = Config.get()
assert cnf == ConfigDefault
def test_read_cnf4():
cnf = {'solarman' : {'port': 5000}}
TstConfig.set(cnf)
err = TstConfig.read('app/config/')
test_buffer.rd = "solarman.port = 5000"
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadToml("config/config.toml")
err = Config.get_error()
assert err == None
cnf = TstConfig.get()
assert cnf == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 5000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'}, 'inverters': {'allow_all': True, 'R170000000000001': {'suggested_area': '', 'monitor_sn': 0, 'node_id': ''}, 'Y170000000000001': {'monitor_sn': 2000000000, 'suggested_area': '', 'node_id': ''}}}
assert False == TstConfig.is_default('solarman')
cnf = Config.get()
assert cnf == {'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': []}, 'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'], 'block': []}}}, 'tsun': {'enabled': True, 'host': 'logger.talent-monitoring.com', 'port': 5005}, 'solarman': {'enabled': True, 'host': 'iot.talent-monitoring.com', 'port': 5000}, 'mqtt': {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}, 'ha': {'auto_conf_prefix': 'homeassistant', 'discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun', 'proxy_node_id': 'proxy', 'proxy_unique_id': 'P170000000000001'},
'batteries': {
'4100000000000001': {
'modbus_polling': True,
'monitor_sn': 3000000000,
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'sensor_list': 0,
'suggested_area': ''
}
},
'inverters': {
'allow_all': False,
'R170000000000001': {
'suggested_area': '',
'modbus_polling': False,
'monitor_sn': 0,
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-395M'},
'sensor_list': 0
},
'Y170000000000001': {
'modbus_polling': True,
'monitor_sn': 2000000000,
'suggested_area': '',
'node_id': '',
'pv1': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv2': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv3': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'pv4': {'manufacturer': 'Risen',
'type': 'RSM40-8-410M'},
'sensor_list': 0
}
}
}
assert False == Config.is_default('solarman')
def test_read_cnf5():
cnf = {'solarman' : {'port': 1023}}
TstConfig.set(cnf)
err = TstConfig.read('app/config/')
test_buffer.rd = "solarman.port = 1023"
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadToml("config/config.toml")
err = Config.get_error()
assert err != None
def test_read_cnf6():
cnf = {'solarman' : {'port': 65536}}
TstConfig.set(cnf)
err = TstConfig.read('app/config/')
test_buffer.rd = "solarman.port = 65536"
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadToml("config/config.toml")
err = Config.get_error()
assert err != None

View File

@@ -0,0 +1,53 @@
# test_with_pytest.py
import pytest
import os
from mock import patch
from cnf.config import Config
from cnf.config_read_toml import ConfigReadToml
from cnf.config_read_env import ConfigReadEnv
def patch_getenv():
def new_getenv(key: str, defval=None):
"""Get an environment variable, return None if it doesn't exist.
The optional second argument can specify an alternate default. key,
default and the result are str."""
if key == 'MQTT_PASSWORD':
return 'passwd'
elif key == 'MQTT_PORT':
return 1234
elif key == 'MQTT_HOST':
return ""
return defval
with patch.object(os, 'getenv', new_getenv) as conn:
yield conn
def test_extend_key():
cnf_rd = ConfigReadEnv()
conf = {}
cnf_rd._extend_key(conf, "mqtt.user", "testuser")
assert conf == {
'mqtt': {
'user': 'testuser',
},
}
conf = {}
cnf_rd._extend_key(conf, "mqtt", "testuser")
assert conf == {
'mqtt': 'testuser',
}
conf = {}
cnf_rd._extend_key(conf, "", "testuser")
assert conf == {'': 'testuser'}
def test_read_env_config():
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
assert Config.get('mqtt') == {'host': 'mqtt', 'port': 1883, 'user': None, 'passwd': None}
for _ in patch_getenv():
ConfigReadEnv()
assert Config.get_error() == None
assert Config.get('mqtt') == {'host': 'mqtt', 'port': 1234, 'user': None, 'passwd': 'passwd'}

View File

@@ -0,0 +1,434 @@
# test_with_pytest.py
import pytest
from mock import patch
from cnf.config import Config
from cnf.config_read_json import ConfigReadJson
from cnf.config_read_toml import ConfigReadToml
from test_config import ConfigDefault, ConfigComplete
class CnfIfc(ConfigReadJson):
def __init__(self):
pass
class FakeBuffer:
rd = str()
wr = str()
test_buffer = FakeBuffer
class FakeFile():
def __init__(self):
self.buf = test_buffer
def __enter__(self):
return self
def __exit__(self, exc_type, exc, tb):
pass
class FakeOptionsFile(FakeFile):
def __init__(self, OpenTextMode):
super().__init__()
self.bin_mode = 'b' in OpenTextMode
def read(self):
print(f"Fake.read: bmode:{self.bin_mode}")
if self.bin_mode:
return bytearray(self.buf.rd.encode('utf-8')).copy()
else:
print(f"Fake.read: str:{self.buf.rd}")
return self.buf.rd
def patch_open():
def new_open(file: str, OpenTextMode="r"):
if file == "_no__file__no_":
raise FileNotFoundError
return FakeOptionsFile(OpenTextMode)
with patch('builtins.open', new_open) as conn:
yield conn
@pytest.fixture
def ConfigTomlEmpty():
return {
'mqtt': {'host': 'mqtt', 'port': 1883, 'user': '', 'passwd': ''},
'ha': {'auto_conf_prefix': 'homeassistant',
'discovery_prefix': 'homeassistant',
'entity_prefix': 'tsun',
'proxy_node_id': 'proxy',
'proxy_unique_id': 'P170000000000001'},
'solarman': {
'enabled': True,
'host': 'iot.talent-monitoring.com',
'port': 10000,
},
'tsun': {
'enabled': True,
'host': 'logger.talent-monitoring.com',
'port': 5005,
},
'inverters': {
'allow_all': False
},
'gen3plus': {'at_acl': {'tsun': {'allow': [], 'block': []},
'mqtt': {'allow': [], 'block': []}}},
}
def test_no_config(ConfigDefault):
test_buffer.rd = "" # empty buffer, no json
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadJson()
err = Config.get_error()
assert err == 'error: Expecting value: line 1 column 1 (char 0)'
cnf = Config.get()
assert cnf == ConfigDefault
def test_no_file(ConfigDefault):
test_buffer.rd = "" # empty buffer, no json
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadJson("_no__file__no_")
err = Config.get_error()
assert err == None
cnf = Config.get()
assert cnf == ConfigDefault
def test_invalid_filename(ConfigDefault):
test_buffer.rd = "" # empty buffer, no json
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadJson(None)
err = Config.get_error()
assert err == None
cnf = Config.get()
assert cnf == ConfigDefault
def test_cnv1():
"""test dotted key converting"""
tst = {
"gen3plus.at_acl.mqtt.block": [
"AT+SUPDATE",
"AT+"
]
}
cnf = ConfigReadJson()
obj = cnf.convert_to_obj(tst)
assert obj == {
'gen3plus': {
'at_acl': {
'mqtt': {
'block': [
'AT+SUPDATE',
"AT+"
],
},
},
},
}
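The docstring of test_cnv1 names the behaviour under test: a dotted key such as "gen3plus.at_acl.mqtt.block" is expanded into nested dictionaries. A hypothetical helper showing the expansion the test expects (the repo's convert_to_obj() handles more cases, e.g. the inverter arrays tested next):

def expand_dotted_keys(flat: dict) -> dict:
    nested: dict = {}
    for dotted, value in flat.items():
        node = nested
        parts = dotted.split('.')
        for part in parts[:-1]:
            node = node.setdefault(part, {})   # descend, creating levels
        node[parts[-1]] = value
    return nested

assert expand_dotted_keys(
    {"gen3plus.at_acl.mqtt.block": ["AT+SUPDATE", "AT+"]}
) == {'gen3plus': {'at_acl': {'mqtt': {'block': ['AT+SUPDATE', 'AT+']}}}}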
def test_cnv2():
"""test a valid list with serials in inverters"""
tst = {
"inverters": [
{
"serial": "R170000000000001",
},
{
"serial": "Y170000000000001",
}
],
}
cnf = ConfigReadJson()
obj = cnf.convert_to_obj(tst)
assert obj == {
'inverters': {
'R170000000000001': {},
'Y170000000000001': {}
},
}
def test_cnv3():
"""test the combination of a list and a scalar in inverters"""
tst = {
"inverters": [
{
"serial": "R170000000000001",
},
{
"serial": "Y170000000000001",
}
],
"inverters.allow_all": False,
}
cnf = ConfigReadJson()
obj = cnf.convert_to_obj(tst)
assert obj == {
'inverters': {
'R170000000000001': {},
'Y170000000000001': {},
'allow_all': False,
},
}
def test_cnv4():
tst = {
"inverters": [
{
"serial": "R170000000000001",
"node_id": "PV-Garage/",
"suggested_area": "Garage",
"modbus_polling": False,
"pv1.manufacturer": "man1",
"pv1.type": "type1",
"pv2.manufacturer": "man2",
"pv2.type": "type2",
"sensor_list": 688
},
{
"serial": "Y170000000000001",
"monitor_sn": 2000000000,
"node_id": "PV-Garage2/",
"suggested_area": "Garage2",
"modbus_polling": True,
"client_mode.host": "InverterIP",
"client_mode.port": 1234,
"client_mode.forward": True,
"pv1.manufacturer": "man1",
"pv1.type": "type1",
"pv2.manufacturer": "man2",
"pv2.type": "type2",
"pv3.manufacturer": "man3",
"pv3.type": "type3",
"pv4.manufacturer": "man4",
"pv4.type": "type4",
"sensor_list": 688
}
],
"tsun.enabled": True,
"solarman.enabled": True,
"inverters.allow_all": False,
"gen3plus.at_acl.tsun.allow": [
"AT+Z",
"AT+UPURL",
"AT+SUPDATE"
],
"gen3plus.at_acl.tsun.block": [
"AT+SUPDATE"
],
"gen3plus.at_acl.mqtt.allow": [
"AT+"
],
"gen3plus.at_acl.mqtt.block": [
"AT+SUPDATE"
]
}
cnf = ConfigReadJson()
obj = cnf.convert_to_obj(tst)
assert obj == {
'gen3plus': {'at_acl': {'mqtt': {'allow': ['AT+'], 'block': ['AT+SUPDATE']},
'tsun': {'allow': ['AT+Z', 'AT+UPURL', 'AT+SUPDATE'],
'block': ['AT+SUPDATE']}}},
'inverters': {'R170000000000001': {'modbus_polling': False,
'node_id': 'PV-Garage/',
'pv1': {
'manufacturer': 'man1',
'type': 'type1'},
'pv2': {
'manufacturer': 'man2',
'type': 'type2'},
'sensor_list': 688,
'suggested_area': 'Garage'},
'Y170000000000001': {'client_mode': {
'host': 'InverterIP',
'port': 1234,
'forward': True},
'modbus_polling': True,
'monitor_sn': 2000000000,
'node_id': 'PV-Garage2/',
'pv1': {
'manufacturer': 'man1',
'type': 'type1'},
'pv2': {
'manufacturer': 'man2',
'type': 'type2'},
'pv3': {
'manufacturer': 'man3',
'type': 'type3'},
'pv4': {
'manufacturer': 'man4',
'type': 'type4'},
'sensor_list': 688,
'suggested_area': 'Garage2'},
'allow_all': False},
'solarman': {'enabled': True},
'tsun': {'enabled': True}
}
def test_cnv5():
"""test a invalid list with missing serials"""
tst = {
"inverters": [
{
"node_id": "PV-Garage1/",
},
{
"serial": "Y170000000000001",
"node_id": "PV-Garage2/",
}
],
}
cnf = ConfigReadJson()
obj = cnf.convert_to_obj(tst)
assert obj == {
'inverters': {
'Y170000000000001': {'node_id': 'PV-Garage2/'}
},
}
def test_cnv6():
"""test overwritting a value in inverters"""
tst = {
"inverters": [{
"serial": "Y170000000000001",
"node_id": "PV-Garage2/",
}],
}
tst2 = {
"inverters": [{
"serial": "Y170000000000001",
"node_id": "PV-Garden/",
}],
}
cnf = ConfigReadJson()
conf = {}
for key, val in tst.items():
cnf.convert_inv_arr(conf, key, val)
assert conf == {
'inverters': {
'Y170000000000001': {'node_id': 'PV-Garage2/'}
},
}
for key, val in tst2.items():
cnf.convert_inv_arr(conf, key, val)
assert conf == {
'inverters': {
'Y170000000000001': {'node_id': 'PV-Garden/'}
},
}
def test_empty_config(ConfigDefault):
test_buffer.rd = "{}" # empty json
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadJson()
err = Config.get_error()
assert err == None
cnf = Config.get()
assert cnf == ConfigDefault
def test_full_config(ConfigComplete):
test_buffer.rd = """
{
"inverters": [
{
"serial": "R170000000000001",
"node_id": "PV-Garage/",
"suggested_area": "Garage",
"modbus_polling": false,
"pv1.manufacturer": "man1",
"pv1.type": "type1",
"pv2.manufacturer": "man2",
"pv2.type": "type2",
"sensor_list": 688
},
{
"serial": "Y170000000000001",
"monitor_sn": 2000000000,
"node_id": "PV-Garage2/",
"suggested_area": "Garage2",
"modbus_polling": true,
"pv1.manufacturer": "man1",
"pv1.type": "type1",
"pv2.manufacturer": "man2",
"pv2.type": "type2",
"pv3.manufacturer": "man3",
"pv3.type": "type3",
"pv4.manufacturer": "man4",
"pv4.type": "type4",
"sensor_list": 688
},
{
"serial": "Y170000000000002",
"monitor_sn": 2000000001,
"modbus_polling": false,
"modbus_scanning.start": 2048,
"node_id": "PV-Garage3",
"suggested_area": "Garage3",
"sensor_list": 688
}
],
"batteries": [
{
"serial": "4100000000000001",
"modbus_polling": true,
"monitor_sn": 3000000000,
"node_id": "Bat-Garage3",
"suggested_area": "Garage3",
"pv1.manufacturer": "man5",
"pv1.type": "type5",
"pv2.manufacturer": "man6",
"pv2.type": "type6",
"sensor_list": 12326
}
],
"tsun.enabled": true,
"solarman.enabled": true,
"inverters.allow_all": false,
"gen3plus.at_acl.tsun.allow": [
"AT+Z",
"AT+UPURL",
"AT+SUPDATE"
],
"gen3plus.at_acl.tsun.block": [
"AT+SUPDATE"
],
"gen3plus.at_acl.mqtt.allow": [
"AT+"
],
"gen3plus.at_acl.mqtt.block": [
"AT+SUPDATE"
]
}
"""
Config.init(ConfigReadToml("app/src/cnf/default_config.toml"))
for _ in patch_open():
ConfigReadJson()
err = Config.get_error()
assert err == None
cnf = Config.get()
assert cnf == ConfigComplete

View File

@@ -1,9 +1,9 @@
# test_with_pytest.py
import pytest
import json
import json, math
import logging
from app.src.infos import Register, ClrAtMidnight
from app.src.infos import Infos
from infos import Register, ClrAtMidnight
from infos import Infos, Fmt
def test_statistic_counter():
i = Infos()
@@ -17,13 +17,13 @@ def test_statistic_counter():
assert val == None or val == 0
i.static_init() # initialize counter
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "Modbus_Command": 0}})
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 0, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "Modbus_Command": 0}})
val = i.dev_value(Register.INVERTER_CNT) # valid and initialized addr
assert val == 0
i.inc_counter('Inverter_Cnt')
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 1, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "Modbus_Command": 0}})
assert json.dumps(i.stat) == json.dumps({"proxy": {"Inverter_Cnt": 1, "Cloud_Conn_Cnt": 0, "Unknown_SNR": 0, "Unknown_Msg": 0, "Invalid_Data_Type": 0, "Internal_Error": 0,"Unknown_Ctrl": 0, "OTA_Start_Msg": 0, "SW_Exception": 0, "Invalid_Msg_Format": 0, "AT_Command": 0, "AT_Command_Blocked": 0, "Modbus_Command": 0}})
val = i.dev_value(Register.INVERTER_CNT)
assert val == 1
@@ -77,7 +77,7 @@ def test_table_definition():
for d_json, comp, node_id, id in i.ha_proxy_confs(ha_prfx="tsun/", node_id = 'proxy/', snr = '456'):
pass
pass # side effect is calling the generator i.ha_proxy_confs()
val = i.dev_value(Register.INTERNAL_ERROR) # check internal error counter
assert val == 0
@@ -123,6 +123,30 @@ def test_table_definition():
val = i.dev_value(Register.INTERNAL_ERROR) # check internal error counter
assert val == 3
def test_table_remove():
i = Infos()
i.static_init() # initialize counter
val = i.dev_value(Register.INTERNAL_ERROR) # check internal error counter
assert val == 0
# for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123', sug_area = 'roof'):
# pass
test = 0
for reg in Register:
res = i.ha_remove(reg, node_id="garagendach/", snr='123') # noqa: E501
if reg == Register.INVERTER_STATUS:
test += 1
assert res == ('{}', 'sensor', 'garagendach/', 'inv_status_123')
elif reg == Register.COLLECT_INTERVAL:
test += 1
assert res == ('{}', 'sensor', 'garagendach/', 'data_collect_intval_123')
assert test == 2
val = i.dev_value(Register.INTERNAL_ERROR) # check internal error counter
assert val == 0
def test_clr_at_midnight():
i = Infos()
i.static_init() # initialize counter
@@ -198,24 +222,24 @@ def test_get_value():
i.set_db_def_value(Register.PV2_VOLTAGE, 30.3)
assert 30 == i.get_db_value(Register.PV1_VOLTAGE, None)
assert 30.3 == i.get_db_value(Register.PV2_VOLTAGE, None)
assert math.isclose(30.3, i.get_db_value(Register.PV2_VOLTAGE, None), rel_tol=1e-09, abs_tol=1e-09)
def test_update_value():
i = Infos()
assert None == i.get_db_value(Register.PV1_VOLTAGE, None)
keys = i.info_defs[Register.PV1_VOLTAGE]['name']
name, update = i.update_db(keys, True, 30)
_, update = i.update_db(keys, True, 30)
assert update == True
assert 30 == i.get_db_value(Register.PV1_VOLTAGE, None)
keys = i.info_defs[Register.PV1_VOLTAGE]['name']
name, update = i.update_db(keys, True, 30)
_, update = i.update_db(keys, True, 30)
assert update == False
assert 30 == i.get_db_value(Register.PV1_VOLTAGE, None)
keys = i.info_defs[Register.PV1_VOLTAGE]['name']
name, update = i.update_db(keys, False, 29)
_, update = i.update_db(keys, False, 29)
assert update == True
assert 29 == i.get_db_value(Register.PV1_VOLTAGE, None)
@@ -232,3 +256,24 @@ def test_key_obj():
assert level == logging.DEBUG
assert unit == 'kWh'
assert must_incr == True
def test_hex4_cnv():
tst_val = (0x12ef, )
string = Fmt.hex4(tst_val)
assert string == '12ef'
val = Fmt.hex4(string, reverse=True)
assert val == tst_val[0]
def test_mac_cnv():
tst_val = (0x12, 0x34, 0x67, 0x89, 0xcd, 0xef)
string = Fmt.mac(tst_val)
assert string == '12:34:67:89:cd:ef'
val = Fmt.mac(string, reverse=True)
assert val == tst_val
def test_version_cnv():
tst_val = (0x123f, )
string = Fmt.version(tst_val)
assert string == 'V1.2.3F'
val = Fmt.version(string, reverse=True)
assert val == tst_val[0]
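These three tests outline the Fmt conversion helpers: hex4() renders a 16-bit value as four hex digits, mac() joins six bytes with colons, and version() turns a packed nibble value such as 0x123f into 'V1.2.3F'; each accepts reverse=True to parse the string back. A minimal stand-in consistent with the assertions above (the real Fmt class in infos.py may be implemented differently):

class FmtSketch:
    @staticmethod
    def hex4(val, reverse=False):
        if reverse:
            return int(val, 16)            # '12ef' -> 0x12ef
        return f'{val[0]:04x}'             # (0x12ef,) -> '12ef'

    @staticmethod
    def mac(val, reverse=False):
        if reverse:
            return tuple(int(b, 16) for b in val.split(':'))
        return ':'.join(f'{b:02x}' for b in val)

    @staticmethod
    def version(val, reverse=False):
        if reverse:                        # 'V1.2.3F' -> 0x123f
            return int(val[1:].replace('.', ''), 16)
        n = val[0]                         # (0x123f,) -> 'V1.2.3F'
        return (f'V{(n >> 12) & 0xf}.{(n >> 8) & 0xf}.'
                f'{(n >> 4) & 0xf}{n & 0xf:X}')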

View File

@@ -1,10 +1,10 @@
# test_with_pytest.py
import pytest, json
from app.src.infos import Register, ClrAtMidnight
from app.src.gen3.infos_g3 import InfosG3
import pytest, json, math
from infos import Register
from gen3.infos_g3 import InfosG3, RegisterMap
@pytest.fixture
def ContrDataSeq(): # Get Time Request message
def contr_data_seq(): # Get Time Request message
msg = b'\x00\x00\x00\x15\x00\x09\x2b\xa8\x54\x10\x52\x53\x57\x5f\x34\x30\x30\x5f\x56\x31\x2e\x30\x30\x2e\x30\x36\x00\x09\x27\xc0\x54\x06\x52\x61\x79\x6d\x6f'
msg += b'\x6e\x00\x09\x2f\x90\x54\x0b\x52\x53\x57\x2d\x31\x2d\x31\x30\x30\x30\x31\x00\x09\x5a\x88\x54\x0f\x74\x2e\x72\x61\x79\x6d\x6f\x6e\x69\x6f\x74\x2e\x63\x6f\x6d\x00\x09\x5a\xec\x54'
msg += b'\x1c\x6c\x6f\x67\x67\x65\x72\x2e\x74\x61\x6c\x65\x6e\x74\x2d\x6d\x6f\x6e\x69\x74\x6f\x72\x69\x6e\x67\x2e\x63\x6f\x6d\x00\x0d\x00\x20\x49\x00\x00\x00\x01\x00\x0c\x35\x00\x49\x00'
@@ -14,7 +14,7 @@ def ContrDataSeq(): # Get Time Request message
return msg
@pytest.fixture
def Contr2DataSeq(): # Get Time Request message
def contr2_data_seq(): # Get Time Request message
msg = b'\x00\x00\x00\x39\x00\x09\x2b\xa8\x54\x10\x52'
msg += b'\x53\x57\x5f\x34\x30\x30\x5f\x56\x31\x2e\x30\x30\x2e\x32\x30\x00'
msg += b'\x09\x27\xc0\x54\x06\x52\x61\x79\x6d\x6f\x6e\x00\x09\x2f\x90\x54'
@@ -94,19 +94,99 @@ def Contr2DataSeq(): # Get Time Request message
return msg
@pytest.fixture
def InvDataSeq(): # Data indication from the controller
def contr3_data_seq(): # Get Time Request message
msg = b'\x00\x00\x00\x39\x00\x09\x2b\xa8\x54\x10\x52' # | ..^.....9..+.T.R
msg += b'\x53\x57\x5f\x34\x30\x30\x5f\x56\x32\x2e\x30\x31\x2e\x31\x33\x00' # | SW_400_V2.01.13.
msg += b'\x09\x27\xc0\x54\x06\x52\x61\x79\x6d\x6f\x6e\x00\x09\x2f\x90\x54' # | .'.T.Raymon../.T
msg += b'\x0b\x52\x53\x57\x2d\x31\x2d\x31\x30\x30\x30\x31\x00\x09\x5a\x88' # | .RSW-1-10001..Z.
msg += b'\x54\x0f\x74\x2e\x72\x61\x79\x6d\x6f\x6e\x69\x6f\x74\x2e\x63\x6f' # | T.t.raymoniot.co
msg += b'\x6d\x00\x09\x5a\xec\x54\x1c\x6c\x6f\x67\x67\x65\x72\x2e\x74\x61' # | m..Z.T.logger.ta
msg += b'\x6c\x65\x6e\x74\x2d\x6d\x6f\x6e\x69\x74\x6f\x72\x69\x6e\x67\x2e' # | lent-monitoring.
msg += b'\x63\x6f\x6d\x00\x0d\x2f\x00\x54\x10\xff\xff\xff\xff\xff\xff\xff' # | com../.T........
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x32\xe8\x54\x10\xff' # | ...........2.T..
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00' # | ................
msg += b'\x0d\x36\xd0\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | .6.T............
msg += b'\xff\xff\xff\xff\xff\x00\x0d\x3a\xb8\x54\x10\xff\xff\xff\xff\xff' # | .......:.T......
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x3e\xa0\x54' # | .............>.T
msg += b'\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\x00\x0d\x42\x88\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ...B.T..........
msg += b'\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x46\x70\x54\x10\xff\xff\xff' # | .........FpT....
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x4a' # | ...............J
msg += b'\x58\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | XT..............
msg += b'\xff\xff\xff\x00\x0d\x4e\x40\x54\x10\xff\xff\xff\xff\xff\xff\xff' # | .....N@T........
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x52\x28\x54\x10\xff' # | ...........R(T..
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00' # | ................
msg += b'\x0d\x56\x10\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | .V.T............
msg += b'\xff\xff\xff\xff\xff\x00\x0d\x59\xf8\x54\x10\xff\xff\xff\xff\xff' # | .......Y.T......
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x5d\xe0\x54' # | .............].T
msg += b'\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\x00\x0d\x61\xc8\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ...a.T..........
msg += b'\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x65\xb0\x54\x10\xff\xff\xff' # | .........e.T....
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x69' # | ...............i
msg += b'\x98\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | .T..............
msg += b'\xff\xff\xff\x00\x0d\x6d\x80\x54\x10\xff\xff\xff\xff\xff\xff\xff' # | .....m.T........
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x71\x68\x54\x10\xff' # | ...........qhT..
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00' # | ................
msg += b'\x0d\x75\x50\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | .uPT............
msg += b'\xff\xff\xff\xff\xff\x00\x0d\x79\x38\x54\x10\xff\xff\xff\xff\xff' # | .......y8T......
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x7d\x20\x54' # | .............} T
msg += b'\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\x00\x0d\x81\x08\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | .....T..........
msg += b'\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x84\xf0\x54\x10\xff\xff\xff' # | ...........T....
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x88' # | ................
msg += b'\xd8\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | .T..............
msg += b'\xff\xff\xff\x00\x0d\x8c\xc0\x54\x10\xff\xff\xff\xff\xff\xff\xff' # | .......T........
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x90\xa8\x54\x10\xff' # | .............T..
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00' # | ................
msg += b'\x0d\x94\x90\x54\x10\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ...T............
msg += b'\xff\xff\xff\xff\xff\x00\x0d\x98\x78\x54\x10\xff\xff\xff\xff\xff' # | ........xT......
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x0d\x9c\x60\x54' # | ..............`T
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff' # | ................
msg += b'\x00\x0d\x00\x20\x49\x00\x00\x00\x01\x00\x0c\x35\x00\x49\x00\x00' # | ... I......5.I..
msg += b'\x00\x62\x00\x0c\x96\xa8\x49\x00\x00\x01\x4f\x00\x0c\x7f\x38\x49' # | .b....I...O...8I
msg += b'\x00\x00\x00\x01\x00\x0c\xfc\x38\x49\x00\x00\x00\x01\x00\x0c\xf8' # | .......8I.......
msg += b'\x50\x49\x00\x00\x01\x2c\x00\x0c\x63\xe0\x49\x00\x00\x00\x00\x00' # | PI...,..c.I.....
msg += b'\x0c\x67\xc8\x49\x00\x00\x00\x00\x00\x0c\x50\x58\x49\x00\x00\x00' # | .g.I......PXI...
msg += b'\x01\x00\x09\x5e\x70\x49\x00\x00\x13\x8d\x00\x09\x5e\xd4\x49\x00' # | ...^pI......^.I.
msg += b'\x00\x13\x8d\x00\x09\x5b\x50\x49\x00\x00\x00\x02\x00\x0d\x04\x08' # | .....[PI........
msg += b'\x49\x00\x00\x00\x00\x00\x07\xa1\x84\x49\x00\x00\x00\x01\x00\x0c' # | I........I......
msg += b'\x50\x59\x49\x00\x00\x00\x2d\x00\x0d\x1f\x60\x49\x00\x00\x00\x00' # | PYI...-...`I....
msg += b'\x00\x0d\x23\x48\x49\xff\xff\xff\xff\x00\x0d\x27\x30\x49\xff\xff' # | ..#HI......'0I..
msg += b'\xff\xff\x00\x0d\x2b\x18\x4c\x00\x00\x00\x00\xff\xff\xff\xff\x00' # | ....+.L.........
msg += b'\x0c\xa2\x60\x49\x00\x00\x00\x00\x00\x0d\xa0\x48\x49\x00\x00\x00' # | ..`I.......HI...
msg += b'\x00\x00\x0d\xa4\x30\x49\x00\x00\x00\xff\x00\x0d\xa8\x18\x49\x00' # | ....0I........I.
msg += b'\x00\x00\xff'
return msg
@pytest.fixture
def inv_data_seq(): # Data indication from the controller
msg = b'\x00\x00\x00\x06\x00\x00\x00\x0a\x54\x08\x4d\x69\x63\x72\x6f\x69\x6e\x76\x00\x00\x00\x14\x54\x04\x54\x53\x55\x4e\x00\x00\x00\x1E\x54\x07\x56\x35\x2e\x30\x2e\x31\x31\x00\x00\x00\x28'
msg += b'\x54\x10T170000000000001\x00\x00\x00\x32\x54\x0a\x54\x53\x4f\x4c\x2d\x4d\x53\x36\x30\x30\x00\x00\x00\x3c\x54\x05\x41\x2c\x42\x2c\x43'
return msg
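# Illustrative only: a minimal sketch of how these byte sequences appear to be laid
# out, inferred from the fixture bytes and the expected parse results rather than
# taken from the project sources. Each sequence starts with a big-endian 32-bit entry
# count, followed by records of <4-byte register address><1-byte type tag><value>.
# The tag meanings below are assumptions that happen to match the test data:
# 'T' length-prefixed string, 'S' 16-bit int, 'I' 32-bit int, 'F' 32-bit IEEE-754
# float, 'L' 64-bit int.
import struct

def decode_fixture_sketch(msg: bytes):
    count, = struct.unpack_from('>L', msg, 0)
    pos = 4
    for _ in range(count):
        addr, tag = struct.unpack_from('>LB', msg, pos)
        pos += 5
        if tag == 0x54:                      # 'T': length-prefixed string
            strlen = msg[pos]
            value = msg[pos + 1:pos + 1 + strlen].decode(errors='replace')
            pos += 1 + strlen
        elif tag == 0x53:                    # 'S': 16-bit integer
            value, = struct.unpack_from('>H', msg, pos)
            pos += 2
        elif tag == 0x49:                    # 'I': 32-bit integer
            value, = struct.unpack_from('>L', msg, pos)
            pos += 4
        elif tag == 0x46:                    # 'F': 32-bit float
            value, = struct.unpack_from('>f', msg, pos)
            pos += 4
        elif tag == 0x4c:                    # 'L': 64-bit integer
            value, = struct.unpack_from('>Q', msg, pos)
            pos += 8
        else:                                # unknown tag: stop decoding
            break
        yield addr, value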
@pytest.fixture
def InvalidDataSeq(): # Data indication from the controller
def invalid_data_seq(): # Data indication from the controller
msg = b'\x00\x00\x00\x06\x00\x00\x00\x0a\x54\x08\x4d\x69\x63\x72\x6f\x69\x6e\x76\x00\x00\x00\x14\x64\x04\x54\x53\x55\x4e\x00\x00\x00\x1E\x54\x07\x56\x35\x2e\x30\x2e\x31\x31\x00\x00\x00\x28'
msg += b'\x54\x10T170000000000001\x00\x00\x00\x32\x54\x0a\x54\x53\x4f\x4c\x2d\x4d\x53\x36\x30\x30\x00\x00\x00\x3c\x54\x05\x41\x2c\x42\x2c\x43'
return msg
@pytest.fixture
def InvDataSeq2(): # Data indication from the controller
def inv_data_seq2(): # Data indication from the controller
msg = b'\x00\x00\x00\xa3\x00\x00\x00\x64\x53\x00\x01\x00\x00\x00\xc8\x53\x00\x02\x00\x00\x01\x2c\x53\x00\x00\x00\x00\x01\x90\x49\x00\x00\x00\x00\x00\x00\x01\x91\x53\x00\x00'
msg += b'\x00\x00\x01\x92\x53\x00\x00\x00\x00\x01\x93\x53\x00\x00\x00\x00\x01\x94\x53\x00\x00\x00\x00\x01\x95\x53\x00\x00\x00\x00\x01\x96\x53\x00\x00\x00\x00\x01\x97\x53\x00'
msg += b'\x00\x00\x00\x01\x98\x53\x00\x00\x00\x00\x01\x99\x53\x00\x00\x00\x00\x01\x9a\x53\x00\x00\x00\x00\x01\x9b\x53\x00\x00\x00\x00\x01\x9c\x53\x00\x00\x00\x00\x01\x9d\x53'
@@ -141,7 +221,154 @@ def InvDataSeq2(): # Data indication from the controller
return msg
@pytest.fixture
def InvDataNew(): # Data indication from DSP V5.0.17
def inv_data_seq3(): # Inverter indication from MS-3000
msg = b'\x00\x00\x01\x2c\x00\x00\x00\x64\x53\x00\x00' # | ..^.....,...dS..
msg += b'\x00\x00\x00\xc8\x53\x44\x00\x00\x00\x01\x2c\x53\x00\x00\x00\x00' # | ....SD....,S....
msg += b'\x01\x90\x49\x00\x00\x00\x00\x00\x00\x01\x91\x53\x00\x00\x00\x00' # | ..I........S....
msg += b'\x01\x92\x53\x00\x00\x00\x00\x01\x93\x53\x00\x00\x00\x00\x01\x94' # | ..S......S......
msg += b'\x53\x00\x00\x00\x00\x01\x95\x53\x00\x00\x00\x00\x01\x96\x53\x00' # | S......S......S.
msg += b'\x00\x00\x00\x01\x97\x53\x00\x00\x00\x00\x01\x98\x53\x00\x00\x00' # | .....S......S...
msg += b'\x00\x01\x99\x53\x00\x00\x00\x00\x01\x9a\x53\x00\x00\x00\x00\x01' # | ...S......S.....
msg += b'\x9b\x53\x00\x00\x00\x00\x01\x9c\x53\x00\x00\x00\x00\x01\x9d\x53' # | .S......S......S
msg += b'\x00\x00\x00\x00\x01\x9e\x53\x00\x00\x00\x00\x01\x9f\x53\x00\x00' # | ......S......S..
msg += b'\x00\x00\x01\xa0\x53\x00\x00\x00\x00\x01\xf4\x49\x00\x00\x00\x00' # | ....S......I....
msg += b'\x00\x00\x01\xf5\x53\x00\x00\x00\x00\x01\xf6\x53\x00\x00\x00\x00' # | ....S......S....
msg += b'\x01\xf7\x53\x00\x00\x00\x00\x01\xf8\x53\x00\x00\x00\x00\x01\xf9' # | ..S......S......
msg += b'\x53\x00\x00\x00\x00\x01\xfa\x53\x00\x00\x00\x00\x01\xfb\x53\x00' # | S......S......S.
msg += b'\x00\x00\x00\x01\xfc\x53\x00\x00\x00\x00\x01\xfd\x53\x00\x00\x00' # | .....S......S...
msg += b'\x00\x01\xfe\x53\x00\x00\x00\x00\x01\xff\x53\x00\x00\x00\x00\x02' # | ...S......S.....
msg += b'\x00\x53\x00\x00\x00\x00\x02\x01\x53\x00\x00\x00\x00\x02\x02\x53' # | .S......S......S
msg += b'\x00\x00\x00\x00\x02\x03\x53\x00\x00\x00\x00\x02\x04\x53\x00\x00' # | ......S......S..
msg += b'\x00\x00\x02\x58\x49\x00\x00\x00\x00\x00\x00\x02\x59\x53\x00\x00' # | ...XI.......YS..
msg += b'\x00\x00\x02\x5a\x53\x00\x00\x00\x00\x02\x5b\x53\x00\x00\x00\x00' # | ...ZS.....[S....
msg += b'\x02\x5c\x53\x00\x00\x00\x00\x02\x5d\x53\x00\x00\x00\x00\x02\x5e' # | .\S.....]S.....^
msg += b'\x53\x00\x00\x00\x00\x02\x5f\x53\x00\x00\x00\x00\x02\x60\x53\x00' # | S....._S.....`S.
msg += b'\x00\x00\x00\x02\x61\x53\x00\x00\x00\x00\x02\x62\x53\x00\x00\x00' # | ....aS.....bS...
msg += b'\x00\x02\x63\x53\x00\x00\x00\x00\x02\x64\x53\x00\x00\x00\x00\x02' # | ..cS.....dS.....
msg += b'\x65\x53\x00\x00\x00\x00\x02\x66\x53\x00\x00\x00\x00\x02\x67\x53' # | eS.....fS.....gS
msg += b'\x00\x00\x00\x00\x02\x68\x53\x00\x00\x00\x00\x02\xbc\x49\x00\x00' # | .....hS......I..
msg += b'\x00\x00\x00\x00\x02\xbd\x53\x00\x00\x00\x00\x02\xbe\x53\x00\x00' # | ......S......S..
msg += b'\x00\x00\x02\xbf\x53\x00\x00\x00\x00\x02\xc0\x53\x00\x00\x00\x00' # | ....S......S....
msg += b'\x02\xc1\x53\x00\x00\x00\x00\x02\xc2\x53\x00\x00\x00\x00\x02\xc3' # | ..S......S......
msg += b'\x53\x00\x00\x00\x00\x02\xc4\x53\x00\x00\x00\x00\x02\xc5\x53\x00' # | S......S......S.
msg += b'\x00\x00\x00\x02\xc6\x53\x00\x00\x00\x00\x02\xc7\x53\x00\x00\x00' # | .....S......S...
msg += b'\x00\x02\xc8\x53\x00\x00\x00\x00\x02\xc9\x53\x00\x00\x00\x00\x02' # | ...S......S.....
msg += b'\xca\x53\x00\x00\x00\x00\x02\xcb\x53\x00\x00\x00\x00\x02\xcc\x53' # | .S......S......S
msg += b'\x00\x00\x00\x00\x03\x20\x53\x00\x01\x00\x00\x03\x84\x53\x11\x68' # | ..... S......S.h
msg += b'\x00\x00\x03\xe8\x46\x44\x23\xd1\xec\x00\x00\x04\x4c\x46\x43\xa3' # | ....FD#.....LFC.
msg += b'\xb3\x33\x00\x00\x04\xb0\x46\x00\x00\x00\x00\x00\x00\x05\x14\x46' # | .3....F........F
msg += b'\x43\x6e\x80\x00\x00\x00\x05\x78\x46\x3d\x4c\xcc\xcd\x00\x00\x05' # | Cn.....xF=L.....
msg += b'\xdc\x46\x00\x00\x00\x00\x00\x00\x06\x40\x46\x42\x48\x00\x00\x00' # | .F.......@FBH...
msg += b'\x00\x06\xa4\x53\x00\x03\x00\x00\x07\x08\x53\x00\x0c\x00\x00\x07' # | ...S......S.....
msg += b'\x6c\x53\x00\x50\x00\x00\x07\xd0\x46\x43\xa3\xb3\x33\x00\x00\x08' # | lS.P....FC..3...
msg += b'\x34\x53\x0b\xb8\x00\x00\x08\x98\x46\x00\x00\x00\x00\x00\x00\x08' # | 4S......F.......
msg += b'\xfc\x46\x00\x00\x00\x00\x00\x00\x09\x60\x46\x41\xee\xe1\x48\x00' # | .F.......`FA..H.
msg += b'\x00\x09\xc4\x53\x00\x00\x00\x00\x0a\x28\x46\x41\xf2\x00\x00\x00' # | ...S.....(FA....
msg += b'\x00\x0a\x8c\x46\x3f\xac\x28\xf6\x00\x00\x0a\xf0\x53\x00\x0c\x00' # | ...F?.(.....S...
msg += b'\x00\x0b\x54\x53\x00\x00\x00\x00\x0b\xb8\x53\x00\x00\x00\x00\x0c' # | ..TS......S.....
msg += b'\x1c\x53\x00\x00\x00\x00\x0c\x80\x53\x00\x00\x00\x00\x0c\xe4\x53' # | .S......S......S
msg += b'\x00\x00\x00\x00\x0d\x48\x53\x00\x00\x00\x00\x0d\xac\x53\x00\x00' # | .....HS......S..
msg += b'\x00\x00\x0e\x10\x53\x00\x00\x00\x00\x0e\x74\x53\x00\x00\x00\x00' # | ....S.....tS....
msg += b'\x0e\xd8\x53\x00\x00\x00\x00\x0f\x3c\x53\x00\x00\x00\x00\x0f\xa0' # | ..S.....<S......
msg += b'\x53\x00\x00\x00\x00\x10\x04\x53\x00\x00\x00\x00\x10\x68\x53\x00' # | S......S.....hS.
msg += b'\x00\x00\x00\x10\xcc\x53\x00\x00\x00\x00\x11\x30\x53\x00\x00\x00' # | .....S.....0S...
msg += b'\x00\x11\x94\x53\x00\x00\x00\x00\x11\xf8\x53\x00\x00\x00\x00\x12' # | ...S......S.....
msg += b'\x5c\x53\x00\x00\x00\x00\x12\xc0\x53\x00\x00\x00\x00\x13\x24\x46' # | \S......S.....$F
msg += b'\x42\x9d\x33\x33\x00\x00\x13\x88\x46\x00\x00\x00\x00\x00\x00\x13' # | B.33....F.......
msg += b'\xec\x46\x00\x00\x00\x00\x00\x00\x14\x50\x46\x42\xdc\x00\x00\x00' # | .F.......PFB....
msg += b'\x00\x14\xb4\x53\x00\x00\x00\x00\x15\x18\x53\x00\x00\x00\x00\x15' # | ...S......S.....
msg += b'\x7c\x53\x00\x00\x00\x00\x15\x7d\x53\x00\x00\x00\x00\x15\x7e\x53' # | |S.....}S.....~S
msg += b'\x00\x00\x00\x00\x15\x7f\x53\x00\x00\x00\x00\x15\x80\x53\x00\x00' # | ......S......S..
msg += b'\x00\x00\x15\x81\x53\x00\x00\x00\x00\x15\x82\x53\x00\x00\x00\x00' # | ....S......S....
msg += b'\x15\x83\x53\x00\x00\x00\x00\x15\x84\x53\x00\x00\x00\x00\x15\x85' # | ..S......S......
msg += b'\x53\x00\x00\x00\x00\x15\x86\x53\x00\x00\x00\x00\x15\x87\x53\x00' # | S......S......S.
msg += b'\x00\x00\x00\x15\x88\x53\x00\x00\x00\x00\x15\x89\x53\x00\x00\x00' # | .....S......S...
msg += b'\x00\x15\x8a\x53\x00\x00\x00\x00\x15\x8b\x53\x00\x00\x00\x00\x15' # | ...S......S.....
msg += b'\x8c\x53\x00\x00\x00\x00\x15\xe0\x46\x42\x68\x66\x66\x00\x00\x16' # | .S......FBhff...
msg += b'\x44\x46\x00\x00\x00\x00\x00\x00\x16\xa8\x46\x00\x00\x00\x00\x00' # | DF........F.....
msg += b'\x00\x17\x0c\x46\x42\xdc\x00\x00\x00\x00\x17\x70\x53\x00\x00\x00' # | ...FB......pS...
msg += b'\x00\x17\xd4\x53\x00\x00\x00\x00\x18\x38\x53\x00\x00\x00\x00\x18' # | ...S.....8S.....
msg += b'\x39\x53\x00\x00\x00\x00\x18\x3a\x53\x00\x00\x00\x00\x18\x3b\x53' # | 9S.....:S.....;S
msg += b'\x00\x00\x00\x00\x18\x3c\x53\x00\x00\x00\x00\x18\x3d\x53\x00\x00' # | .....<S.....=S..
msg += b'\x00\x00\x18\x3e\x53\x00\x00\x00\x00\x18\x3f\x53\x00\x00\x00\x00' # | ...>S.....?S....
msg += b'\x18\x40\x53\x00\x00\x00\x00\x18\x41\x53\x00\x00\x00\x00\x18\x42' # | .@S.....AS.....B
msg += b'\x53\x00\x00\x00\x00\x18\x43\x53\x00\x00\x00\x00\x18\x44\x53\x00' # | S.....CS.....DS.
msg += b'\x00\x00\x00\x18\x45\x53\x00\x00\x00\x00\x18\x46\x53\x00\x00\x00' # | ....ES.....FS...
msg += b'\x00\x18\x47\x53\x00\x00\x00\x00\x18\x48\x53\x00\x00\x00\x00\x18' # | ..GS.....HS.....
msg += b'\x9c\x46\x42\x6b\x33\x33\x00\x00\x19\x00\x46\x00\x00\x00\x00\x00' # | .FBk33....F.....
msg += b'\x00\x19\x64\x46\x00\x00\x00\x00\x00\x00\x19\xc8\x46\x42\xdc\x00' # | ..dF........FB..
msg += b'\x00\x00\x00\x1a\x2c\x53\x00\x00\x00\x00\x1a\x90\x53\x00\x00\x00' # | ....,S......S...
msg += b'\x00\x1a\xf4\x53\x00\x00\x00\x00\x1a\xf5\x53\x00\x00\x00\x00\x1a' # | ...S......S.....
msg += b'\xf6\x53\x00\x00\x00\x00\x1a\xf7\x53\x00\x00\x00\x00\x1a\xf8\x53' # | .S......S......S
msg += b'\x00\x00\x00\x00\x1a\xf9\x53\x00\x00\x00\x00\x1a\xfa\x53\x00\x00' # | ......S......S..
msg += b'\x00\x00\x1a\xfb\x53\x00\x00\x00\x00\x1a\xfc\x53\x00\x00\x00\x00' # | ....S......S....
msg += b'\x1a\xfd\x53\x00\x00\x00\x00\x1a\xfe\x53\x00\x00\x00\x00\x1a\xff' # | ..S......S......
msg += b'\x53\x00\x00\x00\x00\x1b\x00\x53\x00\x00\x00\x00\x1b\x01\x53\x00' # | S......S......S.
msg += b'\x00\x00\x00\x1b\x02\x53\x00\x00\x00\x00\x1b\x03\x53\x00\x00\x00' # | .....S......S...
msg += b'\x00\x1b\x04\x53\x00\x00\x00\x00\x1b\x58\x53\x00\x00\x00\x00\x1b' # | ...S.....XS.....
msg += b'\xbc\x53\x11\x3d\x00\x00\x1c\x20\x46\x3c\x23\xd7\x0a\x00\x00\x1c' # | .S.=... F<#.....
msg += b'\x84\x46\x00\x00\x00\x00\x00\x00\x1c\xe8\x46\x42\x04\x00\x00\x00' # | .F........FB....
msg += b'\x00\x1d\x4c\x46\x00\x00\x00\x00\x00\x00\x1d\xb0\x46\x00\x00\x00' # | ..LF........F...
msg += b'\x00\x00\x00\x1e\x14\x53\x00\x02\x00\x00\x1e\x78\x46\x41\x8b\x33' # | .....S.....xFA.3
msg += b'\x33\x00\x00\x1e\xdc\x46\x3c\xa3\xd7\x0a\x00\x00\x1f\x40\x46\x3e' # | 3....F<......@F>
msg += b'\x99\x99\x9a\x00\x00\x1f\xa4\x46\x40\x99\x99\x9a\x00\x00\x20\x08' # | .......F@..... .
msg += b'\x53\x00\x00\x00\x00\x20\x6c\x53\x00\x00\x00\x00\x20\xd0\x53\x05' # | S.... lS.... .S.
msg += b'\x00\x00\x00\x20\xd1\x53\x00\x00\x00\x00\x20\xd2\x53\x00\x00\x00' # | ... .S.... .S...
msg += b'\x00\x20\xd3\x53\x00\x00\x00\x00\x20\xd4\x53\x00\x00\x00\x00\x20' # | . .S.... .S....
msg += b'\xd5\x53\x00\x00\x00\x00\x20\xd6\x53\x00\x00\x00\x00\x20\xd7\x53' # | .S.... .S.... .S
msg += b'\x00\x00\x00\x00\x20\xd8\x53\x00\x00\x00\x00\x20\xd9\x53\x00\x01' # | .... .S.... .S..
msg += b'\x00\x00\x20\xda\x53\x00\x00\x00\x00\x20\xdb\x53\x00\x01\x00\x00' # | .. .S.... .S....
msg += b'\x20\xdc\x53\x00\x00\x00\x00\x20\xdd\x53\x00\x00\x00\x00\x20\xde' # | .S.... .S.... .
msg += b'\x53\x00\x00\x00\x00\x20\xdf\x53\x00\x00\x00\x00\x20\xe0\x53\x00' # | S.... .S.... .S.
msg += b'\x00\x00\x00\x21\x34\x46\x00\x00\x00\x00\x00\x00\x21\x98\x46\x00' # | ...!4F......!.F.
msg += b'\x00\x00\x00\x00\x00\x21\xfc\x46\x00\x00\x00\x00\x00\x00\x22\x60' # | .....!.F......"`
msg += b'\x46\x00\x00\x00\x00\x00\x00\x22\xc4\x53\x00\x00\x00\x00\x23\x28' # | F......".S....#(
msg += b'\x53\x00\x00\x00\x00\x23\x8c\x53\x00\x00\x00\x00\x23\x8d\x53\x00' # | S....#.S....#.S.
msg += b'\x00\x00\x00\x23\x8e\x53\x00\x00\x00\x00\x23\x8f\x53\x00\x00\x00' # | ...#.S....#.S...
msg += b'\x00\x23\x90\x53\x00\x00\x00\x00\x23\x91\x53\x00\x00\x00\x00\x23' # | .#.S....#.S....#
msg += b'\x92\x53\x00\x00\x00\x00\x23\x93\x53\x00\x00\x00\x00\x23\x94\x53' # | .S....#.S....#.S
msg += b'\x00\x00\x00\x00\x23\x95\x53\x00\x00\x00\x00\x23\x96\x53\x00\x00' # | ....#.S....#.S..
msg += b'\x00\x00\x23\x97\x53\x00\x00\x00\x00\x23\x98\x53\x00\x00\x00\x00' # | ..#.S....#.S....
msg += b'\x23\x99\x53\x00\x00\x00\x00\x23\x9a\x53\x00\x00\x00\x00\x23\x9b' # | #.S....#.S....#.
msg += b'\x53\x00\x00\x00\x00\x23\x9c\x53\x00\x00\x00\x00\x23\xf0\x46\x00' # | S....#.S....#.F.
msg += b'\x00\x00\x00\x00\x00\x24\x54\x46\x00\x00\x00\x00\x00\x00\x24\xb8' # | .....$TF......$.
msg += b'\x46\x00\x00\x00\x00\x00\x00\x25\x1c\x46\x00\x00\x00\x00\x00\x00' # | F......%.F......
msg += b'\x25\x80\x53\x00\x00\x00\x00\x25\xe4\x53\x00\x00\x00\x00\x26\x48' # | %.S....%.S....&H
msg += b'\x53\x00\x00\x00\x00\x26\x49\x53\x00\x00\x00\x00\x26\x4a\x53\x00' # | S....&IS....&JS.
msg += b'\x00\x00\x00\x26\x4b\x53\x00\x00\x00\x00\x26\x4c\x53\x00\x00\x00' # | ...&KS....&LS...
msg += b'\x00\x26\x4d\x53\x00\x00\x00\x00\x26\x4e\x53\x00\x00\x00\x00\x26' # | .&MS....&NS....&
msg += b'\x4f\x53\x00\x00\x00\x00\x26\x50\x53\x00\x00\x00\x00\x26\x51\x53' # | OS....&PS....&QS
msg += b'\x00\x00\x00\x00\x26\x52\x53\x00\x00\x00\x00\x26\x53\x53\x00\x00' # | ....&RS....&SS..
msg += b'\x00\x00\x26\x54\x53\x00\x00\x00\x00\x26\x55\x53\x00\x00\x00\x00' # | ..&TS....&US....
msg += b'\x26\x56\x53\x00\x00\x00\x00\x26\x57\x53\x00\x00\x00\x00\x26\x58' # | &VS....&WS....&X
msg += b'\x53\x00\x00\x00\x00\x26\xac\x53\x00\x00\x00\x00\x27\x10\x53\x11' # | S....&.S....'.S.
msg += b'\x3d\x00\x00\x27\x74\x46\x00\x00\x00\x00\x00\x00\x27\xd8\x46\x00' # | =..'tF......'.F.
msg += b'\x00\x00\x00\x00\x00\x28\x3c\x46\x42\x03\xf5\xc3\x00\x00\x28\xa0' # | .....(<FB.....(.
msg += b'\x46\x00\x00\x00\x00\x00\x00\x29\x04\x46\x00\x00\x00\x00\x00\x00' # | F......).F......
msg += b'\x29\x68\x53\x00\x02\x00\x00\x29\xcc\x53\x00\x03\x00\x00\x2a\x30' # | )hS....).S....*0
msg += b'\x46\x42\x20\x00\x00\x00\x00\x2a\x94\x46\x42\x20\x00\x00\x00\x00' # | FB ....*.FB ....
msg += b'\x2a\xf8\x46\x44\x20\x00\x00\x00\x00\x2b\x5c\x46\x43\x7b\x00\x00' # | *.FD ....+\FC{..
msg += b'\x00\x00\x2b\xc0\x46\x43\x50\x00\x00\x00\x00\x2c\x24\x46\x42\x48' # | ..+.FCP....,$FBH
msg += b'\x5c\x29\x00\x00\x2c\x88\x46\x42\x47\xa3\xd7\x00\x00\x2c\xec\x53' # | \)..,.FBG....,.S
msg += b'\x00\x00\x00\x00\x2d\x50\x46\x43\x42\x00\x00\x00\x00\x2d\xb4\x46' # | ....-PFCB....-.F
msg += b'\x42\xbc\x00\x00\x00\x00\x2e\x18\x46\x3f\xe6\x66\x66\x00\x00\x2e' # | B.......F?.ff...
msg += b'\x7c\x46\x3f\xe6\x66\x66\x00\x00\x2e\xe0\x46\x43\x7e\x00\x00\x00' # | |F?.ff....FC~...
msg += b'\x00\x2f\x44\x46\x43\x83\xf3\x33\x00\x00\x2f\xa8\x46\x3f\xe6\x66' # | ./DFC..3../.F?.f
msg += b'\x66\x00\x00\x30\x0c\x46\x3f\xe6\x66\x66\x00\x00\x30\x70\x46\x43' # | f..0.F?.ff..0pFC
msg += b'\x7e\x00\x00\x00\x00\x30\xd4\x46\x42\x3f\xeb\x85\x00\x00\x31\x38' # | ~....0.FB?....18
msg += b'\x46\x42\x3d\xeb\x85\x00\x00\x31\x9c\x46\x3e\x4c\xcc\xcd\x00\x00' # | FB=....1.F>L....
msg += b'\x32\x00\x46\x3e\x4c\xcc\xcd\x00\x00\x32\x64\x46\x42\x4c\x14\x7b' # | 2.F>L....2dFBL.{
msg += b'\x00\x00\x32\xc8\x46\x42\x4d\xeb\x85\x00\x00\x33\x2c\x46\x3e\x4c' # | ..2.FBM....3,F>L
msg += b'\xcc\xcd\x00\x00\x33\x90\x46\x3e\x4c\xcc\xcd\x00\x00\x33\xf4\x53' # | ....3.F>L....3.S
msg += b'\x00\x00\x00\x00\x34\x58\x53\x00\x00\x00\x00\x34\xbc\x53\x04\x00' # | ....4XS....4.S..
msg += b'\x00\x00\x35\x20\x53\x00\x01\x00\x00\x35\x84\x53\x13\x9c\x00\x00' # | ..5 S....5.S....
msg += b'\x35\xe8\x53\x0f\xa0\x00\x00\x36\x4c\x53\x00\x00\x00\x00\x36\xb0' # | 5.S....6LS....6.
msg += b'\x53\x00\x66' # | S.f'
return msg
@pytest.fixture
def inv_data_new(): # Data indication from DSP V5.0.17
msg = b'\x00\x00\x00\xa3\x00\x00\x00\x00\x53\x00\x00'
msg += b'\x00\x00\x00\x80\x53\x00\x00\x00\x00\x01\x04\x53\x00\x00\x00\x00'
msg += b'\x01\x90\x41\x00\x00\x01\x91\x53\x00\x00\x00\x00\x01\x90\x53\x00'
@@ -217,7 +444,7 @@ def InvDataNew(): # Data indication from DSP V5.0.17
return msg
@pytest.fixture
def InvDataSeq2_Zero(): # Data indication from the controller
def inv_data_seq2_zero(): # Data indication from the controller
msg = b'\x00\x00\x00\xa3\x00\x00\x00\x64\x53\x00\x01\x00\x00\x00\xc8\x53\x00\x02\x00\x00\x01\x2c\x53\x00\x00\x00\x00\x01\x90\x49\x00\x00\x00\x00\x00\x00\x01\x91\x53\x00\x00'
msg += b'\x00\x00\x01\x92\x53\x00\x00\x00\x00\x01\x93\x53\x00\x00\x00\x00\x01\x94\x53\x00\x00\x00\x00\x01\x95\x53\x00\x00\x00\x00\x01\x96\x53\x00\x00\x00\x00\x01\x97\x53\x00'
msg += b'\x00\x00\x00\x01\x98\x53\x00\x00\x00\x00\x01\x99\x53\x00\x00\x00\x00\x01\x9a\x53\x00\x00\x00\x00\x01\x9b\x53\x00\x00\x00\x00\x01\x9c\x53\x00\x00\x00\x00\x01\x9d\x53'
@@ -252,47 +479,74 @@ def InvDataSeq2_Zero(): # Data indication from the controller
return msg
def test_parse_control(ContrDataSeq):
def test_parse_control(contr_data_seq):
i = InfosG3()
for key, result in i.parse (ContrDataSeq):
pass
for key, result in i.parse (contr_data_seq, sensor=0x0e100000):
pass # side effect in calling i.parse()
assert json.dumps(i.db) == json.dumps(
{"collector": {"Collector_Fw_Version": "RSW_400_V1.00.06", "Chip_Type": "Raymon", "Chip_Model": "RSW-1-10001", "Trace_URL": "t.raymoniot.com", "Logger_URL": "logger.talent-monitoring.com"}, "controller": {"Collect_Interval": 1, "Signal_Strength": 100, "Power_On_Time": 29, "Communication_Type": 1, "Connect_Count": 1, "Data_Up_Interval": 300}})
def test_parse_control2(Contr2DataSeq):
def test_parse_control2(contr2_data_seq):
i = InfosG3()
for key, result in i.parse (Contr2DataSeq):
pass
for key, result in i.parse (contr2_data_seq, sensor=0x0e100000):
pass # side effect in calling i.parse()
assert json.dumps(i.db) == json.dumps(
{"collector": {"Collector_Fw_Version": "RSW_400_V1.00.20", "Chip_Type": "Raymon", "Chip_Model": "RSW-1-10001", "Trace_URL": "t.raymoniot.com", "Logger_URL": "logger.talent-monitoring.com"}, "controller": {"Collect_Interval": 1, "Signal_Strength": 16, "Power_On_Time": 334, "Communication_Type": 1, "Connect_Count": 1, "Data_Up_Interval": 300}})
def test_parse_inverter(InvDataSeq):
def test_parse_control3(contr3_data_seq):
i = InfosG3()
for key, result in i.parse (InvDataSeq):
pass
for key, result in i.parse (contr3_data_seq, sensor=0x0e100000):
pass # side effect in calling i.parse()
assert json.dumps(i.db) == json.dumps(
{"collector": {"Collector_Fw_Version": "RSW_400_V2.01.13", "Chip_Type": "Raymon", "Chip_Model": "RSW-1-10001", "Trace_URL": "t.raymoniot.com", "Logger_URL": "logger.talent-monitoring.com"}, "controller": {"Collect_Interval": 1, "Signal_Strength": 98, "Power_On_Time": 335, "Communication_Type": 1, "Connect_Count": 1, "Data_Up_Interval": 300}})
def test_parse_inverter(inv_data_seq):
i = InfosG3()
for key, result in i.parse (inv_data_seq, sensor=0x01900001):
pass # side effect in calling i.parse()
assert json.dumps(i.db) == json.dumps(
{"inverter": {"Product_Name": "Microinv", "Manufacturer": "TSUN", "Version": "V5.0.11", "Serial_Number": "T170000000000001", "Equipment_Model": "TSOL-MS600"}})
def test_parse_cont_and_invert(ContrDataSeq, InvDataSeq):
def test_parse_cont_and_invert(contr_data_seq, inv_data_seq):
i = InfosG3()
for key, result in i.parse (ContrDataSeq):
pass
for key, result in i.parse (contr_data_seq, sensor=0x0e100000):
pass # side effect in calling i.parse()
for key, result in i.parse (InvDataSeq):
pass
for key, result in i.parse (inv_data_seq, sensor=0x01900001):
pass # side effect in calling i.parse()
assert json.dumps(i.db) == json.dumps(
{
"collector": {"Collector_Fw_Version": "RSW_400_V1.00.06", "Chip_Type": "Raymon", "Chip_Model": "RSW-1-10001", "Trace_URL": "t.raymoniot.com", "Logger_URL": "logger.talent-monitoring.com"}, "controller": {"Collect_Interval": 1, "Signal_Strength": 100, "Power_On_Time": 29, "Communication_Type": 1, "Connect_Count": 1, "Data_Up_Interval": 300},
"inverter": {"Product_Name": "Microinv", "Manufacturer": "TSUN", "Version": "V5.0.11", "Serial_Number": "T170000000000001", "Equipment_Model": "TSOL-MS600"}})
def test_parse_cont_and_invert2(contr3_data_seq, inv_data_seq3):
i = InfosG3()
for key, result in i.parse (contr3_data_seq, sensor=0x0e100000):
pass # side effect in calling i.parse()
def test_build_ha_conf1(ContrDataSeq):
for key, result in i.parse (inv_data_seq3, sensor=0x01900000):
pass # side effect in calling i.parse()
assert json.dumps(i.db) == json.dumps(
{
"collector": {"Collector_Fw_Version": "RSW_400_V2.01.13", "Chip_Type": "Raymon", "Chip_Model": "RSW-1-10001", "Trace_URL": "t.raymoniot.com", "Logger_URL": "logger.talent-monitoring.com"}, "controller": {"Collect_Interval": 1, "Signal_Strength": 98, "Power_On_Time": 335, "Communication_Type": 1, "Connect_Count": 1, "Data_Up_Interval": 300},
"env": {"Inverter_Status": 0},
"events": {"Inverter_Alarm": 0, "Inverter_Fault": 0, "Inverter_Bitfield_1": 0, "Inverter_bitfield_2": 0},
"input": {"iVal_1": 1, "Val_0": 655.28, "Val_1": 327.4, "Val_2": 0.0, "Val_3": 0.0, "iVal_2": 3, "iVal_3": 12, "iVal_4": 80, "Val_4": 327.4, "iVal_5": 0, "Val_10": 30.25, "Val_11": 1.35, "iVal_6": 12, "pv1": {"Voltage": 78.6, "Current": 0.0, "Power": 0.0}, "Val_5": 110.0, "pv2": {"Voltage": 58.1, "Current": 0.0, "Power": 0.0}, "Val_6": 110.0, "pv3": {"Voltage": 58.8, "Current": 0.0, "Power": 0.0}, "Val_7": 110.0, "Val_14": 0.01, "Val_15": 0.0, "Val_16": 33.0, "Val_17": 0.0, "Val_18": 0.0, "iVal_8": 2, "pv4": {"Voltage": 17.4, "Current": 0.02, "Power": 0.3}, "Val_8": 4.8, "iVal_10": 1, "pv5": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}, "pv6": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}, "Val_24": 0.0, "Val_25": 0.0, "Val_26": 32.99, "Val_27": 0.0, "Val_28": 0.0, "iVal_11": 2, "iVal_12": 3},
"grid": {"Voltage": 238.5, "Current": 0.05, "Frequency": 50.0, "Output_Power": 0.0},
"inverter": {"Max_Designed_Power": 3000},
"total": {"Total_Generation": 29.86}
})
def test_build_ha_conf1(contr_data_seq):
i = InfosG3()
i.static_init() # initialize counter
i.set_db_def_value(Register.SENSOR_LIST, "01900001")
tests = 0
for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123'):
@@ -325,7 +579,12 @@ def test_build_ha_conf1(ContrDataSeq):
assert tests==4
def test_build_ha_conf2(contr_data_seq):
i = InfosG3()
i.static_init() # initialize counter
i.set_db_def_value(Register.SENSOR_LIST, "01900001")
tests = 0
for d_json, comp, node_id, id in i.ha_proxy_confs(ha_prfx="tsun/", node_id = 'proxy/', snr = '456'):
if id == 'out_power_123':
@@ -344,28 +603,29 @@ def test_build_ha_conf1(ContrDataSeq):
assert d_json == json.dumps({"name": "Active Inverter Connections", "stat_t": "tsun/proxy/proxy", "dev_cla": None, "stat_cla": None, "uniq_id": "inv_count_456", "val_tpl": "{{value_json['Inverter_Cnt'] | int}}", "ic": "mdi:counter", "dev": {"name": "Proxy", "sa": "Proxy", "mdl": "proxy", "mf": "Stefan Allius", "sw": "unknown", "ids": ["proxy"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
assert tests==5
assert tests==1
def test_build_ha_conf2(ContrDataSeq, InvDataSeq, InvDataSeq2):
def test_build_ha_conf3(contr_data_seq, inv_data_seq, inv_data_seq2):
i = InfosG3()
for key, result in i.parse (ContrDataSeq):
pass
for key, result in i.parse (InvDataSeq):
pass
for key, result in i.parse (InvDataSeq2):
pass
i.set_db_def_value(Register.SENSOR_LIST, "01900001")
for key, result in i.parse (contr_data_seq, sensor=0x0e100000):
pass # side effect in calling i.parse()
for key, result in i.parse (inv_data_seq, sensor=0x01900001):
pass # side effect in calling i.parse()
for key, result in i.parse (inv_data_seq2, sensor=0x01900001):
pass # side effect in calling i.parse()
tests = 0
for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123', sug_area = 'roof'):
if id == 'out_power_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/grid", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "out_power_123", "val_tpl": "{{value_json['Output_Power'] | float}}", "unit_of_meas": "W", "dev": {"name": "Micro Inverter - roof", "sa": "Micro Inverter - roof", "via_device": "controller_123", "mdl": "TSOL-MS600", "mf": "TSUN", "sw": "V5.0.11", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/grid", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "out_power_123", "val_tpl": "{{value_json['Output_Power'] | float}}", "unit_of_meas": "W", "dev": {"name": "Micro Inverter - roof", "sa": "Micro Inverter - roof", "via_device": "controller_123", "mdl": "TSOL-MS600", "mf": "TSUN", "sw": "V5.0.11", "sn": "T170000000000001", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
if id == 'daily_gen_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "ic": "mdi:solar-power-variant", "dev": {"name": "Micro Inverter - roof", "sa": "Micro Inverter - roof", "via_device": "controller_123", "mdl": "TSOL-MS600", "mf": "TSUN", "sw": "V5.0.11", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "ic": "mdi:solar-power-variant", "dev": {"name": "Micro Inverter - roof", "sa": "Micro Inverter - roof", "via_device": "controller_123", "mdl": "TSOL-MS600", "mf": "TSUN", "sw": "V5.0.11", "sn": "T170000000000001", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'power_pv1_123':
@@ -384,50 +644,100 @@ def test_build_ha_conf2(ContrDataSeq, InvDataSeq, InvDataSeq2):
tests +=1
assert tests==5
def test_must_incr_total(InvDataSeq2, InvDataSeq2_Zero):
def test_build_ha_conf4(contr_data_seq, inv_data_seq):
i = InfosG3()
i.set_db_def_value(Register.SENSOR_LIST, "01900001")
for key, result in i.parse (contr_data_seq, sensor=0x0e100000):
pass # side effect in calling i.parse()
for key, result in i.parse (inv_data_seq, sensor=0x01900001):
pass # side effect in calling i.parse()
i.set_db_def_value(Register.MAC_ADDR, "00a057123456")
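# as the asserts below show, the raw value "00a057123456" is rendered colon-separated
# as "00:a0:57:12:34:56" in the "cns" entry of the discovery payload, while the second
# run feeds an already formatted MAC and expects it to pass through unchanged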
tests = 0
for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123', sug_area = 'roof'):
if id == 'signal_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "ic": "mdi:wifi", "dev": {"name": "Controller - roof", "sa": "Controller - roof", "via_device": "proxy", "mdl": "RSW-1-10001", "mf": "Raymon", "sw": "RSW_400_V1.00.06", "ids": ["controller_123"], "cns": [["mac", "00:a0:57:12:34:56"]]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
assert tests==1
i.set_db_def_value(Register.MAC_ADDR, "00:a0:57:12:34:57")
tests = 0
for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123', sug_area = 'roof'):
if id == 'signal_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "ic": "mdi:wifi", "dev": {"name": "Controller - roof", "sa": "Controller - roof", "via_device": "proxy", "mdl": "RSW-1-10001", "mf": "Raymon", "sw": "RSW_400_V1.00.06", "ids": ["controller_123"], "cns": [["mac", "00:a0:57:12:34:57"]]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
assert tests==1
def test_build_ha_conf5(contr3_data_seq, inv_data_seq3):
i = InfosG3()
i.set_db_def_value(Register.SENSOR_LIST, "01900000")
for key, result in i.parse (contr3_data_seq, sensor=0x0e100000):
pass # side effect in calling i.parse()
for key, result in i.parse (inv_data_seq3, sensor=0x01900000):
pass # side effect in calling i.parse()
i.set_db_def_value(Register.MAC_ADDR, "00a057123456")
tests = 0
for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123', sug_area = 'roof'):
if id == 'signal_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "ic": "mdi:wifi", "dev": {"name": "Controller - roof", "sa": "Controller - roof", "via_device": "proxy", "mdl": "RSW-1-10001", "mf": "Raymon", "sw": "RSW_400_V2.01.13", "ids": ["controller_123"], "cns": [["mac", "00:a0:57:12:34:56"]]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
assert tests==1
i.set_db_def_value(Register.MAC_ADDR, "00:a0:57:12:34:57")
tests = 0
for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123', sug_area = 'roof'):
if id == 'signal_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Signal Strength", "stat_t": "tsun/garagendach/controller", "dev_cla": None, "stat_cla": "measurement", "uniq_id": "signal_123", "val_tpl": "{{value_json[\'Signal_Strength\'] | int}}", "unit_of_meas": "%", "ic": "mdi:wifi", "dev": {"name": "Controller - roof", "sa": "Controller - roof", "via_device": "proxy", "mdl": "RSW-1-10001", "mf": "Raymon", "sw": "RSW_400_V2.01.13", "ids": ["controller_123"], "cns": [["mac", "00:a0:57:12:34:57"]]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
assert tests==1
def test_must_incr_total(inv_data_seq2, inv_data_seq2_zero):
i = InfosG3()
tests = 0
for key, update in i.parse (InvDataSeq2):
for key, update in i.parse (inv_data_seq2, sensor=0x01900001):
if key == 'total' or key == 'inverter' or key == 'env':
assert update == True
tests +=1
assert tests==5
assert tests==12
assert json.dumps(i.db['total']) == json.dumps({'Daily_Generation': 1.7, 'Total_Generation': 17.36})
assert json.dumps(i.db['input']) == json.dumps({"pv1": {"Voltage": 33.6, "Current": 1.91, "Power": 64.5, "Daily_Generation": 1.08, "Total_Generation": 9.74}, "pv2": {"Voltage": 33.5, "Current": 1.36, "Power": 45.7, "Daily_Generation": 0.62, "Total_Generation": 7.62}, "pv3": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}, "pv4": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Temp": 23})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Status": 1, "Inverter_Temp": 23})
tests = 0
for key, update in i.parse (InvDataSeq2):
if key == 'total':
for key, update in i.parse (inv_data_seq2, sensor=0x01900001):
if key == 'total' or key == 'env':
assert update == False
tests +=1
elif key == 'env':
assert update == False
tests +=1
assert tests==3
assert tests==4
assert json.dumps(i.db['total']) == json.dumps({'Daily_Generation': 1.7, 'Total_Generation': 17.36})
assert json.dumps(i.db['input']) == json.dumps({"pv1": {"Voltage": 33.6, "Current": 1.91, "Power": 64.5, "Daily_Generation": 1.08, "Total_Generation": 9.74}, "pv2": {"Voltage": 33.5, "Current": 1.36, "Power": 45.7, "Daily_Generation": 0.62, "Total_Generation": 7.62}, "pv3": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}, "pv4": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Temp": 23})
assert json.dumps(i.db['inverter']) == json.dumps({"Rated_Power": 600, "No_Inputs": 2})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Status": 1, "Inverter_Temp": 23})
assert json.dumps(i.db['inverter']) == json.dumps({"Rated_Power": 600, "BOOT_STATUS": 0, "DSP_STATUS": 21930, "Work_Mode": 0, "Max_Designed_Power": -1, "Input_Coefficient": -0.1, "Output_Coefficient": 100.0, "No_Inputs": 2})
tests = 0
for key, update in i.parse (InvDataSeq2_Zero):
for key, update in i.parse (inv_data_seq2_zero, sensor=0x01900001):
if key == 'total':
assert update == False
tests +=1
elif key == 'env':
assert update == True
tests +=1
assert tests==3
assert tests==4
assert json.dumps(i.db['total']) == json.dumps({'Daily_Generation': 1.7, 'Total_Generation': 17.36})
assert json.dumps(i.db['input']) == json.dumps({"pv1": {"Voltage": 33.6, "Current": 1.91, "Power": 0.0, "Daily_Generation": 1.08, "Total_Generation": 9.74}, "pv2": {"Voltage": 33.5, "Current": 1.36, "Power": 0.0, "Daily_Generation": 0.62, "Total_Generation": 7.62}, "pv3": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}, "pv4": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Temp": 0})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Status": 1, "Inverter_Temp": 0})
def test_must_incr_total2(InvDataSeq2, InvDataSeq2_Zero):
def test_must_incr_total2(inv_data_seq2, inv_data_seq2_zero):
i = InfosG3()
tests = 0
for key, update in i.parse (InvDataSeq2_Zero):
for key, update in i.parse (inv_data_seq2_zero, sensor=0x01900001):
if key == 'total':
assert update == False
tests +=1
@@ -435,42 +745,35 @@ def test_must_incr_total2(InvDataSeq2, InvDataSeq2_Zero):
assert update == True
tests +=1
assert tests==3
assert tests==4
assert json.dumps(i.db['total']) == json.dumps({})
assert json.dumps(i.db['input']) == json.dumps({"pv1": {"Voltage": 33.6, "Current": 1.91, "Power": 0.0}, "pv2": {"Voltage": 33.5, "Current": 1.36, "Power": 0.0}, "pv3": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}, "pv4": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Temp": 0})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Status": 1, "Inverter_Temp": 0})
tests = 0
for key, update in i.parse (InvDataSeq2_Zero):
if key == 'total':
for key, update in i.parse (inv_data_seq2_zero, sensor=0x01900001):
if key == 'total' or key == 'env':
assert update == False
tests +=1
elif key == 'env':
assert update == False
tests +=1
assert tests==3
assert tests==4
assert json.dumps(i.db['total']) == json.dumps({})
assert json.dumps(i.db['input']) == json.dumps({"pv1": {"Voltage": 33.6, "Current": 1.91, "Power": 0.0}, "pv2": {"Voltage": 33.5, "Current": 1.36, "Power": 0.0}, "pv3": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}, "pv4": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Temp": 0})
assert json.dumps(i.db['env']) == json.dumps({"Inverter_Status": 1, "Inverter_Temp": 0})
tests = 0
for key, update in i.parse (InvDataSeq2):
if key == 'total':
assert update == True
for key, update in i.parse (inv_data_seq2, sensor=0x01900001):
if key == 'total' or key == 'env':
tests +=1
elif key == 'env':
assert update == True
tests +=1
assert tests==3
assert tests==4
assert json.dumps(i.db['total']) == json.dumps({'Daily_Generation': 1.7, 'Total_Generation': 17.36})
assert json.dumps(i.db['input']) == json.dumps({"pv1": {"Voltage": 33.6, "Current": 1.91, "Power": 64.5, "Daily_Generation": 1.08, "Total_Generation": 9.74}, "pv2": {"Voltage": 33.5, "Current": 1.36, "Power": 45.7, "Daily_Generation": 0.62, "Total_Generation": 7.62}, "pv3": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}, "pv4": {"Voltage": 0.0, "Current": 0.0, "Power": 0.0}})
def test_new_data_types(InvDataNew):
def test_new_data_types(inv_data_new):
i = InfosG3()
tests = 0
for key, update in i.parse (InvDataNew):
for key, update in i.parse (inv_data_new, sensor=0x01900001):
if key == 'events':
tests +=1
elif key == 'inverter':
@@ -482,12 +785,12 @@ def test_new_data_types(InvDataNew):
else:
assert False
assert tests==15
assert json.dumps(i.db['inverter']) == json.dumps({"Manufacturer": 0})
assert tests==7
assert json.dumps(i.db['inverter']) == json.dumps({"Manufacturer": 0, "DSP_STATUS": 0})
assert json.dumps(i.db['input']) == json.dumps({"pv1": {}})
assert json.dumps(i.db['events']) == json.dumps({"401_": 0, "404_": 0, "405_": 0, "408_": 0, "409_No_Utility": 0, "406_": 0, "416_": 0})
assert json.dumps(i.db['events']) == json.dumps({"Inverter_Alarm": 0, "Inverter_Fault": 0})
def test_invalid_data_type(InvalidDataSeq):
def test_invalid_data_type(invalid_data_seq):
i = InfosG3()
i.static_init() # initialize counter
@@ -495,8 +798,8 @@ def test_invalid_data_type(InvalidDataSeq):
assert val == 0
for key, result in i.parse (InvalidDataSeq):
pass
for key, result in i.parse (invalid_data_seq, sensor=0x01900001):
pass # side effect in calling i.parse()
assert json.dumps(i.db) == json.dumps({"inverter": {"Product_Name": "Microinv"}})
val = i.dev_value(Register.INVALID_DATA_TYPE) # check invalid data type counter


@@ -1,18 +1,33 @@
# test_with_pytest.py
import pytest, json
from app.src.infos import Register
from app.src.gen3plus.infos_g3p import InfosG3P
from app.src.gen3plus.infos_g3p import RegisterMap
import pytest, json, math, random
from infos import Register
from gen3plus.infos_g3p import InfosG3P
from gen3plus.infos_g3p import RegisterMap
@pytest.fixture(scope="session")
def str_test_ip():
ip = ".".join(str(random.randint(1, 254)) for _ in range(4))
print(f'random_ip: {ip}')
return ip
@pytest.fixture(scope="session")
def bytes_test_ip(str_test_ip):
ip = bytes(str.encode(str_test_ip))
l = len(ip)
if l < 16:
ip = ip + bytearray(16-l)
print(f'random_ip: {ip}')
return ip
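# judging by the device_data fixture below, the 0x4110 frame reserves a fixed
# 16-byte, NUL-padded field for the IP address, which is why the random IP string
# is padded here before being spliced into the message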
@pytest.fixture
def DeviceData(): # 0x4110 ftype: 0x02
def device_data(bytes_test_ip): # 0x4110 ftype: 0x02
msg = b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\xba\xd2\x00\x00'
msg += b'\x19\x00\x00\x00\x00\x00\x00\x00\x05\x3c\x78\x01\x64\x01\x4c\x53'
msg += b'\x57\x35\x42\x4c\x45\x5f\x31\x37\x5f\x30\x32\x42\x30\x5f\x31\x2e'
msg += b'\x30\x35\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x40\x2a\x8f\x4f\x51\x54\x31\x39\x32\x2e'
msg += b'\x31\x36\x38\x2e\x38\x30\x2e\x34\x39\x00\x00\x00\x0f\x00\x01\xb0'
msg += b'\x00\x00\x00\x00\x00\x00\x40\x2a\x8f\x4f\x51\x54' + bytes_test_ip
msg += b'\x0f\x00\x01\xb0'
msg += b'\x02\x0f\x00\xff\x56\x31\x2e\x31\x2e\x30\x30\x2e\x30\x42\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xfe\xfe\x00\x00'
@@ -24,7 +39,7 @@ def DeviceData(): # 0x4110 ftype: 0x02
return msg
@pytest.fixture
def InverterData(): # 0x4210 ftype: 0x01
def inverter_data(): # 0x4210 ftype: 0x01
msg = b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\xb0\x02\xbc\xc8'
msg += b'\x24\x32\x6c\x1f\x00\x00\xa0\x47\xe4\x33\x01\x00\x03\x08\x00\x00'
msg += b'\x59\x31\x37\x45\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x45'
@@ -42,6 +57,7 @@ def InverterData(): # 0x4210 ftype: 0x01
msg += b'\x01\x61\x00\xa8\x02\x54\x01\x5a\x00\x8a\x01\xe4\x01\x5a\x00\xbd'
msg += b'\x02\x8f\x00\x11\x00\x01\x00\x00\x00\x0b\x00\x00\x27\x98\x00\x04'
msg += b'\x00\x00\x0c\x04\x00\x03\x00\x00\x0a\xe7\x00\x05\x00\x00\x0c\x75'
msg += b'\x00\x00\x00\x00\x06\x16\x02\x00\x00\x00\x55\xaa\x00\x01\x00\x00'
msg += b'\x00\x00\x00\x00\xff\xff\x07\xd0\x00\x03\x04\x00\x04\x00\x04\x00'
msg += b'\x04\x00\x00\x01\xff\xff\x00\x01\x00\x06\x00\x68\x00\x68\x05\x00'
@@ -54,48 +70,166 @@ def InverterData(): # 0x4210 ftype: 0x01
msg += b'\x00\x00\x00\x00'
return msg
@pytest.fixture
def batterie_data(): # 0x4210 ftype: 0x01
msg = b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x26\x30\xc7\xde'
msg += b'\x2d\x32\x28\x00\x00\x00\x84\x17\x79\x35\x01\x00\x4c\x12\x00\x00'
msg += b'\x34\x31\x30\x31\x32\x34\x30\x37\x30\x31\x34\x39\x30\x33\x31\x34'
msg += b'\x0d\x3a\x00\x70\x0d\x2c\x00\x00\x00\x00\x08\x20\x00\x00\x00\x00'
msg += b'\x14\x0e\xff\xfe\x03\xe8\x0c\x89\x0c\x89\x0c\x89\x0c\x8a\x0c\x89'
msg += b'\x0c\x89\x0c\x8a\x0c\x89\x0c\x89\x0c\x8a\x0c\x8a\x0c\x89\x0c\x89'
msg += b'\x0c\x89\x0c\x89\x0c\x88\x00\x0f\x00\x0f\x00\x0f\x00\x0e\x00\x00'
msg += b'\x00\x00\x00\x0f\x00\x00\x02\x05\x02\x01'
return msg
@pytest.fixture
def batterie_data1(): # 0x4210 ftype: 0x01
msg = b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x26\x30\xc7\xde'
msg += b'\x2d\x32\x28\x00\x00\x00\x84\x17\x79\x35\x01\x00\x4c\x12\x00\x00'
msg += b'\x34\x31\x30\x31\x32\x34\x30\x37\x30\x31\x34\x39\x30\x33\x31\x34'
msg += b'\x0d\x3a\x00\x70\x0d\x2c\x00\x00\x00\x00\x08\x20\x00\x00\x00\x00'
msg += b'\x01\x00\x00\x00\x03\xe8\x0c\x89\x0c\x89\x0c\x89\x0c\x8a\x0c\x89'
msg += b'\x0c\x89\x0c\x8a\x0c\x89\x0c\x89\x0c\x8a\x0c\x8a\x0c\x89\x0c\x89'
msg += b'\x0c\x89\x0c\x89\x0c\x88\x00\x0f\x00\x0f\x00\x0f\x0c\x0e\x01\x00'
msg += b'\x00\x00\x00\x0f\x00\x00\x02\x05\x02\x01'
return msg
@pytest.fixture
def batterie_data2(): # 0x4210 ftype: 0x01
msg = b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x26\x30\xc7\xde'
msg += b'\x2d\x32\x28\x00\x00\x00\x84\x17\x79\x35\x01\x00\x4c\x12\x00\x00'
msg += b'\x34\x31\x30\x31\x32\x34\x30\x37\x30\x31\x34\x39\x30\x33\x31\x34'
msg += b'\x0d\x3a\x00\x70\x0d\x2c\x00\x00\x00\x00\x08\x20\x00\x00\x00\x00'
msg += b'\x14\x0e\x02\xfe\x03\xe8\x0c\x89\x0c\x89\x0c\x89\x0c\x8a\x0c\x89'
msg += b'\x0c\x89\x0c\x8a\x0c\x89\x0c\x89\x0c\x8a\x0c\x8a\x0c\x89\x0c\x89'
msg += b'\x0c\x89\x0c\x89\x0c\x88\x00\x0f\x00\x0f\x00\x0f\x00\x0e'
return msg
def test_default_db():
i = InfosG3P()
i = InfosG3P(client_mode=False)
assert json.dumps(i.db) == json.dumps({
"inverter": {"Manufacturer": "TSUN", "Equipment_Model": "TSOL-MSxx00"},
"inverter": {"Manufacturer": "TSUN", "Equipment_Model": "TSOL-MSxx00", "No_Inputs": 4},
"collector": {"Chip_Type": "IGEN TECH"},
})
def test_parse_4110(DeviceData: bytes):
i = InfosG3P()
def test_parse_4110(str_test_ip, device_data: bytes):
i = InfosG3P(client_mode=False)
i.db.clear()
for key, update in i.parse (DeviceData, 0x41, 2):
pass
for key, update in i.parse (device_data, 0x41, 2):
pass # side effect is calling generator i.parse()
assert json.dumps(i.db) == json.dumps({
'controller': {"Data_Up_Interval": 300, "Collect_Interval": 1, "Heartbeat_Interval": 120, "Signal_Strength": 100, "IP_Address": "192.168.80.49"},
'collector': {"Chip_Model": "LSW5BLE_17_02B0_1.05", "Collector_Fw_Version": "V1.1.00.0B"},
'controller': {"Data_Up_Interval": 300, "Collect_Interval": 1, "Heartbeat_Interval": 120, "Signal_Strength": 100, "IP_Address": str_test_ip, "Sensor_List": "02b0", "WiFi_SSID": "Allius-Home"},
'collector': {"Chip_Model": "LSW5BLE_17_02B0_1.05", "MAC-Addr": "40:2a:8f:4f:51:54", "Collector_Fw_Version": "V1.1.00.0B"},
})
def test_parse_4210(InverterData: bytes):
i = InfosG3P()
def test_build_4110(str_test_ip, device_data: bytes):
i = InfosG3P(client_mode=False)
i.db.clear()
for key, update in i.parse (device_data, 0x41, 2):
pass # side effect is calling generator i.parse()
build_msg = i.build(len(device_data), 0x41, 2)
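# the rebuilt frame can differ from the captured one in a few bytes (indices 11..19,
# presumably volatile counters or timestamps), so they are copied over before comparing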
for i in range(11, 20):
build_msg[i] = device_data[i]
assert device_data == build_msg
def test_parse_4210_02b0(inverter_data: bytes):
i = InfosG3P(client_mode=False)
i.db.clear()
for key, update in i.parse (InverterData, 0x42, 1):
pass
for key, update in i.parse (inverter_data, 0x42, 1, 0x02b0):
pass # side effect is calling generator i.parse()
assert json.dumps(i.db) == json.dumps({
"controller": {"Power_On_Time": 2051},
"inverter": {"Serial_Number": "Y17E00000000000E", "Version": "V4.0.10", "Rated_Power": 600, "Max_Designed_Power": 2000, "No_Inputs": 4},
"env": {"Inverter_Status": 1, "Inverter_Temp": 14},
"controller": {"Sensor_List": "02b0", "Power_On_Time": 2051},
"inverter": {"Serial_Number": "Y17E00000000000E", "Version": "V4.0.10", "Rated_Power": 600, "BOOT_STATUS": 0, "DSP_STATUS": 21930, "Work_Mode": 0, "Max_Designed_Power": 2000, "Input_Coefficient": 100.0, "Output_Coefficient": 100.0},
"env": {"Inverter_Status": 1, "Detect_Status_1": 2, "Detect_Status_2": 0, "Inverter_Temp": 14},
"events": {"Inverter_Alarm": 0, "Inverter_Fault": 0, "Inverter_Bitfield_1": 0, "Inverter_bitfield_2": 0},
"grid": {"Voltage": 224.8, "Current": 0.73, "Frequency": 50.05, "Output_Power": 165.8},
"input": {"pv1": {"Voltage": 35.3, "Current": 1.68, "Power": 59.6, "Daily_Generation": 0.04, "Total_Generation": 30.76},
"pv2": {"Voltage": 34.6, "Current": 1.38, "Power": 48.4, "Daily_Generation": 0.03, "Total_Generation": 27.91},
"pv3": {"Voltage": 34.6, "Current": 1.89, "Power": 65.5, "Daily_Generation": 0.05, "Total_Generation": 31.89},
"pv4": {"Voltage": 1.7, "Current": 0.01, "Power": 0.0, "Total_Generation": 15.58}},
"total": {"Daily_Generation": 0.11, "Total_Generation": 101.36}
"total": {"Daily_Generation": 0.11, "Total_Generation": 101.36},
"inv_unknown": {"Unknown_1": 512},
"other": {"Output_Shutdown": 65535, "Rated_Level": 3, "Grid_Volt_Cal_Coef": 1024, "Prod_Compliance_Type": 6}
})
def test_parse_4210_3026(batterie_data: bytes):
i = InfosG3P(client_mode=False)
i.db.clear()
for key, update in i.parse (batterie_data, 0x42, 1, 0x3026):
pass # side effect is calling generator i.parse()
assert json.dumps(i.db) == json.dumps({
"controller": {"Sensor_List": "3026", "Power_On_Time": 4684},
"inverter": {"Serial_Number": "4101240701490314"},
"batterie": {"pv1": {"Voltage": 33.86, "Current": 1.12, "MPPT-Status": 0},
"pv2": {"Voltage": 33.72, "Current": 0.0, "MPPT-Status": 0},
"batt": {"Total_Charging": 20.8, "Voltage": 51.34, "Current": -0.02, "SOC": 10.0, "Power": -1.0268000000000002, 'Batt_State': 0},
"cell": {"Volt1": 3.21, "Volt2": 3.21, "Volt3": 3.21, "Volt4": 3.21, "Volt5": 3.21, "Volt6": 3.21, "Volt7": 3.21, "Volt8": 3.21, "Volt9": 3.21, "Volt10": 3.21, "Volt11": 3.21, "Volt12": 3.21, "Volt13": 3.21, "Volt14": 3.21, "Volt15": 3.21, "Volt16": 3.21, "Temp_1": 15, "Temp_2": 15, "Temp_3": 15},
"out": {"Voltage": 0.14, "Current": 0.0, "Out_Status": 0, "Power": 0.0, "Suppl_State": 0},
"Controller_Temp": 15, "Batterie_Alarm": 0, "Hardware_Version": 517, "Software_Version": 513,
"PV_Power": 37.9232},
})
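# plausibility check of the derived fields above (the relations are inferred from the
# expected JSON, not taken from the implementation): power is voltage times current,
# e.g. batt Power = 51.34 V * -0.02 A = -1.0268 W and
# PV_Power = 33.86 V * 1.12 A + 33.72 V * 0.0 A = 37.9232 W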
def test_parse_4210_3026_prod(batterie_data1: bytes):
i = InfosG3P(client_mode=False)
i.db.clear()
for key, update in i.parse (batterie_data1, 0x42, 1, 0x3026):
pass # side effect is calling generator i.parse()
assert json.dumps(i.db) == json.dumps({
"controller": {"Sensor_List": "3026", "Power_On_Time": 4684},
"inverter": {"Serial_Number": "4101240701490314"},
"batterie": {"pv1": {"Voltage": 33.86, "Current": 1.12, "MPPT-Status": 0},
"pv2": {"Voltage": 33.72, "Current": 0.0, "MPPT-Status": 0},
"batt": {"Total_Charging": 20.8, "Voltage": 2.56, "Current": 0.0, "SOC": 10.0, "Power": 0.0, 'Batt_State': 1},
"cell": {"Volt1": 3.21, "Volt2": 3.21, "Volt3": 3.21, "Volt4": 3.21, "Volt5": 3.21, "Volt6": 3.21, "Volt7": 3.21, "Volt8": 3.21, "Volt9": 3.21, "Volt10": 3.21, "Volt11": 3.21, "Volt12": 3.21, "Volt13": 3.21, "Volt14": 3.21, "Volt15": 3.21, "Volt16": 3.21, "Temp_1": 15, "Temp_2": 15, "Temp_3": 15},
"out": {"Voltage": 30.86, "Current": 2.56, "Out_Status": 0, "Power": 79.0016, "Suppl_State": 1},
"Controller_Temp": 15, "Batterie_Alarm": 0, "Hardware_Version": 517, "Software_Version": 513,
"PV_Power": 37.9232},
})
def test_parse_4210_3026_incomplete(batterie_data2: bytes):
i = InfosG3P(client_mode=False)
i.db.clear()
for key, update in i.parse (batterie_data2, 0x42, 1, 0x3026):
pass # side effect is calling generator i.parse()
assert json.dumps(i.db) == json.dumps({
"controller": {"Sensor_List": "3026", "Power_On_Time": 4684},
"inverter": {"Serial_Number": "4101240701490314"},
"batterie": {"pv1": {"Voltage": 33.86, "Current": 1.12, "MPPT-Status": 0},
"pv2": {"Voltage": 33.72, "Current": 0.0, "MPPT-Status": 0},
"batt": {"Total_Charging": 20.8, "Voltage": 51.34, "Current": 7.66, "SOC": 10.0, "Power": 393.2644, 'Batt_State': 2},
"cell": {"Volt1": 3.21, "Volt2": 3.21, "Volt3": 3.21, "Volt4": 3.21, "Volt5": 3.21, "Volt6": 3.21, "Volt7": 3.21, "Volt8": 3.21, "Volt9": 3.21, "Volt10": 3.21, "Volt11": 3.21, "Volt12": 3.21, "Volt13": 3.21, "Volt14": 3.21, "Volt15": 3.21, "Volt16": 3.21, "Temp_1": 15, "Temp_2": 15, "Temp_3": 15},
"out": {"Voltage": 0.14, "Current": None, "Out_Status": None, "Power": None, "Suppl_State": None},
"Controller_Temp": None, "Batterie_Alarm": None, "Hardware_Version": None, "Software_Version": None,
"PV_Power": 37.9232},
})
def test_build_4210(inverter_data: bytes):
i = InfosG3P(client_mode=False)
i.db.clear()
for key, update in i.parse (inverter_data, 0x42, 1, 0x02b0):
pass # side effect is calling generator i.parse()
build_msg = i.build(len(inverter_data), 0x42, 1, 0x02b0)
for i in range(11, 31):
build_msg[i] = inverter_data[i]
assert inverter_data == build_msg
def test_build_ha_conf1():
i = InfosG3P()
i = InfosG3P(client_mode=False)
i.static_init() # initialize counter
i.set_db_def_value(Register.SENSOR_LIST, "02b0")
tests = 0
for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123'):
@@ -116,8 +250,19 @@ def test_build_ha_conf1():
tests +=1
elif id == 'power_pv2_123':
assert False # if we haven't received and parsed a control data msg, we don't know the number of inputs. In this case we only register the first one!!
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv2_123", "val_tpl": "{{ (value_json['pv2']['Power'] | float)}}", "unit_of_meas": "W", "dev": {"name": "Module PV2", "sa": "Module PV2", "via_device": "inverter_123", "ids": ["input_pv2_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'power_pv3_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv3_123", "val_tpl": "{{ (value_json['pv3']['Power'] | float)}}", "unit_of_meas": "W", "dev": {"name": "Module PV3", "sa": "Module PV3", "via_device": "inverter_123", "ids": ["input_pv3_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'power_pv4_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv4_123", "val_tpl": "{{ (value_json['pv4']['Power'] | float)}}", "unit_of_meas": "W", "dev": {"name": "Module PV4", "sa": "Module PV4", "via_device": "inverter_123", "ids": ["input_pv4_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'signal_123':
assert comp == 'sensor'
@@ -126,9 +271,13 @@ def test_build_ha_conf1():
elif id == 'inv_count_456':
assert False
assert tests==4
assert tests==7
def test_build_ha_conf2():
i = InfosG3P(client_mode=False)
i.static_init() # initialize counter
tests = 0
for d_json, comp, node_id, id in i.ha_proxy_confs(ha_prfx="tsun/", node_id = 'proxy/', snr = '456'):
if id == 'out_power_123':
@@ -138,8 +287,11 @@ def test_build_ha_conf1():
elif id == 'power_pv1_123':
assert False
elif id == 'power_pv2_123':
assert False # if we haven't received and parsed a control data msg, we don't know the number of inputs. In this case we only register the first one!!
assert False
elif id == 'power_pv3_123':
assert False
elif id == 'power_pv4_123':
assert False
elif id == 'signal_123':
assert False
elif id == 'inv_count_456':
@@ -147,30 +299,166 @@ def test_build_ha_conf1():
assert d_json == json.dumps({"name": "Active Inverter Connections", "stat_t": "tsun/proxy/proxy", "dev_cla": None, "stat_cla": None, "uniq_id": "inv_count_456", "val_tpl": "{{value_json['Inverter_Cnt'] | int}}", "ic": "mdi:counter", "dev": {"name": "Proxy", "sa": "Proxy", "mdl": "proxy", "mf": "Stefan Allius", "sw": "unknown", "ids": ["proxy"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
assert tests==5
assert tests==1
def test_exception_and_eval(InverterData: bytes):
def test_build_ha_conf3():
i = InfosG3P(client_mode=True)
i.static_init() # initialize counter
i.set_db_def_value(Register.SENSOR_LIST, "02b0")
# add eval to convert temperature from °F to °C
RegisterMap.map[0x420100d8]['eval'] = '(result-32)/1.8'
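# worked example of the eval above: a raw reading of 57.2 °F becomes (57.2 - 32) / 1.8 = 14.0 °C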
tests = 0
for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123'):
if id == 'out_power_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/grid", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "out_power_123", "val_tpl": "{{value_json['Output_Power'] | float}}", "unit_of_meas": "W", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "mdl": "TSOL-MSxx00", "mf": "TSUN", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'daily_gen_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Daily Generation", "stat_t": "tsun/garagendach/total", "dev_cla": "energy", "stat_cla": "total_increasing", "uniq_id": "daily_gen_123", "val_tpl": "{{value_json['Daily_Generation'] | float}}", "unit_of_meas": "kWh", "ic": "mdi:solar-power-variant", "dev": {"name": "Micro Inverter", "sa": "Micro Inverter", "via_device": "controller_123", "mdl": "TSOL-MSxx00", "mf": "TSUN", "ids": ["inverter_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'power_pv1_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv1_123", "val_tpl": "{{ (value_json['pv1']['Power'] | float)}}", "unit_of_meas": "W", "dev": {"name": "Module PV1", "sa": "Module PV1", "via_device": "inverter_123", "ids": ["input_pv1_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'power_pv2_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv2_123", "val_tpl": "{{ (value_json['pv2']['Power'] | float)}}", "unit_of_meas": "W", "dev": {"name": "Module PV2", "sa": "Module PV2", "via_device": "inverter_123", "ids": ["input_pv2_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'power_pv3_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv3_123", "val_tpl": "{{ (value_json['pv3']['Power'] | float)}}", "unit_of_meas": "W", "dev": {"name": "Module PV3", "sa": "Module PV3", "via_device": "inverter_123", "ids": ["input_pv3_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'power_pv4_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Power", "stat_t": "tsun/garagendach/input", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "power_pv4_123", "val_tpl": "{{ (value_json['pv4']['Power'] | float)}}", "unit_of_meas": "W", "dev": {"name": "Module PV4", "sa": "Module PV4", "via_device": "inverter_123", "ids": ["input_pv4_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'signal_123':
assert comp == 'sensor'
assert d_json == json.dumps({})
tests +=1
elif id == 'inv_count_456':
assert False
assert tests==7
def test_build_ha_conf4():
i = InfosG3P(client_mode=True)
i.static_init() # initialize counter
tests = 0
for d_json, comp, node_id, id in i.ha_proxy_confs(ha_prfx="tsun/", node_id = 'proxy/', snr = '456'):
if id == 'out_power_123':
assert False
elif id == 'daily_gen_123':
assert False
elif id == 'power_pv1_123':
assert False
elif id == 'power_pv2_123':
assert False
elif id == 'power_pv3_123':
assert False
elif id == 'power_pv4_123':
assert False
elif id == 'signal_123':
assert False
elif id == 'inv_count_456':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Active Inverter Connections", "stat_t": "tsun/proxy/proxy", "dev_cla": None, "stat_cla": None, "uniq_id": "inv_count_456", "val_tpl": "{{value_json['Inverter_Cnt'] | int}}", "ic": "mdi:counter", "dev": {"name": "Proxy", "sa": "Proxy", "mdl": "proxy", "mf": "Stefan Allius", "sw": "unknown", "ids": ["proxy"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
assert tests==1
def test_build_ha_conf5():
i = InfosG3P(client_mode=True)
i.static_init() # initialize counter
i.set_db_def_value(Register.SENSOR_LIST, "3026")
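# sensor list "3026" apparently selects the battery (DCU) register map, so the
# HA discovery entries checked below describe battery sensors instead of PV inputs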
tests = 0
for d_json, comp, node_id, id in i.ha_confs(ha_prfx="tsun/", node_id="garagendach/", snr='123'):
if id == 'out_power_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Supply Power", "stat_t": "tsun/garagendach/batterie", "dev_cla": "power", "stat_cla": "measurement", "uniq_id": "out_power_123", "val_tpl": "{{ (value_json['out']['Power'] | int)}}", "unit_of_meas": "W", "dev": {"name": "Batterie", "sa": "Batterie", "via_device": "controller_123", "mdl": "TSOL-MSxx00", "mf": "TSUN", "ids": ["batterie_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'daily_gen_123':
assert False
elif id == 'volt_pv1_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Voltage", "stat_t": "tsun/garagendach/batterie", "dev_cla": "voltage", "stat_cla": "measurement", "uniq_id": "volt_pv1_123", "val_tpl": "{{ (value_json['pv1']['Voltage'] | float)}}", "unit_of_meas": "V", "ic": "mdi:gauge", "ent_cat": "diagnostic", "dev": {"name": "Module PV1", "sa": "Module PV1", "via_device": "batterie_123", "ids": ["bat_inp_pv1_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'volt_pv2_123':
assert comp == 'sensor'
assert d_json == json.dumps({"name": "Voltage", "stat_t": "tsun/garagendach/batterie", "dev_cla": "voltage", "stat_cla": "measurement", "uniq_id": "volt_pv2_123", "val_tpl": "{{ (value_json['pv2']['Voltage'] | float)}}", "unit_of_meas": "V", "ic": "mdi:gauge", "ent_cat": "diagnostic", "dev": {"name": "Module PV2", "sa": "Module PV2", "via_device": "batterie_123", "ids": ["bat_inp_pv2_123"]}, "o": {"name": "proxy", "sw": "unknown"}})
tests +=1
elif id == 'signal_123':
assert comp == 'sensor'
assert d_json == json.dumps({})
tests +=1
elif id == 'inv_count_456':
assert False
else:
print(id)
assert tests==4
def test_exception_and_calc(inverter_data: bytes):
# patch table to convert temperature from °F to °C
ofs = RegisterMap.map_02b0[0x420100d8]['offset']
RegisterMap.map_02b0[0x420100d8]['quotient'] = 1.8
RegisterMap.map_02b0[0x420100d8]['offset'] = -32/1.8
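# presumably applied as value/quotient + offset, i.e. value/1.8 - 32/1.8 == (value-32)/1.8 (°F -> °C)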
# map PV1_VOLTAGE to invalid register
RegisterMap.map[0x420100e0]['reg'] = Register.TEST_REG2
RegisterMap.map_02b0[0x420100e0]['reg'] = Register.TEST_REG2
# set an invalid mapping entry for OUTPUT_POWER (string instead of dict type)
Backup = RegisterMap.map[0x420100de]
RegisterMap.map[0x420100de] = 'invalid_entry'
backup = RegisterMap.map_02b0[0x420100de]
RegisterMap.map_02b0[0x420100de] = 'invalid_entry'
i = InfosG3P()
# i.db.clear()
i = InfosG3P(client_mode=False)
i.db.clear()
for key, update in i.parse (InverterData, 0x42, 1):
pass
assert 12.2222 == round (i.get_db_value(Register.INVERTER_TEMP, 0),4)
for key, update in i.parse (inverter_data, 0x42, 1, 0x02b0):
pass # side effect is calling generator i.parse()
assert math.isclose(12.2222, round (i.get_db_value(Register.INVERTER_TEMP, 0),4), rel_tol=1e-09, abs_tol=1e-09)
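# the raw register holds 54 °F (asserted below once the patch is removed); (54 - 32) / 1.8 = 12.2222 °C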
del RegisterMap.map[0x420100d8]['eval'] # remove eval
RegisterMap.map[0x420100e0]['reg'] = Register.PV1_VOLTAGE # reset mapping
RegisterMap.map[0x420100de] = Backup # reset mapping
build_msg = i.build(len(inverter_data), 0x42, 1, 0x02b0)
assert build_msg[32:0xde] == inverter_data[32:0xde]
assert build_msg[0xde:0xe2] == b'\x00\x00\x00\x00'
assert build_msg[0xe2:-1] == inverter_data[0xe2:-1]
# remove a table entry and test parsing and building
del RegisterMap.map_02b0[0x420100d8]['quotient']
del RegisterMap.map_02b0[0x420100d8]['offset']
i.db.clear()
for key, update in i.parse (InverterData, 0x42, 1):
pass
for key, update in i.parse (inverter_data, 0x42, 1, 0x02b0):
pass # side effect is calling generator i.parse()
assert 54 == i.get_db_value(Register.INVERTER_TEMP, 0)
build_msg = i.build(len(inverter_data), 0x42, 1, 0x02b0)
assert build_msg[32:0xd8] == inverter_data[32:0xd8]
assert build_msg[0xd8:0xe2] == b'\x006\x00\x00\x02X\x00\x00\x00\x00'
assert build_msg[0xe2:-1] == inverter_data[0xe2:-1]
# test restore table
RegisterMap.map_02b0[0x420100d8]['offset'] = ofs
RegisterMap.map_02b0[0x420100e0]['reg'] = Register.PV1_VOLTAGE # reset mapping
RegisterMap.map_02b0[0x420100de] = backup # reset mapping
# test the original table
i.db.clear()
for key, update in i.parse (inverter_data, 0x42, 1, 0x02b0):
pass # side effect is calling generator i.parse()
assert 14 == i.get_db_value(Register.INVERTER_TEMP, 0)
build_msg = i.build(len(inverter_data), 0x42, 1, 0x02b0)
assert build_msg[32:-1] == inverter_data[32:-1]


@@ -0,0 +1,416 @@
# test_with_pytest.py
import pytest
import asyncio
import gc
from mock import patch
from enum import Enum
from infos import Infos
from cnf.config import Config
from gen3.talent import Talent
from inverter_base import InverterBase
from singleton import Singleton
from async_stream import AsyncStream, AsyncStreamClient
from test_modbus_tcp import patch_mqtt_err, patch_mqtt_except, test_port, test_hostname
pytest_plugins = ('pytest_asyncio',)
# initialize the proxy statistics
Infos.static_init()
@pytest.fixture
def config_conn():
Config.act_config = {
'mqtt':{
'host': test_hostname,
'port': test_port,
'user': '',
'passwd': ''
},
'ha':{
'auto_conf_prefix': 'homeassistant',
'discovery_prefix': 'homeassistant',
'entity_prefix': 'tsun',
'proxy_node_id': 'test_1',
'proxy_unique_id': ''
},
'tsun':{'enabled': True, 'host': 'test_cloud.local', 'port': 1234}, 'inverters':{'allow_all':True}
}
@pytest.fixture(scope="module", autouse=True)
def module_init():
Singleton._instances.clear()
yield
class FakeReader():
def __init__(self):
self.on_recv = asyncio.Event()
async def read(self, max_len: int):
await self.on_recv.wait()
return b''
def feed_eof(self):
return
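# FakeReader.read() blocks until on_recv is set and then returns b'', simulating an
# EOF so the receive loop of the connection under test can terminate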
class FakeWriter():
peer = ('47.1.2.3', 10000)
def write(self, buf: bytes):
return
def get_extra_info(self, sel: str):
if sel == 'peername':
return self.peer
elif sel == 'sockname':
return 'sock:1234'
assert False
def is_closing(self):
return False
def close(self):
return
async def wait_closed(self):
return
class MockType(Enum):
RD_TEST_0_BYTES = 1
RD_TEST_TIMEOUT = 2
RD_TEST_EXCEPT = 3
test = MockType.RD_TEST_0_BYTES
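# the module-level 'test' flag selects how the patched asyncio.open_connection below
# behaves: return a fake connection, raise ConnectionRefusedError, or raise ValueError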
@pytest.fixture
def patch_open_connection():
async def new_conn(conn):
await asyncio.sleep(0)
return FakeReader(), FakeWriter()
def new_open(host: str, port: int):
if test == MockType.RD_TEST_TIMEOUT:
raise ConnectionRefusedError
elif test == MockType.RD_TEST_EXCEPT:
raise ValueError("Value cannot be negative") # Compliant
return new_conn(None)
with patch.object(asyncio, 'open_connection', new_open) as conn:
yield conn
@pytest.fixture
def patch_healthy():
with patch.object(AsyncStream, 'healthy') as conn:
yield conn
@pytest.fixture
def patch_unhealthy():
def new_healthy(self):
return False
with patch.object(AsyncStream, 'healthy', new_healthy) as conn:
yield conn
@pytest.fixture
def patch_unhealthy_remote():
def new_healthy(self):
return False
with patch.object(AsyncStreamClient, 'healthy', new_healthy) as conn:
yield conn
def test_inverter_iter():
InverterBase._registry.clear()
cnt = 0
reader = FakeReader()
writer = FakeWriter()
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
for inv in InverterBase:
assert inv == inverter
cnt += 1
del inv
del inverter
assert cnt == 1
for inv in InverterBase:
assert False
def test_method_calls(patch_healthy):
spy = patch_healthy
InverterBase._registry.clear()
reader = FakeReader()
writer = FakeWriter()
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
assert inverter.local.stream
assert inverter.local.ifc
# call healthy inside the context manager
for inv in InverterBase:
assert inv.healthy()
del inv
spy.assert_called_once()
# outside the context manager the healthy function of AsyncStream is no longer reachable
cnt = 0
for inv in InverterBase:
assert inv.healthy()
cnt += 1
del inv
assert cnt == 1
spy.assert_called_once() # counter doesn't increase; it stays at one!
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
def test_unhealthy(patch_unhealthy):
_ = patch_unhealthy
InverterBase._registry.clear()
reader = FakeReader()
writer = FakeWriter()
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
assert inverter.local.stream
assert inverter.local.ifc
# call healthy inside the context manager
assert not inverter.healthy()
# outside the context manager the unhealthy AsyncStream is released
cnt = 0
for inv in InverterBase:
assert inv.healthy() # inverter is healthy again (without the unhealthy AsyncStream)
cnt += 1
del inv
assert cnt == 1
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
def test_unhealthy_remote(patch_unhealthy_remote):
_ = patch_unhealthy_remote
InverterBase._registry.clear()
reader = FakeReader()
writer = FakeWriter()
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
assert inverter.local.stream
assert inverter.local.ifc
# call healthy inside the context manager
assert not inverter.healthy()
# outside the context manager the unhealthy AsyncStream is released
cnt = 0
for inv in InverterBase:
assert inv.healthy() # inverter is healthy again (without the unhealthy AsyncStream)
cnt += 1
del inv
assert cnt == 1
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_conn(config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
reader = FakeReader()
writer = FakeWriter()
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
await inverter.create_remote()
await asyncio.sleep(0)
assert inverter.remote.stream
assert inverter.remote.ifc
# call healthy inside the context manager
assert inverter.healthy()
# call healthy outside the context manager (__exit__() was called)
assert inverter.healthy()
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_conn_to_private(config_conn, patch_open_connection):
'''check DNS resolving of the TSUN FQDN to a local address'''
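# when the peer address is private (or loopback), the test expects the proxy to
# disable the TSUN cloud connection; see the assert on Config.act_config['tsun']['enabled'] below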
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
InverterBase._registry.clear()
reader = FakeReader()
writer = FakeWriter()
FakeWriter.peer = ("192.168.0.1", 10000)
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
assert inverter.local.stream
assert inverter.local.ifc
await inverter.create_remote()
await asyncio.sleep(0)
assert not Config.act_config['tsun']['enabled']
assert inverter.remote.stream
assert inverter.remote.ifc
assert inverter.local.ifc.healthy()
# outside the context manager the unhealthy AsyncStream is released
FakeWriter.peer = ("47.1.2.3", 10000)
cnt = 0
for inv in InverterBase:
assert inv.healthy() # inverter is healthy again (without the unhealthy AsyncStream)
cnt += 1
del inv
assert cnt == 1
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_conn_to_loopback(config_conn, patch_open_connection):
'''check DNS resolving of the TSUN FQDN to the loopback address'''
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
InverterBase._registry.clear()
reader = FakeReader()
writer = FakeWriter()
FakeWriter.peer = ("127.0.0.1", 10000)
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
assert inverter.local.stream
assert inverter.local.ifc
await inverter.create_remote()
await asyncio.sleep(0)
assert not Config.act_config['tsun']['enabled']
assert inverter.remote.stream
assert inverter.remote.ifc
assert inverter.local.ifc.healthy()
# outside the context manager the unhealthy AsyncStream is released
FakeWriter.peer = ("47.1.2.3", 10000)
cnt = 0
for inv in InverterBase:
assert inv.healthy() # inverter is healthy again (without the unhealthy AsyncStream)
cnt += 1
del inv
assert cnt == 1
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_conn_to_none(config_conn, patch_open_connection):
'''check if get_extra_info() returns None in case of an error'''
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
InverterBase._registry.clear()
reader = FakeReader()
writer = FakeWriter()
FakeWriter.peer = None
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
assert inverter.local.stream
assert inverter.local.ifc
await inverter.create_remote()
await asyncio.sleep(0)
assert Config.act_config['tsun']['enabled']
assert inverter.remote.stream
assert inverter.remote.ifc
assert inverter.local.ifc.healthy()
# outside the context manager the unhealthy AsyncStream is released
FakeWriter.peer = ("47.1.2.3", 10000)
cnt = 0
for inv in InverterBase:
assert inv.healthy() # inverter is healthy again (without the unhealthy AsyncStream)
cnt += 1
del inv
assert cnt == 1
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_unhealthy_remote(config_conn, patch_open_connection, patch_unhealthy_remote):
_ = config_conn
_ = patch_open_connection
_ = patch_unhealthy_remote
assert asyncio.get_running_loop()
InverterBase._registry.clear()
reader = FakeReader()
writer = FakeWriter()
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
assert inverter.local.stream
assert inverter.local.ifc
await inverter.create_remote()
await asyncio.sleep(0)
assert inverter.remote.stream
assert inverter.remote.ifc
assert inverter.local.ifc.healthy()
assert not inverter.remote.ifc.healthy()
# call healthy inside the context manager
assert not inverter.healthy()
# outside the context manager the unhealthy AsyncStream is released
cnt = 0
for inv in InverterBase:
assert inv.healthy() # inverter is healthy again (without the unhealthy AsyncStream)
cnt += 1
del inv
assert cnt == 1
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_disc(config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
reader = FakeReader()
writer = FakeWriter()
with InverterBase(reader, writer, 'tsun', Talent) as inverter:
await inverter.create_remote()
await asyncio.sleep(0)
assert inverter.remote.stream
# call disc inside the context manager
await inverter.disc()
# call disc outside the context manager (__exit__() was called)
await inverter.disc()
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0


@@ -0,0 +1,226 @@
# test_with_pytest.py
import pytest
import asyncio
import sys,gc
from mock import patch
from enum import Enum
from infos import Infos
from cnf.config import Config
from proxy import Proxy
from inverter_base import InverterBase
from singleton import Singleton
from gen3.inverter_g3 import InverterG3
from async_stream import AsyncStream
from test_modbus_tcp import patch_mqtt_err, patch_mqtt_except, test_port, test_hostname
pytest_plugins = ('pytest_asyncio',)
# initialize the proxy statistics
Infos.static_init()
@pytest.fixture
def config_conn():
Config.act_config = {
'mqtt':{
'host': test_hostname,
'port': test_port,
'user': '',
'passwd': ''
},
'ha':{
'auto_conf_prefix': 'homeassistant',
'discovery_prefix': 'homeassistant',
'entity_prefix': 'tsun',
'proxy_node_id': 'test_1',
'proxy_unique_id': ''
},
'tsun':{'enabled': True, 'host': 'test_cloud.local', 'port': 1234}, 'inverters':{'allow_all':True}
}
@pytest.fixture(scope="module", autouse=True)
def module_init():
Singleton._instances.clear()
yield
class FakeReader():
def __init__(self):
self.on_recv = asyncio.Event()
async def read(self, max_len: int):
await self.on_recv.wait()
return b''
def feed_eof(self):
return
class FakeWriter():
def write(self, buf: bytes):
return
def get_extra_info(self, sel: str):
if sel == 'peername':
return ('47.1.2.3', 10000)
elif sel == 'sockname':
return 'sock:1234'
assert False
def is_closing(self):
return False
def close(self):
return
async def wait_closed(self):
return
class MockType(Enum):
RD_TEST_0_BYTES = 1
RD_TEST_TIMEOUT = 2
RD_TEST_EXCEPT = 3
test = MockType.RD_TEST_0_BYTES
@pytest.fixture
def patch_open_connection():
async def new_conn(conn):
await asyncio.sleep(0)
return FakeReader(), FakeWriter()
def new_open(host: str, port: int):
if test == MockType.RD_TEST_TIMEOUT:
raise ConnectionRefusedError
elif test == MockType.RD_TEST_EXCEPT:
raise ValueError("Value cannot be negative") # Compliant
return new_conn(None)
with patch.object(asyncio, 'open_connection', new_open) as conn:
yield conn
@pytest.fixture
def patch_healthy():
with patch.object(AsyncStream, 'healthy') as conn:
yield conn
def test_method_calls(patch_healthy):
spy = patch_healthy
reader = FakeReader()
writer = FakeWriter()
InverterBase._registry.clear()
with InverterG3(reader, writer) as inverter:
assert inverter.local.stream
assert inverter.local.ifc
for inv in InverterBase:
inv.healthy()
del inv
spy.assert_called_once()
del inverter
cnt = 0
for inv in InverterBase:
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_conn(config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
with InverterG3(FakeReader(), FakeWriter()) as inverter:
await inverter.create_remote()
await asyncio.sleep(0)
assert inverter.remote.stream
del inverter
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_remote_except(config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
global test
test = MockType.RD_TEST_TIMEOUT
with InverterG3(FakeReader(), FakeWriter()) as inverter:
await inverter.create_remote()
await asyncio.sleep(0)
assert inverter.remote.stream==None
test = MockType.RD_TEST_EXCEPT
await inverter.create_remote()
await asyncio.sleep(0)
assert inverter.remote.stream==None
del inverter
test = MockType.RD_TEST_0_BYTES
cnt = 0
for inv in InverterBase:
print(f'InverterBase refs:{gc.get_referrers(inv)}')
cnt += 1
assert cnt == 0
@pytest.mark.asyncio
async def test_mqtt_publish(config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
Proxy.class_init()
with InverterG3(FakeReader(), FakeWriter()) as inverter:
stream = inverter.local.stream
await inverter.async_publ_mqtt() # check call with invalid unique_id
stream._Talent__set_serial_no(serial_no= "123344")
stream.new_data['inverter'] = True
stream.db.db['inverter'] = {}
await inverter.async_publ_mqtt()
assert stream.new_data['inverter'] == False
stream.new_data['env'] = True
stream.db.db['env'] = {}
await inverter.async_publ_mqtt()
assert stream.new_data['env'] == False
Infos.new_stat_data['proxy'] = True
await inverter.async_publ_mqtt()
assert Infos.new_stat_data['proxy'] == False
@pytest.mark.asyncio
async def test_mqtt_err(config_conn, patch_open_connection, patch_mqtt_err):
_ = config_conn
_ = patch_open_connection
_ = patch_mqtt_err
assert asyncio.get_running_loop()
Proxy.class_init()
with InverterG3(FakeReader(), FakeWriter()) as inverter:
stream = inverter.local.stream
stream._Talent__set_serial_no(serial_no= "123344")
stream.new_data['inverter'] = True
stream.db.db['inverter'] = {}
await inverter.async_publ_mqtt()
assert stream.new_data['inverter'] == True
@pytest.mark.asyncio
async def test_mqtt_except(config_conn, patch_open_connection, patch_mqtt_except):
_ = config_conn
_ = patch_open_connection
_ = patch_mqtt_except
assert asyncio.get_running_loop()
Proxy.class_init()
with InverterG3(FakeReader(), FakeWriter()) as inverter:
stream = inverter.local.stream
stream._Talent__set_serial_no(serial_no= "123344")
stream.new_data['inverter'] = True
stream.db.db['inverter'] = {}
await inverter.async_publ_mqtt()
assert stream.new_data['inverter'] == True


@@ -0,0 +1,199 @@
# test_with_pytest.py
import pytest
import asyncio
from mock import patch
from enum import Enum
from infos import Infos
from cnf.config import Config
from proxy import Proxy
from inverter_base import InverterBase
from singleton import Singleton
from gen3plus.inverter_g3p import InverterG3P
from test_modbus_tcp import patch_mqtt_err, patch_mqtt_except, test_port, test_hostname
pytest_plugins = ('pytest_asyncio',)
# initialize the proxy statistics
Infos.static_init()
@pytest.fixture
def config_conn():
Config.act_config = {
'mqtt':{
'host': test_hostname,
'port': test_port,
'user': '',
'passwd': ''
},
'ha':{
'auto_conf_prefix': 'homeassistant',
'discovery_prefix': 'homeassistant',
'entity_prefix': 'tsun',
'proxy_node_id': 'test_1',
'proxy_unique_id': ''
},
'solarman':{'enabled': True, 'host': 'test_cloud.local', 'port': 1234}, 'inverters':{'allow_all':True}
}
@pytest.fixture(scope="module", autouse=True)
def module_init():
Singleton._instances.clear()
yield
class FakeReader():
def __init__(self):
self.on_recv = asyncio.Event()
async def read(self, max_len: int):
await self.on_recv.wait()
return b''
def feed_eof(self):
return
class FakeWriter():
def write(self, buf: bytes):
return
def get_extra_info(self, sel: str):
if sel == 'peername':
return ('47.1.2.3', 10000)
elif sel == 'sockname':
return 'sock:1234'
assert False
def is_closing(self):
return False
def close(self):
return
async def wait_closed(self):
return
class MockType(Enum):
RD_TEST_0_BYTES = 1
RD_TEST_TIMEOUT = 2
RD_TEST_EXCEPT = 3
test = MockType.RD_TEST_0_BYTES
@pytest.fixture
def patch_open_connection():
async def new_conn(conn):
await asyncio.sleep(0)
return FakeReader(), FakeWriter()
def new_open(host: str, port: int):
if test == MockType.RD_TEST_TIMEOUT:
raise ConnectionRefusedError
elif test == MockType.RD_TEST_EXCEPT:
raise ValueError("Value cannot be negative") # Compliant
return new_conn(None)
with patch.object(asyncio, 'open_connection', new_open) as conn:
yield conn
def test_method_calls(config_conn):
_ = config_conn
reader = FakeReader()
writer = FakeWriter()
InverterBase._registry.clear()
with InverterG3P(reader, writer, client_mode=False) as inverter:
assert inverter.local.stream
assert inverter.local.ifc
@pytest.mark.asyncio
async def test_remote_conn(config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
with InverterG3P(FakeReader(), FakeWriter(), client_mode=False) as inverter:
await inverter.create_remote()
await asyncio.sleep(0)
assert inverter.remote.stream
@pytest.mark.asyncio
async def test_remote_except(config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
global test
test = MockType.RD_TEST_TIMEOUT
with InverterG3P(FakeReader(), FakeWriter(), client_mode=False) as inverter:
await inverter.create_remote()
await asyncio.sleep(0)
assert inverter.remote.stream==None
test = MockType.RD_TEST_EXCEPT
await inverter.create_remote()
await asyncio.sleep(0)
assert inverter.remote.stream==None
test = MockType.RD_TEST_0_BYTES
@pytest.mark.asyncio
async def test_mqtt_publish(config_conn, patch_open_connection):
_ = config_conn
_ = patch_open_connection
assert asyncio.get_running_loop()
Proxy.class_init()
with InverterG3P(FakeReader(), FakeWriter(), client_mode=False) as inverter:
stream = inverter.local.stream
await inverter.async_publ_mqtt() # check call with invalid unique_id
stream._set_serial_no(snr= 123344)
stream.new_data['inverter'] = True
stream.db.db['inverter'] = {}
await inverter.async_publ_mqtt()
assert stream.new_data['inverter'] == False
stream.new_data['env'] = True
stream.db.db['env'] = {}
await inverter.async_publ_mqtt()
assert stream.new_data['env'] == False
Infos.new_stat_data['proxy'] = True
await inverter.async_publ_mqtt()
assert Infos.new_stat_data['proxy'] == False
@pytest.mark.asyncio
async def test_mqtt_err(config_conn, patch_open_connection, patch_mqtt_err):
_ = config_conn
_ = patch_open_connection
_ = patch_mqtt_err
assert asyncio.get_running_loop()
Proxy.class_init()
with InverterG3P(FakeReader(), FakeWriter(), client_mode=False) as inverter:
stream = inverter.local.stream
stream._set_serial_no(snr= 123344)
stream.new_data['inverter'] = True
stream.db.db['inverter'] = {}
await inverter.async_publ_mqtt()
assert stream.new_data['inverter'] == True
@pytest.mark.asyncio
async def test_mqtt_except(config_conn, patch_open_connection, patch_mqtt_except):
_ = config_conn
_ = patch_open_connection
_ = patch_mqtt_except
assert asyncio.get_running_loop()
Proxy.class_init()
with InverterG3P(FakeReader(), FakeWriter(), client_mode=False) as inverter:
stream = inverter.local.stream
stream._set_serial_no(snr= 123344)
stream.new_data['inverter'] = True
stream.db.db['inverter'] = {}
await inverter.async_publ_mqtt()
assert stream.new_data['inverter'] == True


@@ -1,11 +1,10 @@
# test_with_pytest.py
import pytest
import asyncio
from app.src.modbus import Modbus
from app.src.infos import Infos, Register
from modbus import Modbus
from infos import Infos, Register
pytest_plugins = ('pytest_asyncio',)
# pytestmark = pytest.mark.asyncio(scope="module")
class ModbusTestHelper(Modbus):
def __init__(self):
@@ -32,7 +31,12 @@ def test_modbus_crc():
assert mb._Modbus__check_crc(b'\x01\x06\x20\x08\x00\x00\x03\xc8')
assert 0x5c75 == mb._Modbus__calc_crc(b'\x01\x03\x08\x01\x2c\x00\x2c\x02\x2c\x2c\x46')
msg = b'\x01\x03\x28\x51'
msg += b'\x0e\x08\xd3\x00\x29\x13\x87\x00\x3e\x00\x00\x01\x2c\x03\xb4\x00'
msg += b'\x08\x00\x00\x00\x00\x01\x59\x01\x21\x03\xe6\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\xe6\xef'
assert 0 == mb._Modbus__calc_crc(msg)
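# a Modbus RTU frame that already contains its (little-endian) CRC yields a CRC of 0
# when the checksum is recomputed over the complete frame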
def test_build_modbus_pdu():
'''Check building and sending a MODBUS RTU'''
mb = ModbusTestHelper()
@@ -71,11 +75,12 @@ def test_recv_resp_crc_err():
mb.req_pend = True
mb.last_addr = 1
mb.last_fcode = 3
mb.last_reg == 0x300e
mb.last_len == 2
mb.last_reg = 0x300e
mb.last_len = 2
mb.set_node_id('test')
# check matching response, but with CRC error
call = 0
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf3', 'test'):
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf3'):
call += 1
assert mb.err == 1
assert 0 == call
@@ -91,12 +96,13 @@ def test_recv_resp_invalid_addr():
# simulate a transmitted request
mb.last_addr = 1
mb.last_fcode = 3
mb.last_reg == 0x300e
mb.last_len == 2
mb.last_reg = 0x300e
mb.last_len = 2
mb.set_node_id('test')
# check not matching response, with wrong server addr
call = 0
for key, update in mb.recv_resp(mb.db, b'\x02\x03\x04\x01\x2c\x00\x46\x88\xf4', 'test'):
for key, update in mb.recv_resp(mb.db, b'\x02\x03\x04\x01\x2c\x00\x46\x88\xf4'):
call += 1
assert mb.err == 2
assert 0 == call
@@ -116,7 +122,8 @@ def test_recv_recv_fcode():
# check not matching response, with wrong function code
call = 0
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf4', 'test'):
mb.set_node_id('test')
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf4'):
call += 1
assert mb.err == 3
@@ -138,7 +145,8 @@ def test_recv_resp_len():
# check not matching response, with wrong data length
call = 0
for key, update, _ in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf4', 'test'):
mb.set_node_id('test')
for key, update, _ in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf4'):
call += 1
assert mb.err == 4
@@ -157,7 +165,8 @@ def test_recv_unexpect_resp():
# check unexpected response, which must be dropped
call = 0
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf4', 'test'):
mb.set_node_id('test')
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf4'):
call += 1
assert mb.err == 5
@@ -173,8 +182,9 @@ def test_parse_resp():
assert mb.req_pend
call = 0
exp_result = ['V0.0.212', 4.4, 0.7, 0.7, 30]
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8', 'test'):
mb.set_node_id('test')
exp_result = ['V0.0.2C', 4.4, 0.7, 0.7, 30]
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8'):
if key == 'grid':
assert update == True
elif key == 'inverter':
@@ -222,8 +232,9 @@ def test_queue2():
assert mb.send_calls == 1
assert mb.pdu == b'\x01\x030\x07\x00\x06{\t'
call = 0
exp_result = ['V0.0.212', 4.4, 0.7, 0.7, 30]
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8', 'test'):
mb.set_node_id('test')
exp_result = ['V0.0.2C', 4.4, 0.7, 0.7, 30]
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8'):
if key == 'grid':
assert update == True
elif key == 'inverter':
@@ -241,14 +252,14 @@ def test_queue2():
assert mb.send_calls == 2
assert mb.pdu == b'\x01\x06\x20\x08\x00\x04\x02\x0b'
for key, update, val in mb.recv_resp(mb.db, b'\x01\x06\x20\x08\x00\x04\x02\x0b', 'test'):
pass
for key, update, val in mb.recv_resp(mb.db, b'\x01\x06\x20\x08\x00\x04\x02\x0b'):
pass # call generator mb.recv_resp()
assert mb.que.qsize() == 0
assert mb.send_calls == 3
assert mb.pdu == b'\x01\x030\x07\x00\x06{\t'
call = 0
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8', 'test'):
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8'):
call += 1
assert 0 == mb.err
assert 5 == call
@@ -272,8 +283,9 @@ def test_queue3():
assert mb.recv_responses == 0
call = 0
exp_result = ['V0.0.212', 4.4, 0.7, 0.7, 30]
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8', 'test'):
mb.set_node_id('test')
exp_result = ['V0.0.2C', 4.4, 0.7, 0.7, 30]
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8'):
if key == 'grid':
assert update == True
elif key == 'inverter':
@@ -292,8 +304,8 @@ def test_queue3():
assert mb.send_calls == 2
assert mb.pdu == b'\x01\x06\x20\x08\x00\x04\x02\x0b'
for key, update, val in mb.recv_resp(mb.db, b'\x01\x06\x20\x08\x00\x04\x02\x0b', 'test'):
pass
for key, update, val in mb.recv_resp(mb.db, b'\x01\x06\x20\x08\x00\x04\x02\x0b'):
pass # no code in loop is OK; calling the generator is the purpose
assert 0 == mb.err
assert mb.recv_responses == 2
@@ -301,7 +313,7 @@ def test_queue3():
assert mb.send_calls == 3
assert mb.pdu == b'\x01\x030\x07\x00\x06{\t'
call = 0
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8', 'test'):
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x0c\x01\x2c\x00\x2c\x00\x2c\x00\x46\x00\x46\x00\x46\x32\xc8'):
call += 1
assert 0 == mb.err
assert mb.recv_responses == 2
@@ -359,22 +371,34 @@ async def test_timeout():
assert mb.retry_cnt == 0
assert mb.send_calls == 4
# assert mb.counter == {}
def test_recv_unknown_data():
'''Receive a response with an unknown register'''
mb = ModbusTestHelper()
assert 0x9000 not in mb.map
mb.map[0x9000] = {'reg': Register.TEST_REG1, 'fmt': '!H', 'ratio': 1}
assert 0x9000 not in mb.mb_reg_mapping
mb.mb_reg_mapping[0x9000] = {'reg': Register.TEST_REG1, 'fmt': '!H', 'ratio': 1}
mb.build_msg(1,3,0x9000,2)
# check a matching response that carries the unknown register; nothing may be forwarded
call = 0
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf4', 'test'):
mb.set_node_id('test')
for key, update, val in mb.recv_resp(mb.db, b'\x01\x03\x04\x01\x2c\x00\x46\xbb\xf4'):
call += 1
assert mb.err == 0
assert 0 == call
assert not mb.req_pend
del mb.map[0x9000]
del mb.mb_reg_mapping[0x9000]
def test_close():
'''Check queue handling for build_msg() calls'''
mb = ModbusTestHelper()
mb.build_msg(1,3,0x3007,6)
mb.build_msg(1,6,0x2008,4)
assert mb.que.qsize() == 1
mb.build_msg(1,3,0x3007,6)
assert mb.que.qsize() == 2
assert mb.que.empty() == False
mb.close()
assert mb.que.qsize() == 0
assert mb.que.empty() == True


@@ -0,0 +1,388 @@
# test_with_pytest.py
import pytest
import asyncio
from aiomqtt import MqttCodeError
from mock import patch
from enum import Enum
from singleton import Singleton
from cnf.config import Config
from infos import Infos
from mqtt import Mqtt
from inverter_base import InverterBase
from messages import Message, State
from proxy import Proxy
from modbus_tcp import ModbusConn, ModbusTcp
pytest_plugins = ('pytest_asyncio',)
# initialize the proxy statistics
Infos.static_init()
@pytest.fixture(scope="module", autouse=True)
def module_init():
Singleton._instances.clear()
yield
@pytest.fixture(scope="module")
def test_port():
return 1883
@pytest.fixture(scope="module")
def test_hostname():
# if getenv("GITHUB_ACTIONS") == "true":
# return 'mqtt'
# else:
return 'test.mosquitto.org'
@pytest.fixture
def config_conn(test_hostname, test_port):
Config.act_config = {
'mqtt':{
'host': test_hostname,
'port': test_port,
'user': '',
'passwd': ''
},
'ha':{
'auto_conf_prefix': 'homeassistant',
'discovery_prefix': 'homeassistant',
'entity_prefix': 'tsun',
'proxy_node_id': 'test_1',
'proxy_unique_id': ''
},
'solarman':{
'host': 'access1.solarmanpv.com',
'port': 10000
},
'inverters':{
'allow_all': True,
"R170000000000001":{
'node_id': 'inv_1'
},
"Y170000000000001":{
'node_id': 'inv_2',
'monitor_sn': 2000000000,
'modbus_polling': True,
'suggested_area': "",
'sensor_list': 0x2b0,
'client_mode':{
'host': '192.168.0.1',
'port': 8899,
'forward': True
}
}
}
}
class FakeReader():
RD_TEST_0_BYTES = 1
RD_TEST_TIMEOUT = 2
RD_TEST_13_BYTES = 3
RD_TEST_SW_EXCEPT = 4
RD_TEST_OS_ERROR = 5
def __init__(self):
self.on_recv = asyncio.Event()
self.test = self.RD_TEST_0_BYTES
async def read(self, max_len: int):
print(f'fakeReader test: {self.test}')
await self.on_recv.wait()
if self.test == self.RD_TEST_0_BYTES:
return b''
elif self.test == self.RD_TEST_13_BYTES:
print('fakeReader return 13 bytes')
self.test = self.RD_TEST_0_BYTES
return b'test-data-req'
elif self.test == self.RD_TEST_TIMEOUT:
raise TimeoutError
elif self.test == self.RD_TEST_SW_EXCEPT:
self.test = self.RD_TEST_0_BYTES
self.unknown_var += 1
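# the undefined attribute above raises AttributeError and exercises the software-exception path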
elif self.test == self.RD_TEST_OS_ERROR:
self.test = self.RD_TEST_0_BYTES
raise ConnectionRefusedError
def feed_eof(self):
return
class FakeWriter():
def __init__(self, conn='remote.intern'):
self.conn = conn
self.closing = False
def write(self, buf: bytes):
return
async def drain(self):
await asyncio.sleep(0)
def get_extra_info(self, sel: str):
if sel == 'peername':
return self.conn
elif sel == 'sockname':
return 'sock:1234'
assert False
def is_closing(self):
return self.closing
def close(self):
self.closing = True
async def wait_closed(self):
await asyncio.sleep(0)
@pytest.fixture
def patch_open():
async def new_conn(conn):
await asyncio.sleep(0)
return FakeReader(), FakeWriter(conn)
def new_open(host: str, port: int):
return new_conn(f'{host}:{port}')
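# the fake connection stores 'host:port' as the peer name, so tests can assert
# which address ModbusConn tried to reach (see the stream.addr check below)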
with patch.object(asyncio, 'open_connection', new_open) as conn:
yield conn
@pytest.fixture
def patch_open_timeout():
def new_open(host: str, port: int):
raise TimeoutError
with patch.object(asyncio, 'open_connection', new_open) as conn:
yield conn
@pytest.fixture
def patch_open_value_error():
def new_open(host: str, port: int):
raise ValueError
with patch.object(asyncio, 'open_connection', new_open) as conn:
yield conn
@pytest.fixture
def patch_open_conn_abort():
def new_open(host: str, port: int):
raise ConnectionAbortedError
with patch.object(asyncio, 'open_connection', new_open) as conn:
yield conn
@pytest.fixture
def patch_no_mqtt():
with patch.object(Mqtt, 'publish') as conn:
yield conn
@pytest.fixture
def patch_mqtt_err():
def new_publish(self, key, data):
raise MqttCodeError(None)
with patch.object(Mqtt, 'publish', new_publish) as conn:
yield conn
@pytest.fixture
def patch_mqtt_except():
def new_publish(self, key, data):
raise ValueError("Test")
with patch.object(Mqtt, 'publish', new_publish) as conn:
yield conn
@pytest.mark.asyncio
async def test_modbus_conn(config_conn, patch_open):
_ = config_conn
_ = patch_open
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
async with ModbusConn('test.local', 1234) as inverter:
stream = inverter.local.stream
assert stream.node_id == 'G3P'
assert stream.addr == ('test.local:1234')
assert type(stream.ifc._reader) is FakeReader
assert type(stream.ifc._writer) is FakeWriter
assert Infos.stat['proxy']['Inverter_Cnt'] == 1
del inverter
for _ in InverterBase:
assert False
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
@pytest.mark.asyncio
async def test_modbus_no_cnf():
_ = config_conn
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
loop = asyncio.get_event_loop()
ModbusTcp(loop)
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
@pytest.mark.asyncio
async def test_modbus_timeout(config_conn, patch_open_timeout):
_ = config_conn
_ = patch_open_timeout
assert asyncio.get_running_loop()
Proxy.class_init()
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
loop = asyncio.get_event_loop()
ModbusTcp(loop)
await asyncio.sleep(0.01)
for m in Message:
if (m.node_id == 'inv_2'):
assert False
await asyncio.sleep(0.01)
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
@pytest.mark.asyncio
async def test_modbus_value_err(config_conn, patch_open_value_error):
_ = config_conn
_ = patch_open_value_error
assert asyncio.get_running_loop()
Proxy.class_init()
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
loop = asyncio.get_event_loop()
ModbusTcp(loop)
await asyncio.sleep(0.01)
for m in Message:
if (m.node_id == 'inv_2'):
assert False
await asyncio.sleep(0.01)
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
@pytest.mark.asyncio
async def test_modbus_conn_abort(config_conn, patch_open_conn_abort):
_ = config_conn
_ = patch_open_conn_abort
assert asyncio.get_running_loop()
Proxy.class_init()
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
loop = asyncio.get_event_loop()
ModbusTcp(loop)
await asyncio.sleep(0.01)
for m in Message:
if (m.node_id == 'inv_2'):
assert False
await asyncio.sleep(0.01)
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
@pytest.mark.asyncio
async def test_modbus_cnf2(config_conn, patch_no_mqtt, patch_open):
_ = config_conn
_ = patch_open
_ = patch_no_mqtt
assert asyncio.get_running_loop()
Proxy.class_init()
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
ModbusTcp(asyncio.get_event_loop())
await asyncio.sleep(0.01)
test = 0
for m in Message:
if (m.node_id == 'inv_2'):
test += 1
assert Infos.stat['proxy']['Inverter_Cnt'] == 1
m.shutdown_started = True
m.ifc._reader.on_recv.set()
del m
assert 1 == test
await asyncio.sleep(0.01)
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
@pytest.mark.asyncio
async def test_modbus_cnf3(config_conn, patch_no_mqtt, patch_open):
_ = config_conn
_ = patch_open
_ = patch_no_mqtt
assert asyncio.get_running_loop()
Proxy.class_init()
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
ModbusTcp(asyncio.get_event_loop(), tim_restart= 0)
await asyncio.sleep(0.01)
test = 0
for m in Message:
if (m.node_id == 'inv_2'):
assert Infos.stat['proxy']['Inverter_Cnt'] == 1
test += 1
if test == 1:
m.shutdown_started = False
m.ifc._reader.on_recv.set()
await asyncio.sleep(0.1)
assert m.state == State.closed
await asyncio.sleep(0.1)
else:
m.shutdown_started = True
m.ifc._reader.on_recv.set()
del m
assert 2 == test
await asyncio.sleep(0.01)
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
@pytest.mark.asyncio
async def test_mqtt_err(config_conn, patch_mqtt_err, patch_open):
_ = config_conn
_ = patch_open
_ = patch_mqtt_err
assert asyncio.get_running_loop()
Proxy.class_init()
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
ModbusTcp(asyncio.get_event_loop(), tim_restart= 0)
await asyncio.sleep(0.01)
test = 0
for m in Message:
if (m.node_id == 'inv_2'):
assert Infos.stat['proxy']['Inverter_Cnt'] == 1
test += 1
if test == 1:
m.shutdown_started = False
m.ifc._reader.on_recv.set()
await asyncio.sleep(0.1)
assert m.state == State.closed
await asyncio.sleep(0.1)
await asyncio.sleep(0.1)
else:
m.shutdown_started = True
m.ifc._reader.on_recv.set()
del m
await asyncio.sleep(0.01)
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
@pytest.mark.asyncio
async def test_mqtt_except(config_conn, patch_mqtt_except, patch_open):
_ = config_conn
_ = patch_open
_ = patch_mqtt_except
assert asyncio.get_running_loop()
Proxy.class_init()
assert Infos.stat['proxy']['Inverter_Cnt'] == 0
ModbusTcp(asyncio.get_event_loop(), tim_restart= 0)
await asyncio.sleep(0.01)
test = 0
for m in Message:
if (m.node_id == 'inv_2'):
assert Infos.stat['proxy']['Inverter_Cnt'] == 1
test += 1
if test == 1:
m.shutdown_started = False
m.ifc._reader.on_recv.set()
await asyncio.sleep(0.1)
assert m.state == State.closed
await asyncio.sleep(0.1)
else:
m.shutdown_started = True
m.ifc._reader.on_recv.set()
del m
await asyncio.sleep(0.01)
assert Infos.stat['proxy']['Inverter_Cnt'] == 0

app/tests/test_mqtt.py

@@ -0,0 +1,268 @@
# test_with_pytest.py
import pytest
import asyncio
import aiomqtt
import logging
from mock import patch, Mock
from async_stream import AsyncIfcImpl
from singleton import Singleton
from mqtt import Mqtt
from modbus import Modbus
from gen3plus.solarman_v5 import SolarmanV5
from cnf.config import Config
NO_MOSQUITTO_TEST = False
'''disable all tests with connections to test.mosquitto.org'''
pytest_plugins = ('pytest_asyncio',)
@pytest.fixture(scope="module", autouse=True)
def module_init():
Singleton._instances.clear()
yield
@pytest.fixture(scope="module")
def test_port():
return 1883
@pytest.fixture(scope="module")
def test_hostname():
# if getenv("GITHUB_ACTIONS") == "true":
# return 'mqtt'
# else:
return 'test.mosquitto.org'
@pytest.fixture
def config_mqtt_conn(test_hostname, test_port):
Config.act_config = {'mqtt':{'host': test_hostname, 'port': test_port, 'user': '', 'passwd': ''},
'ha':{'auto_conf_prefix': 'homeassistant','discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun'}
}
@pytest.fixture
def config_no_conn(test_port):
Config.act_config = {'mqtt':{'host': "", 'port': test_port, 'user': '', 'passwd': ''},
'ha':{'auto_conf_prefix': 'homeassistant','discovery_prefix': 'homeassistant', 'entity_prefix': 'tsun'}
}
@pytest.fixture
def spy_at_cmd():
conn = SolarmanV5(None, ('test.local', 1234), server_side=True, client_mode= False, ifc=AsyncIfcImpl())
conn.node_id = 'inv_2/'
with patch.object(conn, 'send_at_cmd', wraps=conn.send_at_cmd) as wrapped_conn:
yield wrapped_conn
conn.close()
@pytest.fixture
def spy_modbus_cmd():
conn = SolarmanV5(None, ('test.local', 1234), server_side=True, client_mode= False, ifc=AsyncIfcImpl())
conn.node_id = 'inv_1/'
with patch.object(conn, 'send_modbus_cmd', wraps=conn.send_modbus_cmd) as wrapped_conn:
yield wrapped_conn
conn.close()
@pytest.fixture
def spy_modbus_cmd_client():
conn = SolarmanV5(None, ('test.local', 1234), server_side=False, client_mode= False, ifc=AsyncIfcImpl())
conn.node_id = 'inv_1/'
with patch.object(conn, 'send_modbus_cmd', wraps=conn.send_modbus_cmd) as wrapped_conn:
yield wrapped_conn
conn.close()
def test_native_client(test_hostname, test_port):
"""Sanity check: Make sure the paho-mqtt client can connect to the test
MQTT server. Otherwise the test sets NO_MOSQUITTO_TEST to True and disables
all test cases which depend on the test.mosquitto.org server
"""
global NO_MOSQUITTO_TEST
if NO_MOSQUITTO_TEST:
pytest.skip('skipping, since Mosquitto is not reliable at the moment')
import paho.mqtt.client as mqtt
import threading
c = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
c.loop_start()
try:
# Just make sure the client connects successfully
on_connect = threading.Event()
c.on_connect = Mock(side_effect=lambda *_: on_connect.set())
c.connect_async(test_hostname, test_port)
if not on_connect.wait(3):
NO_MOSQUITTO_TEST = True # skip all mosquitto tests
pytest.skip('skipping, since Mosquitto is not reliable at the moment')
finally:
c.loop_stop()
@pytest.mark.asyncio
async def test_mqtt_connection(config_mqtt_conn):
if NO_MOSQUITTO_TEST:
pytest.skip('skipping, since Mosquitto is not reliable at the moment')
_ = config_mqtt_conn
assert asyncio.get_running_loop()
on_connect = asyncio.Event()
async def cb():
on_connect.set()
try:
m = Mqtt(cb)
assert m.task
assert await asyncio.wait_for(on_connect.wait(), 5)
# await asyncio.sleep(1)
assert 0 == m.ha_restarts
await m.publish('homeassistant/status', 'online')
except TimeoutError:
assert False
finally:
await m.close()
await m.publish('homeassistant/status', 'online')
@pytest.mark.asyncio
async def test_ha_reconnect(config_mqtt_conn):
if NO_MOSQUITTO_TEST:
pytest.skip('skipping, since Mosquitto is not reliable at the moment')
_ = config_mqtt_conn
on_connect = asyncio.Event()
async def cb():
on_connect.set()
try:
m = Mqtt(cb)
msg = aiomqtt.Message(topic= 'homeassistant/status', payload= b'offline', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
assert not on_connect.is_set()
msg = aiomqtt.Message(topic= 'homeassistant/status', payload= b'online', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
assert on_connect.is_set()
finally:
await m.close()
@pytest.mark.asyncio
async def test_mqtt_no_config(config_no_conn):
_ = config_no_conn
assert asyncio.get_running_loop()
on_connect = asyncio.Event()
async def cb():
on_connect.set()
try:
m = Mqtt(cb)
assert m.task
await asyncio.sleep(0)
assert not on_connect.is_set()
try:
await m.publish('homeassistant/status', 'online')
assert False
except Exception:
pass
except TimeoutError:
assert False
finally:
await m.close()
@pytest.mark.asyncio
async def test_msg_dispatch(config_mqtt_conn, spy_modbus_cmd):
_ = config_mqtt_conn
spy = spy_modbus_cmd
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_1/rated_load', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x2008, 2, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'100', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 1024, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'50', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.WRITE_SINGLE_REG, 0x202c, 512, logging.INFO)
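# out_coeff is given in percent and apparently scaled to a 0..1024 register value
# (percent * 1024 / 100): 100 % -> 1024, 50 % -> 512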
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.READ_REGS, 0x3000, 10, logging.INFO)
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/modbus_read_inputs', payload= b'0x3000, 10', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with(Modbus.READ_INPUTS, 0x3000, 10, logging.INFO)
finally:
await m.close()
@pytest.mark.asyncio
async def test_msg_dispatch_err(config_mqtt_conn, spy_modbus_cmd):
_ = config_mqtt_conn
spy = spy_modbus_cmd
try:
m = Mqtt(None)
# test out of range param
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'-1', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_not_called()
# test unknown node_id
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_2/out_coeff', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_not_called()
# test invalid float param
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/out_coeff', payload= b'2, 3', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_not_called()
spy.reset_mock()
msg = aiomqtt.Message(topic= 'tsun/inv_1/modbus_read_regs', payload= b'0x3000, 10, 7', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_not_called()
finally:
await m.close()
@pytest.mark.asyncio
async def test_msg_ignore_client_conn(config_mqtt_conn, spy_modbus_cmd_client):
'''don't call the function if the connection is not in server mode'''
_ = config_mqtt_conn
spy = spy_modbus_cmd_client
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_1/rated_load', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_not_called()
finally:
await m.close()
@pytest.mark.asyncio
async def test_ignore_unknown_func(config_mqtt_conn):
'''don't dispatch for unknown function names'''
_ = config_mqtt_conn
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_1/rated_load', payload= b'2', qos= 0, retain = False, mid= 0, properties= None)
for _ in m.each_inverter(msg, 'unkown_fnc'):
assert False
finally:
await m.close()
@pytest.mark.asyncio
async def test_at_cmd_dispatch(config_mqtt_conn, spy_at_cmd):
_ = config_mqtt_conn
spy = spy_at_cmd
try:
m = Mqtt(None)
msg = aiomqtt.Message(topic= 'tsun/inv_2/at_cmd', payload= b'AT+', qos= 0, retain = False, mid= 0, properties= None)
await m.dispatch_msg(msg)
spy.assert_awaited_once_with('AT+')
finally:
await m.close()

app/tests/test_proxy.py

@@ -0,0 +1,91 @@
# test_with_pytest.py
import pytest
import asyncio
import aiomqtt
import logging
from mock import patch, Mock
from singleton import Singleton
from proxy import Proxy
from mqtt import Mqtt
from gen3plus.solarman_v5 import SolarmanV5
from cnf.config import Config
pytest_plugins = ('pytest_asyncio',)
@pytest.fixture(scope="module", autouse=True)
def module_init():
def new_init(cls, cb_mqtt_is_up):
pass # empty test method
Singleton._instances.clear()
with patch.object(Mqtt, '__init__', new_init):
yield
@pytest.fixture(scope="module")
def test_port():
return 1883
@pytest.fixture(scope="module")
def test_hostname():
# if getenv("GITHUB_ACTIONS") == "true":
# return 'mqtt'
# else:
return 'test.mosquitto.org'
@pytest.fixture
def config_conn(test_hostname, test_port):
Config.act_config = {
'mqtt':{
'host': test_hostname,
'port': test_port,
'user': '',
'passwd': ''
},
'ha':{
'auto_conf_prefix': 'homeassistant',
'discovery_prefix': 'homeassistant',
'entity_prefix': 'tsun',
'proxy_node_id': 'test_1',
'proxy_unique_id': ''
},
'inverters': {
'allow_all': True,
"R170000000000001":{
'node_id': 'inv_1'
}
}
}
@pytest.mark.asyncio
async def test_inverter_cb(config_conn):
_ = config_conn
with patch.object(Proxy, '_cb_mqtt_is_up', wraps=Proxy._cb_mqtt_is_up) as spy:
print('call Proxy.class_init')
Proxy.class_init()
assert 'homeassistant/' == Proxy.discovery_prfx
assert 'tsun/' == Proxy.entity_prfx
assert 'test_1/' == Proxy.proxy_node_id
await Proxy._cb_mqtt_is_up()
spy.assert_called_once()
@pytest.mark.asyncio
async def test_mqtt_is_up(config_conn):
_ = config_conn
with patch.object(Mqtt, 'publish') as spy:
Proxy.class_init()
await Proxy._cb_mqtt_is_up()
spy.assert_called()
@pytest.mark.asyncio
async def test_mqtt_proxy_statt_invalid(config_conn):
_ = config_conn
with patch.object(Mqtt, 'publish') as spy:
Proxy.class_init()
await Proxy._async_publ_mqtt_proxy_stat('InValId_kEy')
spy.assert_not_called()

app/tests/test_server.py

@@ -0,0 +1,32 @@
# test_with_pytest.py
import pytest
import logging
import os
from mock import patch
from server import get_log_level
def test_get_log_level():
with patch.dict(os.environ, {}):
log_lvl = get_log_level()
assert log_lvl == None
with patch.dict(os.environ, {'LOG_LVL': 'DEBUG'}):
log_lvl = get_log_level()
assert log_lvl == logging.DEBUG
with patch.dict(os.environ, {'LOG_LVL': 'INFO'}):
log_lvl = get_log_level()
assert log_lvl == logging.INFO
with patch.dict(os.environ, {'LOG_LVL': 'WARN'}):
log_lvl = get_log_level()
assert log_lvl == logging.WARNING
with patch.dict(os.environ, {'LOG_LVL': 'ERROR'}):
log_lvl = get_log_level()
assert log_lvl == logging.ERROR
with patch.dict(os.environ, {'LOG_LVL': 'UNKNOWN'}):
log_lvl = get_log_level()
assert log_lvl == None


@@ -0,0 +1,19 @@
# test_with_pytest.py
import pytest
from singleton import Singleton
class Example(metaclass=Singleton):
def __init__(self):
pass # is a dummy test class
def test_singleton_metaclass():
Singleton._instances.clear()
a = Example()
assert 1 == len(Singleton._instances)
b = Example()
assert 1 == len(Singleton._instances)
assert a is b
del a
assert 1 == len(Singleton._instances)
del b
assert 0 == len(Singleton._instances)

File diff suppressed because it is too large


@@ -0,0 +1,233 @@
import pytest
import asyncio
from async_stream import AsyncIfcImpl, StreamPtr
from gen3plus.solarman_v5 import SolarmanV5, SolarmanBase
from gen3plus.solarman_emu import SolarmanEmu
from infos import Infos, Register
from test_solarman import FakeIfc, FakeInverter, MemoryStream, get_sn_int, get_sn, correct_checksum, config_tsun_inv1, msg_modbus_rsp
from test_infos_g3p import str_test_ip, bytes_test_ip
timestamp = 0x3224c8bc
class InvStream(MemoryStream):
def __init__(self, msg=b''):
super().__init__(msg)
def _emu_timestamp(self):
return timestamp
class CldStream(SolarmanEmu):
def __init__(self, inv: InvStream, inverter=FakeInverter()):
_ifc = FakeIfc()
_ifc.remote.stream = inv
super().__init__(inverter, ('test.local', 1234), _ifc, server_side=False, client_mode=False)
self.__msg = b''
self.__msg_len = 0
self.__offs = 0
self.msg_count = 0
self.msg_recvd = []
def _emu_timestamp(self):
return timestamp
def append_msg(self, msg):
self.__msg += msg
self.__msg_len += len(msg)
def _read(self) -> int:
copied_bytes = 0
try:
if (self.__offs < self.__msg_len):
self.ifc.rx_fifo += self.__msg[self.__offs:]
copied_bytes = self.__msg_len - self.__offs
self.__offs = self.__msg_len
except Exception:
pass # ignore exceptions here
return copied_bytes
def _SolarmanBase__flush_recv_msg(self) -> None:
self.msg_recvd.append(
{
'control': self.control,
'seq': str(self.seq),
'data_len': self.data_len
}
)
super()._SolarmanBase__flush_recv_msg()
self.msg_count += 1
@pytest.fixture
def device_ind_msg(bytes_test_ip): # 0x4110
msg = b'\xa5\xd4\x00\x10\x41\x00\x01' +get_sn() +b'\x02\xbc\xc8\x24\x32'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x05\x3c\x78\x01\x00\x01\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00' + bytes_test_ip
msg += b'\x0f\x00\x01\xb0'
msg += b'\x02\x0f\x00\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xfe\xfe\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def inverter_ind_msg(): # 0x4210
msg = b'\xa5\x99\x01\x10\x42\x00\x01' +get_sn() +b'\x01\xb0\x02\xbc\xc8'
msg += b'\x24\x32\x3c\x00\x00\x00\xa0\x47\xe4\x33\x01\x00\x03\x08\x00\x00'
msg += b'\x59\x31\x37\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x31'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x01\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x40\x10\x08\xc8\x00\x49\x13\x8d\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00'
msg += b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00'
msg += b'\x04\x00\x00\x01\xff\xff\x00\x01\x00\x06\x00\x68\x00\x68\x05\x00'
msg += b'\x09\xcd\x07\xb6\x13\x9c\x13\x24\x00\x01\x07\xae\x04\x0f\x00\x41'
msg += b'\x00\x0f\x0a\x64\x0a\x64\x00\x06\x00\x06\x09\xf6\x12\x8c\x12\x8c'
msg += b'\x00\x10\x00\x10\x14\x52\x14\x52\x00\x10\x00\x10\x01\x51\x00\x05'
msg += b'\x00\x00\x00\x01\x13\x9c\x0f\xa0\x00\x4e\x00\x66\x03\xe8\x04\x00'
msg += b'\x09\xce\x07\xa8\x13\x9c\x13\x26\x00\x00\x00\x00\x00\x00\x00\x00'
msg += b'\x00\x00\x00\x00\x04\x00\x04\x00\x00\x00\x00\x00\xff\xff\x00\x00'
msg += b'\x00\x00\x00\x00'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def inverter_rsp_msg(): # 0x1210
msg = b'\xa5\x0a\x00\x10\x12\x02\02' +get_sn() +b'\x01\x01'
msg += b'\x00\x00\x00\x00'
msg += b'\x3c\x00\x00\x00'
msg += correct_checksum(msg)
msg += b'\x15'
return msg
@pytest.fixture
def heartbeat_ind():
msg = b'\xa5\x01\x00\x10G\x00\x01\x00\x00\x00\x00\x00Y\x15'
return msg
def test_emu_init_close():
# create an emulator stream pair and close it again
# to verify that construction and teardown work
# without raising errors
inv = InvStream()
cld = CldStream(inv)
cld.close()
@pytest.mark.asyncio
async def test_emu_start(config_tsun_inv1, msg_modbus_rsp, str_test_ip, device_ind_msg):
_ = config_tsun_inv1
assert asyncio.get_running_loop()
inv = InvStream(msg_modbus_rsp)
assert asyncio.get_running_loop() == inv.mb_timer.loop
await inv.send_start_cmd(get_sn_int(), str_test_ip, True, inv.mb_first_timeout)
inv.read() # read complete msg, and dispatch msg
assert not inv.header_valid # must be invalid, since msg was handled and buffer flushed
assert inv.msg_count == 1
assert inv.control == 0x1510
cld = CldStream(inv)
cld.ifc.update_header_cb(inv.ifc.fwd_fifo.peek())
assert inv.ifc.fwd_fifo.peek() == device_ind_msg
cld.close()
def test_snd_hb(config_tsun_inv1, heartbeat_ind):
_ = config_tsun_inv1
inv = InvStream()
cld = CldStream(inv)
# await inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
cld.send_heartbeat_cb(0)
assert cld.ifc.tx_fifo.peek() == heartbeat_ind
cld.close()
@pytest.mark.asyncio
async def test_snd_inv_data(config_tsun_inv1, inverter_ind_msg, inverter_rsp_msg):
_ = config_tsun_inv1
inv = InvStream()
inv.db.set_db_def_value(Register.INVERTER_STATUS, 1)
inv.db.set_db_def_value(Register.DETECT_STATUS_1, 2)
inv.db.set_db_def_value(Register.VERSION, 'V4.0.10')
inv.db.set_db_def_value(Register.GRID_VOLTAGE, 224.8)
inv.db.set_db_def_value(Register.GRID_CURRENT, 0.73)
inv.db.set_db_def_value(Register.GRID_FREQUENCY, 50.05)
inv.db.set_db_def_value(Register.PROD_COMPL_TYPE, 6)
assert asyncio.get_running_loop() == inv.mb_timer.loop
await inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.db.set_db_def_value(Register.DATA_UP_INTERVAL, 17) # set test value
cld = CldStream(inv)
cld.time_ofs = 0x33e447a0
cld.last_sync = cld._emu_timestamp() - 60
cld.pkt_cnt = 0x802
assert cld.data_up_inv == 17 # check test value
cld.data_up_inv = 0.1 # speedup test first data msg
cld._init_new_client_conn()
cld.data_up_inv = 0.5 # timeout for second data msg
await asyncio.sleep(0.2)
assert cld.ifc.tx_fifo.get() == inverter_ind_msg
cld.append_msg(inverter_rsp_msg)
cld.read() # read complete msg, and dispatch msg
assert not cld.header_valid # must be invalid, since msg was handled and buffer flushed
assert cld.msg_count == 1
assert cld.header_len==11
assert cld.snr == 2070233889
assert cld.unique_id == '2070233889'
assert cld.msg_recvd[0]['control']==0x1210
assert cld.msg_recvd[0]['seq']=='02:02'
assert cld.msg_recvd[0]['data_len']==0x0a
assert '02b0' == cld.db.get_db_value(Register.SENSOR_LIST, None)
assert cld.db.stat['proxy']['Unknown_Msg'] == 0
cld.close()
@pytest.mark.asyncio
async def test_rcv_invalid(config_tsun_inv1, inverter_ind_msg, inverter_rsp_msg):
_ = config_tsun_inv1
inv = InvStream()
assert asyncio.get_running_loop() == inv.mb_timer.loop
await inv.send_start_cmd(get_sn_int(), str_test_ip, False, inv.mb_first_timeout)
inv.db.set_db_def_value(Register.DATA_UP_INTERVAL, 17) # set test value
cld = CldStream(inv)
cld._init_new_client_conn()
cld.append_msg(inverter_ind_msg)
cld.read() # read complete msg, and dispatch msg
assert not cld.header_valid # must be invalid, since msg was handled and buffer flushed
assert cld.msg_count == 1
assert cld.header_len==11
assert cld.snr == 2070233889
assert cld.unique_id == '2070233889'
assert cld.msg_recvd[0]['control']==0x4210
assert cld.msg_recvd[0]['seq']=='00:01'
assert cld.msg_recvd[0]['data_len']==0x199
assert '02b0' == cld.db.get_db_value(Register.SENSOR_LIST, None)
assert cld.db.stat['proxy']['Unknown_Msg'] == 1
cld.close()

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long


@@ -1,26 +0,0 @@
// {type:sequence}
// {generate:true}
[Inverter]ContactInd>[Proxy]
[Proxy]-[note: store Contact Info in proxy{bg:cornsilk}]
[Proxy]ContactRsp (Ok).>[Inverter]
[Inverter]getTimeReq>[Proxy]
[Proxy]ContactInd>[Cloud]
[Cloud]ContactRsp (Ok).>[Proxy]
[Proxy]getTimeReq>[Cloud]
[Cloud]TimeRsp (time).>[Proxy]
[Proxy]TimeRsp (time).>[Inverter]
[Inverter]-[note: set clock in inverter{bg:cornsilk}]
[Inverter]DataInd (ts:=time)>[Proxy]
[Proxy]DataRsp>[Inverter]
[Proxy]DataInd (ts)>>[Cloud]
[Proxy]DataInd>>[MQTT-Broker]
[Cloud]DataRsp>>[Proxy]
[Inverter]DataInd (ts:=time)>[Proxy]
[Proxy]DataRsp>[Inverter]
[Proxy]DataInd (ts)>>[Cloud]
[Proxy]DataInd>>[MQTT-Broker]
[Cloud]DataRsp>>[Proxy]


@@ -83,7 +83,7 @@ services:
- ${PROJECT_DIR:-./}tsun-proxy/log:/home/tsun-proxy/log
- ${PROJECT_DIR:-./}tsun-proxy/config:/home/tsun-proxy/config
healthcheck:
test: wget --no-verbose --tries=1 --spider http://localhost:8127/-/healthy || exit 1
test: wget --no-verbose --tries=1 --spider http://127.0.0.1:8127/-/healthy || exit 1
interval: 10s
timeout: 3s
networks:

ha_addons/.gitignore

@@ -0,0 +1,4 @@
.data.json
config.yaml
apparmor.txt
README.md

ha_addons/Makefile

@@ -0,0 +1,203 @@
#!make
include ../.env
.PHONY: debug dev build clean rootfs repro rc rel
SHELL = /bin/sh
JINJA = jinja2
IMAGE = tsun-gen3-addon
# Source folders for building the local add-on
SRC=../app
SRC_PROXY=$(SRC)/src
CNF_PROXY=$(SRC)/config
# Target folders for building the local add-on and the docker container
ADDON_PATH = ha_addon
DST=$(ADDON_PATH)/rootfs
DST_PROXY=$(DST)/home/proxy
# base directory of the add-on repo for installing the add-on git repos
INST_BASE=../../ha-addons
# Template folder for build the config.yaml variants
TEMPL=templates
# helper variable STAGE determines the target to build
STAGE=dev
debug : STAGE=debug
rc : STAGE=rc
rel : STAGE=rel
export BUILD_DATE := ${shell date -Iminutes}
BUILD_ID := ${shell date +'%y%m%d%H%M'}
rel : BUILD_ID=
VERSION := $(shell cat $(SRC)/.version)
export MAJOR := $(shell echo $(VERSION) | cut -f1 -d.)
PUBLIC_URL := $(shell echo $(PUBLIC_CONTAINER_REGISTRY) | cut -f1 -d/)
PUBLIC_USER :=$(shell echo $(PUBLIC_CONTAINER_REGISTRY) | cut -f2 -d/)
build: local_add_on
dev debug: local_add_on
@echo version: $(VERSION) build-date: $(BUILD_DATE) image: $(PRIVAT_CONTAINER_REGISTRY)$(IMAGE)
export VERSION=$(VERSION)-$@-$(BUILD_ID) && \
export IMAGE=$(PRIVAT_CONTAINER_REGISTRY)$(IMAGE) && \
docker buildx bake -f docker-bake.hcl $@
rc: local_add_on
@[ "${RC}" ] || ( echo ">> RC is not set"; exit 1 )
@echo version: $(VERSION) build-date: $(BUILD_DATE) image: $(PUBLIC_CONTAINER_REGISTRY)$(IMAGE)
@echo login at $(PUBLIC_URL) as $(PUBLIC_USER)
@DO_LOGIN="$(shell echo $(PUBLIC_CR_KEY) | docker login $(PUBLIC_URL) -u $(PUBLIC_USER) --password-stdin)"
export VERSION=$(VERSION)-$@$(RC) && \
export IMAGE=$(PUBLIC_CONTAINER_REGISTRY)$(IMAGE) && \
docker buildx bake -f docker-bake.hcl $@
rel: local_add_on
@echo version: $(VERSION) build-date: $(BUILD_DATE) image: $(PUBLIC_CONTAINER_REGISTRY)$(IMAGE)
@echo login at $(PUBLIC_URL) as $(PUBLIC_USER)
@DO_LOGIN="$(shell echo $(PUBLIC_CR_KEY) | docker login $(PUBLIC_URL) -u $(PUBLIC_USER) --password-stdin)"
export VERSION=$(VERSION)-$@ && \
export IMAGE=$(PUBLIC_CONTAINER_REGISTRY)$(IMAGE) && \
docker buildx bake -f docker-bake.hcl $@
clean:
rm -r -f $(DST_PROXY)
rm -f $(DST)/requirements.txt
rm -f $(ADDON_PATH)/config.yaml
rm -f $(TEMPL)/.data.json
docker logout ghcr.io
#############
# Build the local add-on with a rootfs and config.yaml
# The rootfs is needed to build the add-on Docker container
#
local_add_on: rootfs $(ADDON_PATH)/config.yaml $(ADDON_PATH)/apparmor.txt $(ADDON_PATH)/README.md
# collect source files
SRC_FILES := $(wildcard $(SRC_PROXY)/*.py)\
$(wildcard $(SRC_PROXY)/*.ini)\
$(wildcard $(SRC_PROXY)/cnf/*.py)\
$(wildcard $(SRC_PROXY)/cnf/*.toml)\
$(wildcard $(SRC_PROXY)/gen3/*.py)\
$(wildcard $(SRC_PROXY)/gen3plus/*.py)
CNF_FILES := $(wildcard $(CNF_PROXY)/*.toml)
# determine destination files
TARGET_FILES = $(SRC_FILES:$(SRC_PROXY)/%=$(DST_PROXY)/%)
CONFIG_FILES = $(CNF_FILES:$(CNF_PROXY)/%=$(DST_PROXY)/%)
rootfs: $(TARGET_FILES) $(CONFIG_FILES) $(DST)/requirements.txt
$(CONFIG_FILES): $(DST_PROXY)/% : $(CNF_PROXY)/%
@echo Copy $< to $@
@mkdir -p $(@D)
@cp $< $@
$(TARGET_FILES): $(DST_PROXY)/% : $(SRC_PROXY)/%
@echo Copy $< to $@
@mkdir -p $(@D)
@cp $< $@
$(DST)/requirements.txt : $(SRC)/requirements.txt
@echo Copy $< to $@
@cp $< $@
$(ADDON_PATH)/%.yaml: $(TEMPL)/%.jinja $(TEMPL)/.data.json
$(JINJA) --strict -D AppVersion=$(VERSION) -D BuildID=$(BUILD_ID) --format=json $^ -o $@
$(ADDON_PATH)/%.txt: $(TEMPL)/%.jinja $(TEMPL)/.data.json
$(JINJA) --strict --format=json $^ -o $@
$(ADDON_PATH)/%.md: $(TEMPL)/%.jinja $(TEMPL)/.data.json
$(JINJA) --strict --format=json $^ -o $@
# build a common data.json file from the STAGE-dependent source files
# don't touch the destination if the checksums of src and dst are equal
$(TEMPL)/.data.json: FORCE
rsync --checksum $(TEMPL)/$(STAGE)_data.json $@
FORCE : ;
#############
# Build repository for Home Assistant Add-ons
#
repro_files = DOCS.md icon.png logo.png translations/de.yaml translations/en.yaml rootfs/run.sh
repro_root = CHANGELOG.md LICENSE.md
repro_templates = config.yaml
repro_apparmor = apparmor.txt
repro_readme = README.md
repro_subdirs = translations rootfs
repro_vers = debug dev rc rel
repro_all_files := $(foreach dir,$(repro_vers), $(foreach file,$(repro_files),$(INST_BASE)/ha_addon_$(dir)/$(file)))
repro_root_files := $(foreach dir,$(repro_vers), $(foreach file,$(repro_root),$(INST_BASE)/ha_addon_$(dir)/$(file)))
repro_all_templates := $(foreach dir,$(repro_vers), $(foreach file,$(repro_templates),$(INST_BASE)/ha_addon_$(dir)/$(file)))
repro_all_apparmor := $(foreach dir,$(repro_vers), $(foreach file,$(repro_apparmor),$(INST_BASE)/ha_addon_$(dir)/$(file)))
repro_all_readme := $(foreach dir,$(repro_vers), $(foreach file,$(repro_readme),$(INST_BASE)/ha_addon_$(dir)/$(file)))
repro_all_subdirs := $(foreach dir,$(repro_vers), $(foreach file,$(repro_subdirs),$(INST_BASE)/ha_addon_$(dir)/$(file)))
debug: $(foreach file,$(repro_subdirs),$(INST_BASE)/ha_addon_debug/$(file)) \
$(foreach file,$(repro_templates),$(INST_BASE)/ha_addon_debug/$(file)) \
$(foreach file,$(repro_apparmor),$(INST_BASE)/ha_addon_debug/$(file)) \
$(foreach file,$(repro_readme),$(INST_BASE)/ha_addon_debug/$(file)) \
$(foreach file,$(repro_files),$(INST_BASE)/ha_addon_debug/$(file)) \
$(foreach file,$(repro_root),$(INST_BASE)/ha_addon_debug/$(file))
dev: $(foreach file,$(repro_subdirs),$(INST_BASE)/ha_addon_dev/$(file)) \
$(foreach file,$(repro_templates),$(INST_BASE)/ha_addon_dev/$(file)) \
$(foreach file,$(repro_apparmor),$(INST_BASE)/ha_addon_dev/$(file)) \
$(foreach file,$(repro_readme),$(INST_BASE)/ha_addon_dev/$(file)) \
$(foreach file,$(repro_files),$(INST_BASE)/ha_addon_dev/$(file)) \
$(foreach file,$(repro_root),$(INST_BASE)/ha_addon_dev/$(file))
rc: $(foreach file,$(repro_subdirs),$(INST_BASE)/ha_addon_rc/$(file)) \
$(foreach file,$(repro_templates),$(INST_BASE)/ha_addon_rc/$(file)) \
$(foreach file,$(repro_apparmor),$(INST_BASE)/ha_addon_rc/$(file)) \
$(foreach file,$(repro_readme),$(INST_BASE)/ha_addon_rc/$(file)) \
$(foreach file,$(repro_files),$(INST_BASE)/ha_addon_rc/$(file)) \
$(foreach file,$(repro_root),$(INST_BASE)/ha_addon_rc/$(file))
rel: $(foreach file,$(repro_subdirs),$(INST_BASE)/ha_addon_rel/$(file)) \
$(foreach file,$(repro_templates),$(INST_BASE)/ha_addon_rel/$(file)) \
$(foreach file,$(repro_apparmor),$(INST_BASE)/ha_addon_rel/$(file)) \
$(foreach file,$(repro_readme),$(INST_BASE)/ha_addon_rel/$(file)) \
$(foreach file,$(repro_files),$(INST_BASE)/ha_addon_rel/$(file)) \
$(foreach file,$(repro_root),$(INST_BASE)/ha_addon_rel/$(file))
$(repro_all_subdirs) :
mkdir -p $@
$(repro_all_templates) : $(INST_BASE)/ha_addon_%/config.yaml: $(TEMPL)/config.jinja $(TEMPL)/%_data.json $(SRC)/.version FORCE
$(JINJA) --strict -D AppVersion=$(VERSION)-$* -D BuildID=$(BUILD_ID) $< $(filter %.json,$^) -o $@
$(repro_all_apparmor) : $(INST_BASE)/ha_addon_%/apparmor.txt: $(TEMPL)/apparmor.jinja $(TEMPL)/%_data.json
$(JINJA) --strict $< $(filter %.json,$^) -o $@
$(repro_all_readme) : $(INST_BASE)/ha_addon_%/README.md: $(TEMPL)/README.jinja $(TEMPL)/%_data.json
$(JINJA) --strict $< $(filter %.json,$^) -o $@
$(filter $(INST_BASE)/ha_addon_debug/%,$(repro_root_files)) : $(INST_BASE)/ha_addon_debug/% : ../%
cp $< $@
$(filter $(INST_BASE)/ha_addon_dev/%,$(repro_root_files)) : $(INST_BASE)/ha_addon_dev/% : ../%
cp $< $@
$(filter $(INST_BASE)/ha_addon_rc/%,$(repro_root_files)) : $(INST_BASE)/ha_addon_rc/% : ../%
cp $< $@
$(filter $(INST_BASE)/ha_addon_rel/%,$(repro_root_files)) : $(INST_BASE)/ha_addon_rel/% : ../%
cp $< $@
$(filter $(INST_BASE)/ha_addon_debug/%,$(repro_all_files)) : $(INST_BASE)/ha_addon_debug/% : ha_addon/%
cp $< $@
$(filter $(INST_BASE)/ha_addon_dev/%,$(repro_all_files)) : $(INST_BASE)/ha_addon_dev/% : ha_addon/%
cp $< $@
$(filter $(INST_BASE)/ha_addon_rc/%,$(repro_all_files)) : $(INST_BASE)/ha_addon_rc/% : ha_addon/%
cp $< $@
$(filter $(INST_BASE)/ha_addon_rel/%,$(repro_all_files)) : $(INST_BASE)/ha_addon_rel/% : ha_addon/%
cp $< $@

ha_addons/docker-bake.hcl

@@ -0,0 +1,100 @@
variable "IMAGE" {
default = "tsun-gen3-addon"
}
variable "VERSION" {
default = "0.0.0"
}
variable "MAJOR" {
default = "0"
}
variable "BUILD_DATE" {
default = "dev"
}
variable "BRANCH" {
default = ""
}
variable "DESCRIPTION" {
default = "This proxy enables a reliable connection between TSUN third generation inverters (eg. TSOL MS600, MS800, MS2000) and an MQTT broker to integrate the inverter into typical home automations."
}
target "_common" {
context = "ha_addon"
dockerfile = "Dockerfile"
args = {
VERSION = "${VERSION}"
environment = "production"
}
attest = [
"type =provenance,mode=max",
"type =sbom,generator=docker/scout-sbom-indexer:latest"
]
annotations = [
"index:io.hass.version=${VERSION}",
"index:io.hass.type=addon",
"index:io.hass.arch=armhf|aarch64|i386|amd64",
"index:org.opencontainers.image.title=TSUN-Proxy",
"index:org.opencontainers.image.authors=Stefan Allius",
"index:org.opencontainers.image.created=${BUILD_DATE}",
"index:org.opencontainers.image.version=${VERSION}",
"index:org.opencontainers.image.revision=${BRANCH}",
"index:org.opencontainers.image.description=${DESCRIPTION}",
"index:org.opencontainers.image.licenses=BSD-3-Clause",
"index:org.opencontainers.image.source=https://github.com/s-allius/tsun-gen3-proxy/ha_addons/ha_addon"
]
labels = {
"io.hass.version" = "${VERSION}"
"io.hass.type" = "addon"
"io.hass.arch" = "armhf|aarch64|i386|amd64"
"org.opencontainers.image.title" = "TSUN-Proxy"
"org.opencontainers.image.authors" = "Stefan Allius"
"org.opencontainers.image.created" = "${BUILD_DATE}"
"org.opencontainers.image.version" = "${VERSION}"
"org.opencontainers.image.revision" = "${BRANCH}"
"org.opencontainers.image.description" = "${DESCRIPTION}"
"org.opencontainers.image.licenses" = "BSD-3-Clause"
"org.opencontainers.image.source" = "https://github.com/s-allius/tsun-gen3-proxy/ha_addonsha_addon"
}
output = [
"type=image,push=true"
]
no-cache = false
platforms = ["linux/amd64", "linux/arm64", "linux/arm/v7"]
}
target "_debug" {
args = {
LOG_LVL = "DEBUG"
environment = "dev"
}
}
target "_prod" {
args = {
}
}
target "debug" {
inherits = ["_common", "_debug"]
tags = ["${IMAGE}:debug", "${IMAGE}:${VERSION}"]
}
target "dev" {
inherits = ["_common"]
tags = ["${IMAGE}:dev", "${IMAGE}:${VERSION}"]
}
target "preview" {
inherits = ["_common", "_prod"]
tags = ["${IMAGE}:preview", "${IMAGE}:${VERSION}"]
}
target "rc" {
inherits = ["_common", "_prod"]
tags = ["${IMAGE}:rc", "${IMAGE}:${VERSION}"]
no-cache = true
}
target "rel" {
inherits = ["_common", "_prod"]
tags = ["${IMAGE}:latest", "${IMAGE}:${MAJOR}", "${IMAGE}:${VERSION}"]
no-cache = true
}

ha_addons/ha_addon/DOCS.md

@@ -0,0 +1,177 @@
# Home Assistant Add-on: TSUN Proxy
[TSUN Proxy][tsunproxy] enables a reliable connection between TSUN third generation
inverters and an MQTT broker. With the proxy, you can easily retrieve real-time values
such as power, current and daily energy and integrate the inverter into Home Assistant.
This works even without an internet connection.
The optional connection to the TSUN Cloud can be disabled!
## Pre-requisites
1. This Add-on requires an MQTT broker to work.
For a typical installation, we recommend the [Mosquitto add-on][Mosquitto] running on your Home Assistant.
2. To loop the proxy into the connection between the inverter and the TSUN Cloud,
you must adapt the DNS records within the network that your inverter uses. You need a mapping
from logger.talent-monitoring.com and/or iot.talent-monitoring.com to the IP address of your
Home Assistant.
This can be done, for example, by adding local DNS records to the [AdGuard Home Add-on][AdGuard]
(navigate to `Filters` on the AdGuard panel and add entries under `Custom filtering rules`, as sketched below).
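A minimal sketch of such custom filtering rules, assuming the hypothetical address `192.168.1.10` for your Home Assistant instance:
```txt
! hypothetical example - replace 192.168.1.10 with the IP address of your Home Assistant
||logger.talent-monitoring.com^$dnsrewrite=192.168.1.10
||iot.talent-monitoring.com^$dnsrewrite=192.168.1.10
```
Any other local DNS server (e.g. dnsmasq or Pi-hole) works as well, as long as both hostnames resolve to your Home Assistant IP inside the inverter's network.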
## Installation
The installation of this add-on is pretty straightforward and no different from
installing any other Home Assistant add-on.
1. Add the repository URL to the Home Assistant add-on store
[![Add repository on my Home Assistant][repository-badge]][repository-url]
2. Reload the add-on store page
3. Click the "Install" button to install the add-on.
4. Add your inverter configuration to the add-on configuration
5. Start the "TSUN-Proxy" add-on
6. Check the logs of the "TSUN-Proxy" add-on to see if everything went well.
_Please note that the add-on is pre-configured to connect to
Home Assistant's default MQTT broker. There is no need to configure any MQTT parameters
if you're running the Mosquitto add-on. The Home Assistant communication as well as the
TSUN Cloud URLs and ports are also pre-configured._
This automatic handling of the TSUN Cloud and the MQTT broker differs from the
[TSUN Proxy official documentation][tsunproxy]. The official documentation
states that `mqtt.host`, `mqtt.port`, `mqtt.user`, `mqtt.passwd`, `solarman.host`,
`solarman.port`, `tsun.host`, `tsun.port` and the Home Assistant options are required.
For the add-on, however, this isn't needed.
## Configuration
**Note**: _Remember to restart the add-on when the configuration is changed._
Example add-on configuration after installation:
```yaml
inverters:
- serial: R17E000000000000
node_id: PV-Garage
suggested_area: Garage
modbus_polling: false
pv1.manufacturer: Shinefar
pv1.type: SF-M18/144550
pv2.manufacturer: Shinefar
pv2.type: SF-M18/144550
```
**Note**: _This is just an example, you need to replace the values with your own!_
Example add-on configuration for GEN3PLUS inverters:
```yaml
inverters:
- serial: Y17000000000000
monitor_sn: 2000000000
node_id: inv_1
suggested_area: Roof
modbus_polling: true
client_mode.host: 192.168.x.x
client_mode.port: 8899
client_mode.forward: true
pv1.manufacturer: Shinefar
pv1.type: SF-M18/144550
pv2.manufacturer: Shinefar
pv2.type: SF-M18/144550
pv3.manufacturer: Shinefar
pv3.type: SF-M18/144550
pv4.manufacturer: Shinefar
pv4.type: SF-M18/144550
```
Example add-on configuration for GEN3PLUS energy storage devices:
```yaml
batteries:
- serial: 4100000000000000
monitor_sn: 3000000000
node_id: bat_1
suggested_area: Garage
modbus_polling: false
pv1.manufacturer: Shinefar
pv1.type: SF-M18/144550
pv2.manufacturer: Shinefar
pv2.type: SF-M18/144550
```
**Note**: _This is just an example, you need to replace the values with your own!_
More information about the configuration can be found on the [configuration details page][configdetails].
## MQTT settings
By default, this add-on requires no `mqtt` configuration from the user. **This is not an error!**
You are free to set these values if you want to override the defaults (see the sketch below),
but in general usage that should not be needed and is not recommended for this add-on.
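A minimal sketch of such an override, assuming a hypothetical broker address and placeholder credentials (all four values are examples, not defaults):
```yaml
# hypothetical example values - replace with your own broker settings
mqtt.host: 192.168.1.20
mqtt.port: 1883
mqtt.user: my-mqtt-user
mqtt.passwd: my-mqtt-password
```
Leaving these keys unset keeps the automatic connection to the Home Assistant MQTT broker described above.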
## Changelog & Releases
This repository keeps a change log using [GitHub's releases][releases]
functionality.
Releases are based on [Semantic Versioning][semver], and use the format
of `MAJOR.MINOR.PATCH`. In a nutshell, the version will be incremented
based on the following:
- `MAJOR`: Incompatible or major changes.
- `MINOR`: Backwards-compatible new features and enhancements.
- `PATCH`: Backwards-compatible bugfixes and package updates.
## Support
Got questions?
You have several options to get them answered:
- The Discussions section on [GitHub][discussions].
- The [Home Assistant Discord chat server][discord-ha] for general Home
Assistant discussions and questions.
You could also [open an issue here][issue] on GitHub.
## Authors & contributors
The original setup of this repository is by [Stefan Allius][author].
We're very happy to receive contributions to this project! You can get started by reading [CONTRIBUTING.md][contribute].
## License
This project is licensed under the [BSD 3-clause License][bsd].
Note the aiomqtt library used is based on the paho-mqtt library, which has a dual license.
One of the licenses is the so-called [Eclipse Distribution License v1.0.][eclipse]
It is almost word-for-word identical to the BSD 3-clause License. The only differences are:
- One use of "COPYRIGHT OWNER" (EDL) instead of "COPYRIGHT HOLDER" (BSD)
- One use of "Eclipse Foundation, Inc." (EDL) instead of "copyright holder" (BSD)
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
[tsunproxy]: https://github.com/s-allius/tsun-gen3-proxy
[discussions]: https://github.com/s-allius/tsun-gen3-proxy/discussions
[author]: https://github.com/s-allius
[discord-ha]: https://discord.gg/c5DvZ4e
[issue]: https://github.com/s-allius/tsun-gen3-proxy/issues
[releases]: https://github.com/s-allius/tsun-gen3-proxy/releases
[contribute]: https://github.com/s-allius/tsun-gen3-proxy/blob/main/CONTRIBUTING.md
[semver]: http://semver.org/spec/v2.0.0.html
[bsd]: https://opensource.org/licenses/BSD-3-Clause
[eclipse]: https://www.eclipse.org/org/documents/edl-v10.php
[Mosquitto]: https://github.com/home-assistant/addons/blob/master/mosquitto/DOCS.md
[AdGuard]: https://github.com/hassio-addons/addon-adguard-home
[repository-badge]: https://img.shields.io/badge/Add%20repository%20to%20my-Home%20Assistant-41BDF5?logo=home-assistant&style=for-the-badge
[repository-url]: https://my.home-assistant.io/redirect/supervisor_add_addon_repository/?repository_url=https%3A%2F%2Fgithub.com%2Fs-allius%2Fha-addons
[configdetails]: https://github.com/s-allius/tsun-gen3-proxy/wiki/Configuration-addon

ha_addons/ha_addon/Dockerfile

@@ -0,0 +1,92 @@
############################################################################
#
# TSUN Proxy
# Homeassistant Add-on
#
# based on https://github.com/s-allius/tsun-gen3-proxy/tree/main
#
############################################################################
######################
# 1 Build Base Image #
######################
ARG BUILD_FROM="ghcr.io/hassio-addons/base:17.2.3"
# hadolint ignore=DL3006
FROM $BUILD_FROM AS base
# Install Python, pip and virtual environment tools
RUN apk add --no-cache python3=3.12.10-r0 py3-pip=24.3.1-r0 && \
python -m venv /opt/venv && \
. /opt/venv/bin/activate
ENV PATH="/opt/venv/bin:$PATH"
#######################
# 2 Build wheel #
#######################
FROM base AS builder
COPY rootfs/requirements.txt /root/
RUN apk add --no-cache build-base=0.5-r3 && \
python -m pip install --no-cache-dir wheel==0.45.1 && \
python -OO -m pip wheel --no-cache-dir --wheel-dir=/root/wheels -r /root/requirements.txt
#######################
# 3 Build runtime #
#######################
FROM base AS runtime
ARG SERVICE_NAME
ARG VERSION
ARG LOG_LVL=INFO
ENV LOG_LVL=$LOG_LVL
ENV SERVICE_NAME=${SERVICE_NAME}
#######################
# 4 Install libraries #
#######################
# install the requirements from the wheel packages built in the builder stage
# and uninstall the Python packaging tools and the Alpine package manager to reduce the attack surface
COPY --from=builder /root/wheels /root/wheels
RUN python -m pip install --no-cache-dir --no-cache --no-index /root/wheels/* && \
rm -rf /root/wheels && \
python -m pip uninstall --yes wheel pip && \
apk --purge del apk-tools
#######################
# 5 copy data #
#######################
COPY rootfs/ /
#######################
# 6 run app #
#######################
# make run.sh executable
RUN chmod a+x /run.sh && \
echo ${VERSION} > /proxy-version.txt
# command to run on container start
CMD [ "/run.sh" ]
#######################

ha_addons/ha_addon/icon.png (binary file not shown)

ha_addons/ha_addon/logo.png (binary file not shown)


@@ -0,0 +1,33 @@
#!/usr/bin/with-contenv bashio
echo "Add-on environment started"
echo "check for Home Assistant MQTT"
MQTT_HOST=$(bashio::services mqtt "host")
MQTT_PORT=$(bashio::services mqtt "port")
MQTT_USER=$(bashio::services mqtt "username")
MQTT_PASSWORD=$(bashio::services mqtt "password")
# log whether an MQTT broker was found; export its settings if so
if [ -z "$MQTT_HOST" ]; then
echo "MQTT not found"
else
echo "MQTT found"
export MQTT_HOST
export MQTT_PORT
export MQTT_USER
export MQTT_PASSWORD
fi
# Create folder for log and config files
mkdir -p /homeassistant/tsun-proxy/logs
cd /home/proxy || exit
export VERSION=$(cat /proxy-version.txt)
echo "Start Proxyserver..."
python3 server.py --json_config=/data/options.json --log_path=/homeassistant/tsun-proxy/logs/ --config_path=/homeassistant/tsun-proxy/ --log_backups=2


@@ -0,0 +1,108 @@
---
configuration:
inverters:
name: Wechselrichter
description: >+
Für jeden Wechselrichter muss die Seriennummer des Wechselrichters einer MQTT
Definition zugeordnet werden. Dazu wird der entsprechende Konfigurationsblock mit der
16-stellige Seriennummer gestartet, so dass alle nachfolgenden Parameter diesem
Wechselrichter zugeordnet sind.
Weitere wechselrichterspezifische Parameter (z.B. Polling Mode) können im
Konfigurationsblock gesetzt werden.
Die Seriennummern der GEN3 Wechselrichter beginnen mit `R17` oder `R47` und die der GEN3PLUS
Wechselrichter mit `Y17` oder `Y47`!
Siehe Beispielkonfiguration im Dokumentations-Tab
batteries:
name: Batterien
description: >+
Für jeden Energiespeicher muss die Seriennummer des Speichers einer MQTT
Definition zugeordnet werden. Dazu wird der entsprechende Konfigurationsblock mit der
16-stellige Seriennummer gestartet, so dass alle nachfolgenden Parameter diesem
Speicher zugeordnet sind.
Weitere speicherspezifische Parameter (z.B. Polling Mode) können im
Konfigurationsblock gesetzt werden.
Die Seriennummern der GEN3PLUS Batteriespeicher beginnen mit `410`!
Siehe Beispielkonfiguration im Dokumentations-Tab
tsun.enabled:
name: Verbindung zur TSUN Cloud - nur für GEN3-Wechselrichter
description: >+
Schaltet die Verbindung zur TSUN Cloud ein/aus.
Diese Verbindung ist erforderlich, wenn Sie Daten an die TSUN Cloud senden möchten,
z.B. um die TSUN-Apps zu nutzen oder Firmware-Updates zu erhalten.
ein => normaler Proxy-Betrieb.
aus => Der Wechselrichter wird vom Internet isoliert.
solarman.enabled:
name: Verbindung zur Solarman/TSUN Cloud - nur für GEN3PLUS Wechselrichter
description: >+
Schaltet die Verbindung zur Solarman oder TSUN Cloud ein/aus.
Diese Verbindung ist erforderlich, wenn Sie Daten an die Cloud senden möchten,
z.B. um die Solarman App oder TSUN Smart App zu nutzen oder Firmware-Updates zu erhalten.
ein => normaler Proxy-Betrieb.
aus => Die GEN3PLUS Geräte werden vom Internet isoliert.
inverters.allow_all:
name: Erlaube Verbindungen von sämtlichen Wechselrichtern
description: >-
Der Proxy akzeptiert normalerweise nur Verbindungen von konfigurierten Wechselrichtern.
Schalten Sie dies für Testzwecke und unbekannte Seriennummern ein.
mqtt.host:
name: MQTT Broker Host
description: >-
Hostname oder IP-Adresse des MQTT-Brokers. Wenn nicht gesetzt, versucht das Addon, eine Verbindung zum Home Assistant MQTT-Broker herzustellen.
mqtt.port:
name: MQTT Broker Port
description: >-
Port des MQTT-Brokers. Wenn nicht gesetzt, versucht das Addon, eine Verbindung zum Home Assistant MQTT-Broker herzustellen.
mqtt.user:
name: MQTT Broker Benutzer
description: >-
Benutzer für den MQTT-Broker. Wenn nicht gesetzt, versucht das Addon, eine Verbindung zum Home Assistant MQTT-Broker herzustellen.
mqtt.passwd:
name: MQTT Broker Passwort
description: >-
Passwort für den MQTT-Broker. Wenn nicht gesetzt, versucht das Addon, eine Verbindung zum Home Assistant MQTT-Broker herzustellen.
ha.auto_conf_prefix:
name: MQTT-Präfix für das Abonnieren von Home Assistant-Statusaktualisierungen
ha.discovery_prefix:
name: MQTT-Präfix für das discovery topic
ha.entity_prefix:
name: MQTT-Themenpräfix für die Veröffentlichung von Wechselrichterwerten
ha.proxy_node_id:
name: MQTT-Knoten-ID für die proxy_node_id
ha.proxy_unique_id:
name: MQTT-eindeutige ID zur Identifizierung einer Proxy-Instanz
tsun.host:
name: TSUN Cloud Host
description: >-
Hostname oder IP-Adresse der TSUN-Cloud. Wenn nicht gesetzt, versucht das Addon, eine Verbindung zur Cloud logger.talent-monitoring.com herzustellen.
solarman.host:
name: Solarman Cloud Host
description: >-
Hostname oder IP-Adresse der Solarman-Cloud. Wenn nicht gesetzt, versucht das Addon, eine Verbindung zur Cloud iot.talent-monitoring.com herzustellen.
gen3plus.at_acl.tsun.allow:
name: TSUN GEN3PLUS ACL allow
description: >-
Liste erlaubter AT-Befehle für TSUN GEN3PLUS
gen3plus.at_acl.tsun.block:
name: TSUN GEN3PLUS ACL block
description: >-
Liste blockierter AT-Befehle für TSUN GEN3PLUS
gen3plus.at_acl.mqtt.allow:
name: MQTT GEN3PLUS ACL allow
description: >-
Liste erlaubter MQTT-Befehle für GEN3PLUS
gen3plus.at_acl.mqtt.block:
name: MQTT GEN3PLUS ACL block
description: >-
Liste blockierter MQTT-Befehle für GEN3PLUS
network:
5005/tcp: listening Port für TSUN GEN3 Wechselrichter
10000/tcp: listening Port für TSUN GEN3PLUS Wechselrichter


@@ -0,0 +1,109 @@
---
configuration:
inverters:
name: Inverters
description: >+
For each inverter, the serial number of the inverter must be mapped to an MQTT
definition. To do this, the corresponding configuration block is started with the
16-digit serial number so that all subsequent parameters are assigned
to this inverter. Further inverter-specific parameters (e.g. polling mode) can be set
in the configuration block.
The serial numbers of all GEN3 inverters start with `R17` or `R47` and those of the GEN3PLUS
inverters with `Y17` or `Y47`!
For reference see the example configuration in the Documentation tab
batteries:
name: Energy Storages
description: >+
For each energy storage device, the serial number of the storage device must be
assigned to an MQTT definition. To do this, the corresponding configuration block
is started with the 16-digit serial number so that all subsequent parameters are
assigned to this energy storage. Further inverter-specific parameters (e.g. polling
mode) can be set in the configuration block.
The serial numbers of all GEN3PLUS energy storages start with 410!
For reference see example configuration in Documentation Tab
tsun.enabled:
name: Connection to TSUN Cloud - for GEN3 inverters only
description: >+
Switches the connection to the TSUN Cloud on/off.
This connection is only required if you want to send data to the TSUN Cloud,
e.g. to use the TSUN apps or to receive firmware updates.
on => normal proxy operation.
off => the inverter becomes isolated from the Internet.
solarman.enabled:
name: Connection to Solarman/TSUN Cloud - for GEN3PLUS inverters only
description: >+
Switches the connection to the Solarman or TSUN Cloud on/off.
This connection is only required if you want to send data to the cloud,
e.g. to use the Solarman app, the TSUN Smart app or to receive firmware updates.
on => normal proxy operation.
off => the GEN3PLUS devices become isolated from the Internet.
inverters.allow_all:
name: Allow connections from all inverters
description: >-
The proxy usually only accepts connections from configured inverters.
Switch this on for test purposes and unknown serial numbers.
mqtt.host:
name: MQTT Broker Host
description: >-
Hostname or IP address of the MQTT broker. If not set, the add-on will try to connect to the Home Assistant MQTT broker.
mqtt.port:
name: MQTT Broker Port
description: >-
Port of the MQTT broker. If not set, the add-on will try to connect to the Home Assistant MQTT broker.
mqtt.user:
name: MQTT Broker User
description: >-
User for the MQTT broker. If not set, the add-on will try to connect to the Home Assistant MQTT broker.
mqtt.passwd:
name: MQTT Broker Password
description: >-
Password for the MQTT broker. If not set, the add-on will try to connect to the Home Assistant MQTT broker.
ha.auto_conf_prefix:
name: MQTT prefix for subscribing to Home Assistant status updates
ha.discovery_prefix:
name: MQTT prefix for the discovery topic
ha.entity_prefix:
name: MQTT topic prefix for publishing inverter values
ha.proxy_node_id:
name: MQTT node id for the proxy_node_id
ha.proxy_unique_id:
name: MQTT unique id to identify a proxy instance
tsun.host:
name: TSUN Cloud Host
description: >-
Hostname or IP address of the TSUN cloud. If not set, the add-on will try to connect to the cloud
at logger.talent-monitoring.com.
solarman.host:
name: Solarman Cloud Host
description: >-
Hostname or IP address of the Solarman cloud. If not set, the add-on will try to connect to the cloud
at iot.talent-monitoring.com.
gen3plus.at_acl.tsun.allow:
name: TSUN GEN3PLUS ACL allow
description: >-
List of allowed TSUN GEN3PLUS AT commands
gen3plus.at_acl.tsun.block:
name: TSUN GEN3PLUS ACL block
description: >-
List of blocked TSUN GEN3PLUS AT commands
gen3plus.at_acl.mqtt.allow:
name: MQTT GEN3PLUS ACL allow
description: >-
List of allowed MQTT GEN3PLUS commands
gen3plus.at_acl.mqtt.block:
name: MQTT GEN3PLUS ACL block
description: >-
List of blocked MQTT GEN3PLUS commands
network:
5005/tcp: listening Port for TSUN GEN3 Devices
10000/tcp: listening Port for TSUN GEN3PLUS Devices


@@ -0,0 +1,3 @@
name: TSUN-Proxy
url: https://github.com/s-allius/tsun-gen3-proxy/ha_addons
maintainer: Stefan Allius


@@ -0,0 +1,21 @@
# Home Assistant Add-on: {{name}}
{{readme_descr}}
## Features
- Supports TSUN GEN3 PLUS inverters: TSOL-MS2000, MS1800 and MS1600
- Supports TSUN GEN3 PLUS batteries: TSOL-DC1000 (from version 0.13)
- Supports TSUN GEN3 inverters: TSOL-MS3000, MS800, MS700, MS600, MS400, MS350 and MS300
- `Home-Assistant` auto-discovery support
- `MODBUS` support via MQTT topics
- `AT-Command` support via MQTT topics (GEN3PLUS only)
- Faster DataUp interval: measurement data is sent to the MQTT broker every minute
- Self-sufficient island operation without internet
- Security-Features:
- control access via `AT-commands`
## About
This add-on and the TSUN Proxy are not related to the company TSUN. They are a private initiative that aims to connect TSUN inverters and storage systems to an MQTT broker. There is no support and no warranty from TSUN.
{{readme_links}}


@@ -0,0 +1,52 @@
#include <tunables/global>
profile {{slug}} flags=(attach_disconnected,mediate_deleted) {
#include <abstractions/base>
# Capabilities
file,
signal (send) set=(kill,term,int,hup,cont),
# S6-Overlay
/init ix,
/bin/** ix,
/usr/bin/** ix,
/run/{s6,s6-rc*,service}/** ix,
/package/** ix,
/command/** ix,
/etc/services.d/** rwix,
/etc/cont-init.d/** rwix,
/etc/cont-finish.d/** rwix,
/run/{,**} rwk,
/dev/tty rw,
# Bashio
/usr/lib/bashio/** ix,
/tmp/** rwk,
# Access to options.json and other files within your addon
/data/** rw,
# Start new profile for service
/usr/bin/myprogram cx -> myprogram,
profile myprogram flags=(attach_disconnected,mediate_deleted) {
#include <abstractions/base>
# Receive signals from S6-Overlay
signal (receive) peer=*_{{slug}},
# Access to options.json and other files within your addon
/data/** rw,
# Access to mapped volumes specified in config.json
/share/** rw,
# Access required for service functionality
/usr/bin/myprogram r,
/bin/bash rix,
/bin/echo ix,
/etc/passwd r,
/dev/tty rw,
}
}

ha_addons/templates/config.jinja

@@ -0,0 +1,125 @@
name: {{name}}
description: {{description}}
version: {% if version is defined and version|length %} {{version}} {% elif BuildID is defined and BuildID|length %} {{AppVersion}}-{{BuildID}} {% else %} {{AppVersion}} {% endif %}
image: {{image}}
url: https://github.com/s-allius/tsun-gen3-proxy
slug: {{slug}}
advanced: {{advanced}}
stage: {{stage}}
init: false
arch:
- aarch64
- amd64
- armhf
- armv7
startup: services
homeassistant_api: true
map:
- type: addon_config
path: /homeassistant/tsun-proxy
read_only: False
services:
- mqtt:want
ports:
5005/tcp: 5005
10000/tcp: 10000
watchdog: "http://[HOST]:[PORT:8127]/-/healthy"
# Definition of the parameters in the configuration tab of the add-on.
# The parameters are available within the container as /data/options.json
# and should be picked up by the proxy - currently handled by a transfer script as a workaround
schema:
inverters:
- serial: match(^(R17|R47|Y17|Y47).{13}$)
monitor_sn: int?
node_id: str
suggested_area: str
modbus_polling: bool
client_mode.host: match(\b((25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\.){3}(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\b)?
client_mode.port: port?
client_mode.forward: bool?
modbus_scanning.start: int(0,65535)?
modbus_scanning.step: int(0,65535)?
modbus_scanning.bytes: int(1,80)?
pv1.manufacturer: str?
pv1.type: str?
pv2.manufacturer: str?
pv2.type: str?
pv3.manufacturer: str?
pv3.type: str?
pv4.manufacturer: str?
pv4.type: str?
pv5.manufacturer: str?
pv5.type: str?
pv6.manufacturer: str?
pv6.type: str?
tsun.enabled: bool
solarman.enabled: bool
inverters.allow_all: bool
batteries:
- serial: match(^(410).{13}$)
monitor_sn: int
node_id: str
suggested_area: str
modbus_polling: bool
client_mode.host: match(\b((25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\.){3}(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\b)?
client_mode.port: port?
client_mode.forward: bool?
pv1.manufacturer: str?
pv1.type: str?
pv2.manufacturer: str?
pv2.type: str?
# optional parameters
mqtt.host: str?
mqtt.port: port?
mqtt.user: str?
mqtt.passwd: password?
ha.auto_conf_prefix: str? # marks an optional configuration option -> no default value may be set under "options", though
ha.discovery_prefix: str? # ditto
ha.entity_prefix: str? # ditto
ha.proxy_node_id: str? # ditto
ha.proxy_unique_id: str? # ditto
tsun.host: str?
solarman.host: str?
gen3plus.at_acl.tsun.allow:
- str
gen3plus.at_acl.tsun.block:
- str?
gen3plus.at_acl.mqtt.allow:
- str
gen3plus.at_acl.mqtt.block:
- str?
# set default options for mandatory parameters
# for optional parameters do not define any default value in the options dictionary.
# If any default value is given, the option becomes a required value.
options:
inverters:
- serial: R17E000000000000
monitor_sn: 0
node_id: inv_1
suggested_area: Roof
modbus_polling: false
pv1.manufacturer: Shinefar
pv1.type: SF-M18/144550
pv2.manufacturer: Shinefar
pv2.type: SF-M18/144550
batteries:
- serial: 4100000000000000
monitor_sn: 0
node_id: bat_1
suggested_area: Garage
modbus_polling: false
pv1.manufacturer: Shinefar
pv1.type: SF-M18/144550
pv2.manufacturer: Shinefar
pv2.type: SF-M18/144550
tsun.enabled: true # set default
solarman.enabled: true # set default
inverters.allow_all: false # set default
gen3plus.at_acl.tsun.allow: ["AT+Z", "AT+UPURL", "AT+SUPDATE"]
gen3plus.at_acl.mqtt.allow: ["AT+"]


@@ -0,0 +1,11 @@
{
"name": "TSUN-Proxy (Debug)",
"description": "MQTT Proxy for TSUN Photovoltaic Inverters with Debug Logging",
"image": "docker.io/sallius/tsun-gen3-addon",
"slug": "tsun-proxy-debug",
"advanced": true,
"stage": "experimental",
"readme_descr": "This is a bleeding-edge version of the `TSUN Proxy` Add-On with debuging enabled by default.\n\nThe versions may be based on different feature branches and therefore the range of functions may change.\n\nIt is intended to be used to simulate special situations/problems and should only be used in consultation with the maintainer.\n\nFor production please use the stable version `TSUN Proxy`. If you are interested in a bleeding edge version, we offer the `TSUN Proxy (dev)` version.",
"readme_links": ""
}


@@ -0,0 +1,11 @@
{
"name": "TSUN-Proxy (Dev)",
"description": "MQTT Proxy for TSUN Photovoltaic Inverters",
"image": "docker.io/sallius/tsun-gen3-addon",
"slug": "tsun-proxy-dev",
"advanced": false,
"stage": "experimental",
"readme_descr": "This is a bleeding-edge version of the `TSUN Proxy` Add-On.\n\nThe versions may be based on different feature branches and therefore the range of functions may change.\n\nIt is intended for testing new functions or testing new devices that are to be supported with the next release.\nFor production, please use the stable version 'TSUN Proxy'.",
"readme_links": ""
}


@@ -0,0 +1,13 @@
{
"name": "TSUN-Proxy (Release Candidate)",
"description": "MQTT Proxy for TSUN Photovoltaic Inverters",
"version": "rc",
"image": "ghcr.io/s-allius/tsun-gen3-addon",
"slug": "tsun-proxy-rc",
"advanced": true,
"stage": "experimental",
"readme_descr": "This is a release candidate of the `TSUN Proxy` Add-On.\n\nIt is intended for testing the next release.\nFor production, please use the stable version 'TSUN Proxy'.",
"readme_links": ""
}

Some files were not shown because too many files have changed in this diff.