
Code coverage for SonarCloud #150

Merged · 20 commits · Aug 9, 2024
3 changes: 2 additions & 1 deletion .coveragerc
@@ -1,2 +1,3 @@
[run]
branch = True
branch = True
relative_files = True
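The `relative_files = True` addition matters because SonarCloud matches entries in `coverage.xml` against `sonar.sources` by path; with absolute paths recorded on the CI runner, no file would match. A minimal, self-contained sketch (stdlib only) of parsing the resulting `.coveragerc`:

```python
# Sketch: the .coveragerc after this change, parsed with stdlib configparser.
# relative_files=True makes coverage.py record paths like
# "app/src/async_stream.py" instead of runner-absolute paths, so the
# coverage report lines up with the sources SonarCloud analyzes.
from configparser import ConfigParser

COVERAGERC = """\
[run]
branch = True
relative_files = True
"""

cfg = ConfigParser()
cfg.read_string(COVERAGERC)
print(cfg.getboolean('run', 'branch'))          # True
print(cfg.getboolean('run', 'relative_files'))  # True
```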
22 changes: 19 additions & 3 deletions .github/workflows/python-app.yml
@@ -8,7 +8,7 @@ on:
branches: [ "main", "dev-*", "*/issue*" ]
paths-ignore:
- '**.md' # Do not build on *.md changes
- '**.yml' # Do not build on *.yml changes
# - '**.yml' # Do not build on *.yml changes
- '**.yaml' # Do not build on *.yaml changes
- '**.yuml' # Do not build on *.yuml changes
- '**.svg' # Do not build on *.svg changes
@@ -18,10 +18,11 @@ on:
- '**.dockerfile' # Do not build on *.dockerfile changes
- '**.sh' # Do not build on *.sh changes
pull_request:
branches: [ "main" ]
branches: [ "main", "dev-*" ]

permissions:
contents: read
pull-requests: read # allows SonarCloud to decorate PRs with analysis results

jobs:
build:
@@ -53,4 +54,19 @@ jobs:
flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
- name: Test with pytest
run: |
python -m pytest app
pip install pytest pytest-cov
#pytest app --doctest-modules --junitxml=junit/test-results.xml --cov=com --cov-report=xml --cov-report=html
python -m pytest app --cov=app/src --cov-report=xml
- name: Analyze with SonarCloud
uses: SonarSource/[email protected]
env:
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
with:
projectBaseDir: .
args:
-Dsonar.projectKey=s-allius_tsun-gen3-proxy
-Dsonar.organization=s-allius
-Dsonar.python.version=3.12
-Dsonar.python.coverage.reportPaths=coverage.xml
-Dsonar.tests=system_tests,app/tests
-Dsonar.source=app/src
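For reference, the same scanner settings could instead live in a `sonar-project.properties` file at the repository root; this is a sketch mirroring the `-D` flags above, not something this PR adds. Note that the documented key for the source directories is `sonar.sources` (plural), whereas the workflow above passes `sonar.source`:

```ini
# sonar-project.properties — equivalent of the inline -Dsonar.* args (sketch)
sonar.projectKey=s-allius_tsun-gen3-proxy
sonar.organization=s-allius
sonar.python.version=3.12
sonar.python.coverage.reportPaths=coverage.xml
sonar.tests=system_tests,app/tests
sonar.sources=app/src
```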
65 changes: 0 additions & 65 deletions .github/workflows/sonarcloud.yml

This file was deleted.

4 changes: 4 additions & 0 deletions .sonarlint/connectedMode.json
@@ -0,0 +1,4 @@
{
"sonarCloudOrganization": "s-allius",
"projectKey": "s-allius_tsun-gen3-proxy"
}
6 changes: 5 additions & 1 deletion .vscode/settings.json
@@ -11,5 +11,9 @@
"python.testing.pytestEnabled": true,
"flake8.args": [
"--extend-exclude=app/tests/*.py system_tests/*.py"
]
],
"sonarlint.connectedMode.project": {
"connectionId": "s-allius",
"projectKey": "s-allius_tsun-gen3-proxy"
}
}
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -7,6 +7,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## [unreleased]

- add SonarQube and code coverage support
- don't send MODBUS request when state is not up; adapt timeouts [#141](https://github.com/s-allius/tsun-gen3-proxy/issues/141)
- build multi arch images with sboms [#144](https://github.com/s-allius/tsun-gen3-proxy/issues/144)
- add timestamp to MQTT topics [#138](https://github.com/s-allius/tsun-gen3-proxy/issues/138)
4 changes: 2 additions & 2 deletions app/proxy.svg
4 changes: 2 additions & 2 deletions app/proxy.yuml
@@ -8,9 +8,9 @@
[IterRegistry||__iter__]^[Message|server_side:bool;header_valid:bool;header_len:unsigned;data_len:unsigned;unique_id;node_id;sug_area;_recv_buffer:bytearray;_send_buffer:bytearray;_forward_buffer:bytearray;db:Infos;new_data:list;state|_read():void<abstract>;close():void;inc_counter():void;dec_counter():void]
[Message]^[Talent|await_conn_resp_cnt;id_str;contact_name;contact_mail;db:InfosG3;mb:Modbus;switch|msg_contact_info();msg_ota_update();msg_get_time();msg_collector_data();msg_inverter_data();msg_unknown();;close()]
[Message]^[SolarmanV5|control;serial;snr;db:InfosG3P;mb:Modbus;switch|msg_unknown();;close()]
[Talent]^[ConnectionG3|remoteStream:ConnectionG3|healthy();close()]
[Talent]^[ConnectionG3|remote_stream:ConnectionG3|healthy();close()]
[Talent]has-1>[Modbus]
[SolarmanV5]^[ConnectionG3P|remoteStream:ConnectionG3P|healthy();close()]
[SolarmanV5]^[ConnectionG3P|remote_stream:ConnectionG3P|healthy();close()]
[SolarmanV5]has-1>[Modbus]
[AsyncStream|reader;writer;addr;r_addr;l_addr|<async>server_loop();<async>client_loop();<async>loop;disc();close();;__async_read();async_write();__async_forward()]^[ConnectionG3]
[AsyncStream]^[ConnectionG3P]
56 changes: 28 additions & 28 deletions app/src/async_stream.py
@@ -67,32 +67,32 @@ async def server_loop(self, addr: str) -> None:

# if the server connection closes, we also have to disconnect
# the connection to the TSUN cloud
if self.remoteStream:
if self.remote_stream:
logger.info(f'[{self.node_id}:{self.conn_no}] disc client '
f'connection: [{self.remoteStream.node_id}:'
f'{self.remoteStream.conn_no}]')
await self.remoteStream.disc()
f'connection: [{self.remote_stream.node_id}:'
f'{self.remote_stream.conn_no}]')
await self.remote_stream.disc()

async def client_loop(self, addr: str) -> None:
async def client_loop(self, _: str) -> None:
'''Loop for receiving messages from the TSUN cloud (client-side)'''
clientStream = await self.remoteStream.loop()
logger.info(f'[{clientStream.node_id}:{clientStream.conn_no}] '
client_stream = await self.remote_stream.loop()
logger.info(f'[{client_stream.node_id}:{client_stream.conn_no}] '
'Client loop stopped for'
f' l{clientStream.l_addr}')
f' l{client_stream.l_addr}')

# if the client connection closes, we don't touch the server
# connection. Instead we erase the client connection stream,
# thus on the next received packet from the inverter, we can
# establish a new connection to the TSUN cloud

# erase backlink to inverter
clientStream.remoteStream = None
client_stream.remote_stream = None

if self.remoteStream == clientStream:
# logging.debug(f'Client l{clientStream.l_addr} refs:'
# f' {gc.get_referrers(clientStream)}')
if self.remote_stream == client_stream:
# logging.debug(f'Client l{client_stream.l_addr} refs:'
# f' {gc.get_referrers(client_stream)}')
# than erase client connection
self.remoteStream = None
self.remote_stream = None

async def loop(self) -> Self:
"""Async loop handler for precessing all received messages"""
@@ -203,35 +203,35 @@ async def __async_forward(self) -> None:
if not self._forward_buffer:
return
try:
if not self.remoteStream:
if not self.remote_stream:
await self.async_create_remote()
if self.remoteStream:
if self.remoteStream._init_new_client_conn():
await self.remoteStream.async_write()
if self.remote_stream:
if self.remote_stream._init_new_client_conn():
await self.remote_stream.async_write()

if self.remoteStream:
self.remoteStream._update_header(self._forward_buffer)
if self.remote_stream:
self.remote_stream._update_header(self._forward_buffer)
hex_dump_memory(logging.INFO,
f'Forward to {self.remoteStream.addr}:',
f'Forward to {self.remote_stream.addr}:',
self._forward_buffer,
len(self._forward_buffer))
self.remoteStream.writer.write(self._forward_buffer)
await self.remoteStream.writer.drain()
self.remote_stream.writer.write(self._forward_buffer)
await self.remote_stream.writer.drain()
self._forward_buffer = bytearray(0)

except OSError as error:
if self.remoteStream:
rmt = self.remoteStream
self.remoteStream = None
if self.remote_stream:
rmt = self.remote_stream
self.remote_stream = None
logger.error(f'[{rmt.node_id}:{rmt.conn_no}] Fwd: {error} for '
f'l{rmt.l_addr} | r{rmt.r_addr}')
await rmt.disc()
rmt.close()

except RuntimeError as error:
if self.remoteStream:
rmt = self.remoteStream
self.remoteStream = None
if self.remote_stream:
rmt = self.remote_stream
self.remote_stream = None
logger.info(f'[{rmt.node_id}:{rmt.conn_no}] '
f'Fwd: {error} for {rmt.l_addr}')
await rmt.disc()
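The `__async_forward` changes above keep the proxy's lazy-reconnect pattern: if no remote connection exists, open one, flush the forward buffer, and on I/O errors drop the remote reference so the next inverter packet triggers a fresh connection. A simplified, self-contained sketch of that pattern — `FakeWriter` and `Forwarder` are hypothetical stand-ins for illustration, not the proxy's actual classes:

```python
import asyncio

class FakeWriter:
    """Hypothetical stand-in for an asyncio.StreamWriter."""
    def __init__(self):
        self.sent = bytearray()

    def write(self, data):
        self.sent += data

    async def drain(self):
        pass

class Forwarder:
    """Sketch of the lazy-reconnect forwarding used in AsyncStream."""
    def __init__(self):
        self._forward_buffer = bytearray(b'payload')
        self.remote_stream = None

    async def async_create_remote(self):
        # the real proxy opens a TCP connection to the TSUN cloud here
        self.remote_stream = FakeWriter()

    async def async_forward(self):
        if not self._forward_buffer:
            return
        try:
            if not self.remote_stream:        # (re)connect lazily
                await self.async_create_remote()
            if self.remote_stream:
                self.remote_stream.write(self._forward_buffer)
                await self.remote_stream.drain()
                self._forward_buffer = bytearray(0)   # clear after flush
        except OSError:
            # drop the broken remote; a later packet reconnects
            self.remote_stream = None

fwd = Forwarder()
asyncio.run(fwd.async_forward())
print(len(fwd._forward_buffer))  # 0 — buffer cleared after forwarding
```

The key design point the PR preserves is that a failed forward never tears down the inverter-side server connection; only the client-side stream reference is reset.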
7 changes: 3 additions & 4 deletions app/src/gen3/connection_g3.py
@@ -1,5 +1,4 @@
import logging
# import gc
from asyncio import StreamReader, StreamWriter
from async_stream import AsyncStream
from gen3.talent import Talent
@@ -15,7 +14,7 @@ def __init__(self, reader: StreamReader, writer: StreamWriter,
AsyncStream.__init__(self, reader, writer, addr)
Talent.__init__(self, server_side, id_str)

self.remoteStream: 'ConnectionG3' = remote_stream
self.remote_stream: 'ConnectionG3' = remote_stream

'''
Our public methods
@@ -26,10 +25,10 @@ def close(self):
# logger.info(f'AsyncStream refs: {gc.get_referrers(self)}')

async def async_create_remote(self) -> None:
pass
pass # virtual interface

async def async_publ_mqtt(self) -> None:
pass
pass # virtual interface

def healthy(self) -> bool:
logger.debug('ConnectionG3 healthy()')
10 changes: 4 additions & 6 deletions app/src/gen3/inverter_g3.py
@@ -9,9 +9,7 @@
from aiomqtt import MqttCodeError
from infos import Infos

# import gc

# logger = logging.getLogger('conn')
logger_mqtt = logging.getLogger('mqtt')


@@ -60,10 +58,10 @@ async def async_create_remote(self) -> None:
logging.info(f'[{self.node_id}] Connect to {addr}')
connect = asyncio.open_connection(host, port)
reader, writer = await connect
self.remoteStream = ConnectionG3(reader, writer, addr, self,
False, self.id_str)
logging.info(f'[{self.remoteStream.node_id}:'
f'{self.remoteStream.conn_no}] '
self.remote_stream = ConnectionG3(reader, writer, addr, self,
False, self.id_str)
logging.info(f'[{self.remote_stream.node_id}:'
f'{self.remote_stream.conn_no}] '
f'Connected to {addr}')
asyncio.create_task(self.client_loop(addr))
