Hello,
I have had two FHEM RPis running for a few days now, since I need the second RPi to extend the range for Bluetooth devices. I tried using FHEM2FHEM to get the devices of the remote FHEM into my main installation, but unfortunately that did not produce the result I wanted. So I decided to write a dedicated module for this.
FHEMSync covers the following functions:
- FHEM devices are synchronized from a remote FHEM instance into the FHEM master instance
- Reading updates happen automatically
- Commands can be sent from the FHEM master directly to the remote FHEM instance
- Attributes and internals are also transferred from the remote device (except "room", since that would make no sense)
- The device is therefore represented 1:1; the only difference is the TYPE of the device
How is FHEMSync set up?
Definitions:
- Master FHEM: the FHEM instance the devices are synchronized to
- Remote FHEM: the FHEM instance the devices are synchronized from
Installation:
Master
- If node is not yet installed on the master, run the following:
$ curl -sL https://deb.nodesource.com/setup_13.x | sudo -E bash -
$ sudo apt install -y nodejs
- Copy 10_FHEMSYNC into the FHEM directory and run reload 10_FHEMSYNC
- Copy 10_FHEMSYNC_DEVICE into the FHEM directory and run reload 10_FHEMSYNC_DEVICE
- sudo npm install -g fhemsync
- In the main FHEM instance:
- define fhemsync FHEMSYNC
- attr fhemsync remote-server IP-OF-REMOTE-FHEM
- The following additional remote-* attributes can be set if they differ from the defaults:
server: MUST BE SET
port: 8083
webname: fhem
filter: room=FHEMSync
auth: ""
ssl: false
selfsignedcert: false
- FHEMSync-* values define the access to the master instance, defaults:
server: 127.0.0.1
port: 8083
webname: fhem
auth: ""
ssl: false
selfsignedcert: false
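Taken together, a minimal master-side configuration could look like this (the IP address here is a placeholder for your own remote instance; only remote-server is mandatory, the other attributes shown just restate their defaults):

```
define fhemsync FHEMSYNC
attr fhemsync remote-server 192.168.1.50
attr fhemsync remote-port 8083
attr fhemsync remote-webname fhem
attr fhemsync remote-filter room=FHEMSync
```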
Remote
- In the remote FHEM instance, create a room FHEMSync where the devices to be synchronized must be placed
Everything else runs fully automatically.
The logfile can be found in the log directory under the name fhemsync-YYYY-MM-DD.log.
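The dated logfile name follows the `%Y-%m-%d` pattern visible in the `logfile` internal further down in this thread. A small illustrative sketch of how such a name is derived (this helper is not part of the module):

```javascript
// Illustrative helper: build a logfile name like fhemsync-YYYY-MM-DD.log
// from a Date, with zero-padded month and day.
function logfileName(date) {
  const pad = (n) => String(n).padStart(2, '0');
  return `fhemsync-${date.getFullYear()}-${pad(date.getMonth() + 1)}-${pad(date.getDate())}.log`;
}

// JavaScript months are 0-based, so 2 = March.
console.log(logfileName(new Date(2020, 2, 19))); // fhemsync-2020-03-19.log
```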
Changelog
- v0.9.9
Fix uninitialized value
- v0.9.8
Fix some attr sync issues (fhemsync v2.7.2 required)
- v0.9.7
Bugfixes
- v0.9.6
Sync Attributes from Master to Slave
- v0.9.5
Get working now
- v0.9.4
Fix key=value parameters in set commands
- v0.9.3
Bugfixes
- v0.9.1
Support up to 5 remote FHEMs (untested, feedback welcome)
Fix get (untested, feedback welcome)
- v0.9.0
Support Get (requires fhemsync v2.1.0)
Current limitations that will be fixed over the next few days:
- Readings/attributes that are deleted are not deleted in the main instance
- Authentication (if used with FHEMWEB) is not yet supported
- The commandref is not yet correct
- The device name is taken 1:1 from the remote FHEM and must not be changed
If you want to try it out, both files are attached. Feedback welcome.
Once everything works, I will also commit the module to the official repository.
Regards,
Dominik
Hey dominik,
I have not looked at the module. But if it does what it is supposed to, wouldn't it be better placed in another subforum (Automation or Miscellaneous?)? Usually only the developers look in here.
What I don't quite understand is why something also has to be installed via npm. I find such a dependency quite heavy, especially for such a small synchronization feature.
Regards
Thanks for the hint about the subforum, I will move it to Automation.
@CoolTux, because I already had nearly all of the functionality ready in NodeJS from gassistant. In general I am also faster in NodeJS than in Perl, so I prefer it. I think most people already have npm installed anyway because of Alexa or Google Assistant, so the installation is probably only a minor step.
Quote from: dominik on 19 March 2020, 07:50:41
I think most people already have npm installed anyway because of Alexa or Google Assistant
I hardly believe that, if the FHEM stats are to be trusted....
I also think this is a dependency that excludes many users. Me, for example.
Quote: I think most people already have npm installed anyway because of Alexa or Google Assistant
Alexa will not enter my house as long as I can avoid it!
I can understand Dominik. The required functions were already available as finished code.
Personally, though, I would avoid something like this. The same functionality could also have been implemented in FHEM's native language (Perl). I find this dependency extreme. But that is just my personal opinion.
Regards
Marko
@Wzut, you are right about that, of course.
@marvin78, I am not excluding anyone; everyone can decide for themselves whether to install node. The same goes for all dependencies, whether Perl modules, external binaries, etc.
Anyone who wants to is welcome to implement this natively in Perl. Since several other modules also require nodejs modules, I don't see it (for myself) as all that critical.
It is of course up to everyone whether to install node or not. I am not forcing anyone :) Those who want to use it will, the others won't. I just wanted to make it available to everyone.
Hello Dominik,
Since I belong to the circle of Alexa / Google users, the dependency is no obstacle for me. I find your approach interesting and will try it out, as I am currently in the process of splitting up my Raspberry Pi installation. So far I have tested this via MQTT2 functionality, which also works, but is of course using a sledgehammer to crack a nut.
Best regards
Torsten
The npm/node.js solution is admittedly awkward from the user's point of view, probably even more so the fact that it is yet another (the third?) alternative, and as a beginner you don't really know which one to pick.
From my point of view it is different, because competition is good for business, and there is something to learn :)
For example about the "how":
- Theory: FHEM contacts a local node.js process, which in turn talks to the remote FHEMWEB.
- Which protocol is used for the local and for the remote connection?
- Which notification mechanism is used? If FHEMWEB: which of the many inform variants?
- How/when are internals transferred?
I just tried to install it, but unfortunately there is already an error during the npm installation:
npm install -g fhemsync
/root/.nvm/versions/node/v10.16.3/bin/fhemsync -> /root/.nvm/versions/node/v10.16.3/lib/node_modules/fhemsync/fhemsync.js
npm WARN request-promise@4.2.5 requires a peer of request@^2.34 but none is installed. You must install peer dependencies yourself.
npm WARN request-promise-core@1.1.3 requires a peer of request@^2.34 but none is installed. You must install peer dependencies yourself.
The fhemsync.log contains:
###
### The "request" library is not installed automatically anymore.
### But is a dependency of "request-promise".
### Please install it with:
### npm install request --save
###
/usr/lib/node_modules/fhemsync/node_modules/request-promise/lib/rp.js:23
throw err;
^
Error: Cannot find module 'request'
Require stack:
- /usr/lib/node_modules/fhemsync/node_modules/request-promise/lib/rp.js
- /usr/lib/node_modules/fhemsync/fhemsync.js
at Function.Module._resolveFilename (internal/modules/cjs/loader.js:982:15)
at Function.Module._load (internal/modules/cjs/loader.js:864:27)
at Module.require (internal/modules/cjs/loader.js:1044:19)
at require (internal/modules/cjs/helpers.js:77:18)
at /usr/lib/node_modules/fhemsync/node_modules/request-promise/lib/rp.js:11:16
at module.exports (/usr/lib/node_modules/fhemsync/node_modules/stealthy-require/lib/index.js:62:23)
at Object.<anonymous> (/usr/lib/node_modules/fhemsync/node_modules/request-promise/lib/rp.js:10:19)
at Module._compile (internal/modules/cjs/loader.js:1158:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1178:10)
at Module.load (internal/modules/cjs/loader.js:1002:32)
at Function.Module._load (internal/modules/cjs/loader.js:901:14)
at Module.require (internal/modules/cjs/loader.js:1044:19)
at require (internal/modules/cjs/helpers.js:77:18)
at Object.<anonymous> (/usr/lib/node_modules/fhemsync/fhemsync.js:8:17)
at Module._compile (internal/modules/cjs/loader.js:1158:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1178:10)
at Module.load (internal/modules/cjs/loader.js:1002:32)
at Function.Module._load (internal/modules/cjs/loader.js:901:14)
at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:74:12)
at internal/main/run_main_module.js:18:47 {
code: 'MODULE_NOT_FOUND',
requireStack: [
'/usr/lib/node_modules/fhemsync/node_modules/request-promise/lib/rp.js',
'/usr/lib/node_modules/fhemsync/fhemsync.js'
]
}
FHEMSYNC does not start!
Quote: Anyone who wants to is welcome to implement this natively in Perl. Since several other modules also require nodejs modules, I don't see it (for myself) as all that critical.
I do. We keep piling up more and more "external ballast". Something like node/npm should only be used in an absolute "emergency", when the functionality is EXTREMELY complex and an implementation in Perl is nearly impossible. Take my examples of the Deebot implementation and SamsungAV. Both work with Python. BUT: in my view a Python installation is extremely complicated, since the most varied versions get deployed and are in turn prerequisites for the Python modules to work. That is why I deliberately implemented everything in Perl or other FHEM functionality (MQTT). At some point this gets so out of hand that in the forum we do nothing but support the installation of non-Perl software.
Quote: @CoolTux, because I already had nearly all of the functionality ready in NodeJS from gassistant. In general I am also faster in NodeJS than in Perl, so I prefer it.
"Faster" cannot be the argument. Lean installations with little OS fiddling are the counterargument.
Edit: And "already ready" is presumably only because you didn't develop in Perl from the start, otherwise the functionality would have existed in Perl. Which confirms: resist the beginnings...
Regards Markus
Edit: and there it goes again, as punker demonstrates.... :'(
@rudolfkoenig
- Theory: FHEM contacts a local node.js process, which in turn talks to the remote FHEMWEB.
Exactly, although nodejs establishes a connection both to the "Master FHEM" and to the "Remote FHEM".
- Which protocol is used for the local and for the remote connection?
longpoll, possibly websocket later if there is interest
- Which notification mechanism is used? If FHEMWEB: which of the many inform variants?
longpoll inform=type=status, if that is what you mean?
- How/when are internals transferred?
Once, when the devices are created in the Master FHEM. The same applies to attributes.
As I said, what bothered me was only that FHEM2FHEM did not do it the way I wanted ;) If FHEM2FHEM is extended to "send commands" and really represents the device nearly 1:1 in the Master FHEM, I will switch right back. That is what I was missing.
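For reference, a FHEMWEB longpoll connection of this kind is typically opened with an inform URL roughly along these lines (host, port, and filter here are placeholders; the exact parameters the module sends are not shown in this thread):

```
http://REMOTE-IP:8083/fhem?XHR=1&inform=type=status;filter=room=FHEMSync;fmt=JSON
```

FHEMWEB keeps this connection open and streams events for the matching devices as they occur.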
@punker
sudo npm install request
A dependency is missing there; I will update it. Thanks for the report!
@KölnSolar
I like to use existing libraries, because then I don't have to do the further development myself - why develop things twice? Unfortunately Perl does not always have what I need, so I occasionally use other languages... and yes, anyone is welcome to implement FHEMSync in Perl.
I have said before that I would love a FHEM NodeJS binding... but I don't want to discuss that here.
Hello,
I have now also tried to install FHEMSync.
After installing request afterwards, I get the following error:
(node:9433) UnhandledPromiseRejectionWarning: RequestError: Error: read ECONNRESET
at new RequestError (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/errors.js:14:15)
at Request.plumbing.callback (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:87:29)
at Request.RP$callback [as _callback] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:46:31)
at self.callback (/usr/lib/node_modules/request/request.js:185:22)
at Request.emit (events.js:311:20)
at Request.onRequestError (/usr/lib/node_modules/request/request.js:877:8)
at ClientRequest.emit (events.js:311:20)
at Socket.socketErrorListener (_http_client.js:426:9)
at Socket.emit (events.js:311:20)
at emitErrorNT (internal/streams/destroy.js:92:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:60:3)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
(node:9433) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:9433) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
However, my FHEM is out on the internet, but that shouldn't be a problem. For testing I defined a FHEMWEB without authentication.
Regards
Carlos
Have you perhaps restricted access to certain IP addresses in FHEMWEB? It looks as if no connection to the remote FHEM can be established.
No, I haven't.
It's a brand-new WEB instance without restrictions.
Regards
Carlos
Are you using HTTP while the other instance is HTTPS? If so, set the FHEMSync-ssl attribute to false.
I already tried that, the error stays the same.
Quote from: carlos on 19 March 2020, 18:22:16
I already tried that, the error stays the same.
Different port? Show the complete log; the URL is in there as well.
Now it works - I entered the public IP of my router in allowfrom.
But the log now constantly shows the following, although fhemsync is defined:
[MAIN ] Options: {"version":"1.0.0","fhem":true,"port":true,"webname":true,"auth":true,"device":true}
[MASTER ] executing: http://xxx:8084/WEBS?XHR=1
[MASTER ] executing: http://xxx:8084/WEBS?XHR=1&cmd=jsonlist2%20fhemsync&fwcsrf=undefined
[MASTER ] FHEMSYNC device detected: undefined
[MASTER ] Please define FHEMSYNC device in FHEM: define fhemsync FHEMSYNC
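For anyone debugging this step: the device detection is based on the JSON that `jsonlist2 fhemsync` returns. A hypothetical sketch of how such a response could be checked - the sample JSON strings below are illustrative, not real captures, and the helper is not the module's actual code:

```javascript
// Hypothetical check: jsonlist2 returns an object with a Results array.
// If no device matched (e.g. because the csrf token was not accepted),
// Results is empty and the lookup yields undefined - matching the
// "FHEMSYNC device detected: undefined" log line above.
function detectDevice(jsonlist2Output) {
  const data = JSON.parse(jsonlist2Output);
  return data.Results && data.Results[0] ? data.Results[0].Name : undefined;
}

// Empty result set:
console.log(detectDevice('{"Arg":"fhemsync","Results":[],"totalResultsReturned":0}'));
// A matching device:
console.log(detectDevice('{"Arg":"fhemsync","Results":[{"Name":"fhemsync"}],"totalResultsReturned":1}'));
```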
Are you using csrfToken? It probably wasn't recognized.
No, it is set to none.
Can you enable it? Then it should work. Currently it is only usable with csrfToken enabled.
Same error, but the token seems to be ok:
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.0","fhem":true,"port":true,"webname":true,"device":true}
[MASTER ] executing: http://xxx:8084/WEBS?XHR=1
[MASTER ] executing: http://xxx:8084/WEBS?XHR=1&cmd=jsonlist2%20fhemsync&fwcsrf=fhemsync
[MASTER ] FHEMSYNC device detected: undefined
[MASTER ] Please define FHEMSYNC device in FHEM: define fhemsync FHEMSYNC
Did you set the csrfToken to the value "fhemsync"? Please delete that line from your config completely. Btw, please make this change on the Master FHEM (where FHEMSYNC was defined).
Just to be sure: your FHEMSYNC device on the Master FHEM is called fhemsync?
Yes, I set the token to fhemsync.
In which config? Unfortunately I don't have one.
It should be created automatically, shouldn't it? Where? In /opt/fhem?
My FHEMSYNC device:
Internals:
FUUID 5e732d87-f33f-66b6-473c-085642d0fc4d18e8
LAST_START 2020-03-19 21:20:54
LAST_STOP 2020-03-19 21:20:54
NAME fhemsync
NR 1357
NTFY_ORDER 50-fhemsync
STARTS 1531
STATE stopped
TYPE FHEMSYNC
logfile ./log/fhemsync-%Y-%m-%d.log
.attraggr:
.attrminint:
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state stopped
READINGS:
2020-03-19 21:20:54 fhemsync stopped
Attributes:
FHEMSync-port 8084
FHEMSync-server xxx
FHEMSync-ssl false
FHEMSync-webname WEBS
alias fhemsync
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-server xxx
room FHEMSync
stateFormat fhemsync
verbose 5
In the remote FHEM instance there only needs to be a room FHEMSync, or does anything else have to be done?
Just to make sure everything is configured correctly:
- Master FHEM
Port 8084
HTTP (no SSL)
WEBS (as webname)
define fhemsync FHEMSYNC is also created on the Master FHEM
- Remote FHEM
Port 8083
HTTP
fhem (as webname)
If that is the case, your configuration in the attributes is also correct.
By config I meant fhem.cfg, so delete the line with csrfToken there.
Please also try a
sudo npm install -g fhemsync
Version 1.0.2 should be installed with that.
I just added support so that it should also work without csrfToken - I have not tested this myself!
sudo npm install -g fhemsync
leads to the error
npm command not found ??
Quote from: harway2007 on 19 March 2020, 22:01:21
sudo npm install -g fhemsync
leads to the error
npm command not found ??
Please install npm first
$ curl -sL https://deb.nodesource.com/setup_13.x | sudo -E bash -
$ sudo apt install -y nodejs
doesn't get far:
W: Fehlschlag beim Holen von http://raspberrypi.collabora.com/dists/wheezy/rpi/i18n/Translation-en Beim Auflösen von »raspberrypi.collabora.com:http« ist etwas Schlimmes passiert (-5 - Zu diesem Hostnamen gehört keine Adresse).
W: Fehlschlag beim Holen von http://mirrordirector.raspbian.org/raspbian/dists/wheezy/main/binary-armhf/Packages 404 Not Found
W: Fehlschlag beim Holen von http://mirrordirector.raspbian.org/raspbian/dists/wheezy/contrib/binary-armhf/Packages 404 Not Found
W: Fehlschlag beim Holen von http://mirrordirector.raspbian.org/raspbian/dists/wheezy/non-free/binary-armhf/Packages 404 Not Found
W: Fehlschlag beim Holen von http://mirrordirector.raspbian.org/raspbian/dists/wheezy/rpi/binary-armhf/Packages 404 Not Found
E: Einige Indexdateien konnten nicht heruntergeladen werden. Sie wurden ignoriert oder alte an ihrer Stelle benutzt.
Error executing command, exiting
pi@pi57Dachgeschoss ~ $ sudo apt install -y nodejssudo: apt: command not found
pi@pi57Dachgeschoss ~ $
WHEEZY!!!
There hasn't been a repository for that for a long time!!!
(Wheezy -> Jessie -> Stretch -> Buster is current)
You also wouldn't have gotten far with the node/npm from Wheezy!
You really will have to upgrade (slowly)!
My opinion...
Otherwise there is/was a "guide" for "manual installation" of node/npm at the beginning of the "old" alexa-fhem wiki...
Regards, Joachim
So, I now have FHEMsync running, but no devices are being created in the main instance for the devices I placed into the room FHEMSync in the remote instance.
These are 3 Amazon Echo Dots.
They should appear "fully automatically" on the FHEM main instance in the room FHEMSync, shouldn't they?
Here is a list from the main instance:
Internals:
FD 50
FUUID 5e748604-f33f-55ca-39ce-c5094625f2d7b348
LAST_START 2020-03-20 11:00:13
LAST_STOP 2020-03-20 11:00:13
NAME fhemsync
NR 636
NTFY_ORDER 50-fhemsync
PID 19906
STARTS 10
STATE running /usr/bin/fhemsync
TYPE FHEMSYNC
currentlogfile ./log/fhemsync-2020-03-20.log
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state running /usr/bin/fhemsync
READINGS:
2020-03-20 11:00:13 fhemsync running /usr/bin/fhemsync
Attributes:
FHEMSync-server 192.168.2.161
FHEMSync-webname WEB
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-ssl false
room FHEMSync
stateFormat fhemsync
Here is the FHEMsync.log:
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.2","fhem":true,"webname":true,"device":true,"ssl":true}
[MASTER ] executing: https://192.168.2.161:8083/WEB?XHR=1
I notice that "ssl":true is set and the remote instance is called via https, but that doesn't work because no SSL is used!
So what am I doing wrong?
You still need to adjust your attributes:
All attributes with FHEMSync-* apply to the FHEM server where FHEMSync is running. So normally nothing needs to be configured here. If nothing is configured, the following default values are used:
FHEMSync-server 192.168.2.161 (default: 127.0.0.1)
FHEMSync-webname WEB (default: fhem)
FHEMSync-ssl (default: false)
You definitely still need to configure remote-server; the IP of the remote FHEM must be entered there.
Also set remote-ssl if necessary (default: false).
So here is another list from the server (https://192.168.2.219:8083/fhem):
Internals:
FD 109
FUUID 5e748604-f33f-55ca-39ce-c5094625f2d7b348
LAST_START 2020-03-20 13:37:08
LAST_STOP 2020-03-20 13:36:43
NAME fhemsync
NR 636
NTFY_ORDER 50-fhemsync
PID 21280
STARTS 31
STATE running /usr/bin/fhemsync
TYPE FHEMSYNC
currentlogfile ./log/fhemsync-2020-03-20.log
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state running /usr/bin/fhemsync
READINGS:
2020-03-20 13:37:08 fhemsync running /usr/bin/fhemsync
Attributes:
FHEMSync-server 192.168.2.219
FHEMSync-webname WEB
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-server 192.168.2.161
remote-ssl false
room FHEMSync
stateFormat fhemsync
and the log:
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.2","fhem":true,"webname":true,"device":true,"ssl":true}
[MASTER ] executing: https://192.168.2.219:8083/WEB?XHR=1
(node:21446) UnhandledPromiseRejectionWarning: RequestError: Error: self signed certificate
at new RequestError (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/errors.js:14:15)
at Request.plumbing.callback (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:87:29)
at Request.RP$callback [as _callback] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:46:31)
at self.callback (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:185:22)
at Request.emit (events.js:311:20)
at Request.onRequestError (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:877:8)
at ClientRequest.emit (events.js:311:20)
at TLSSocket.socketErrorListener (_http_client.js:426:9)
at TLSSocket.emit (events.js:311:20)
at emitErrorNT (internal/streams/destroy.js:92:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:60:3)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
(node:21446) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:21446) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
The remote FHEM has the address http://192.168.2.161:8083/fhem
The way it is in the list it doesn't work - but it should, shouldn't it?
(node:21446) UnhandledPromiseRejectionWarning: RequestError: Error: self signed certificate
You are using a self-signed certificate that cannot be validated. Try it without a certificate.
I will check whether this check can be disabled somewhere.
//Edit: I just built it in. Run a
sudo npm install -g fhemsync
Version 1.0.3 will be installed.
Copy the 10_FHEMSYNC into the FHEM directory, then don't forget reload 10_FHEMSYNC
Once that is done, you will find the following attributes in FHEMSync:
FHEMSync-selfsignedcert (set this to true)
remote-selfsignedcert (since the remote does not use SSL, you do not need to set this)
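For context: with the request/request-promise stack visible in the tracebacks, accepting a self-signed certificate typically comes down to a single TLS option on the request. The option name is real request/Node TLS API, but the helper below is a hypothetical sketch, not the module's actual code:

```javascript
// Hypothetical helper: map a selfsignedcert flag to request options.
// rejectUnauthorized is passed through to Node's TLS layer; setting
// it to false makes the client accept self-signed certificates.
function buildRequestOptions(uri, selfSignedCert) {
  return {
    uri: uri,
    rejectUnauthorized: !selfSignedCert
  };
}

console.log(buildRequestOptions('https://192.168.2.219:8083/fhem?XHR=1', true).rejectUnauthorized); // false
```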
Now I entered attr FHEMSync-auth with user:password and got this log:
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.2","fhem":true,"webname":true,"auth":true,"device":true,"ssl":true}
[MASTER ] executing: https://192.168.2.219:8083/WEB?XHR=1
(node:24167) UnhandledPromiseRejectionWarning: TypeError: Assignment to constant variable.
at FHEM.getCsrfToken (/usr/lib/node_modules/fhemsync/fhemsync.js:247:13)
at main (/usr/lib/node_modules/fhemsync/fhemsync.js:387:20)
at Object.<anonymous> (/usr/lib/node_modules/fhemsync/fhemsync.js:452:1)
at Module._compile (internal/modules/cjs/loader.js:1158:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1178:10)
at Module.load (internal/modules/cjs/loader.js:1002:32)
at Function.Module._load (internal/modules/cjs/loader.js:901:14)
at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:74:12)
at internal/main/run_main_module.js:18:47
(node:24167) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 2)
(node:24167) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
so nothing about the self-signed certificate anymore
but it still doesn't work!
auth is not supported yet, see the first post... it's coming.
well, then I don't know how it is supposed to work either and will just delete it - I'll continue with RFHEM!
Quote from: punker on 20 March 2020, 15:24:29
well, then I don't know how it is supposed to work either and will just delete it - I'll continue with RFHEM!
Quite simple: use it without auth for now, then it works. Self-signed certificates should also work now, see the post above.
Thanks for testing anyway.
//Edit: So, quickly built in. If you still want to, you can try version 1.0.4, just published. Auth should now work with it as well.
Please note, this is a TEST release here, not a finished product.
I tried it again with your new version - unfortunately again without success!
list
Internals:
FUUID 5e75cbac-f33f-55ca-70eb-f24fdf71b2b33530
LAST_START 2020-03-21 09:19:05
LAST_STOP 2020-03-21 09:19:09
NAME fhemsync
NR 636
NTFY_ORDER 50-fhemsync
STARTS 9
STATE stopped
TYPE FHEMSYNC
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state stopped
READINGS:
2020-03-21 09:19:09 fhemsync stopped
Attributes:
FHEMSync-auth crypt:11165b5e01100e0447595a0407794f
FHEMSync-selfsignedcert true
FHEMSync-server 192.168.2.219
FHEMSync-webname WEB
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-server 192.168.2.161
remote-webname WEB
room FHEMSync
stateFormat fhemsync
verbose 0
The log only contains the following, although fhemsync is defined!
log
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.4","fhem":true,"webname":true,"auth":true,"device":true,"ssl":true,"selfSignedCert":true}
[MASTER ] executing: https://192.168.2.219:8083/WEB?XHR=1
[MASTER ] executing: https://192.168.2.219:8083/WEB?XHR=1&cmd=jsonlist2%20fhemsync
[MASTER ] Please define FHEMSYNC device in FHEM: define fhemsync FHEMSYNC
Hmm... is FHEMSync-webname WEB really correct? When you access it via the browser, does it say "IP:PORT/WEB" or "IP:PORT/fhem"? And you haven't configured an allow or similar on the WEB instance to restrict access?
You can open the links in the browser and check whether appropriate responses come back
It says IP:PORT/fhem, of course - I thought the name from defmod WEB FHEMWEB 8083 was meant!
I have now changed it like this:
Internals:
FUUID 5e75cbac-f33f-55ca-70eb-f24fdf71b2b33530
LAST_START 2020-03-21 10:40:28
LAST_STOP 2020-03-21 10:40:31
NAME fhemsync
NR 636
NTFY_ORDER 50-fhemsync
STARTS 43
STATE stopped
TYPE FHEMSYNC
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state stopped
READINGS:
2020-03-21 10:40:31 fhemsync stopped
Attributes:
FHEMSync-server 192.168.2.219
FHEMSync-webname fhem
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-server 192.168.2.161
remote-webname fhem
room FHEMSync
stateFormat fhemsync
verbose 0
but that doesn't work either!
I am probably just too stupid for this - although this is the first module I haven't been able to get running!
Please post the log again; we will find the error yet.
Do you have only one FHEMWEB instance, or several? There is no auth and no SSL configured on either side, am I seeing that right?
The main FHEM server runs with https, the remote FHEM server without https
list:
Internals:
FUUID 5e75cbac-f33f-55ca-70eb-f24fdf71b2b33530
LAST_START 2020-03-21 10:56:19
LAST_STOP 2020-03-21 10:56:22
NAME fhemsync
NR 636
NTFY_ORDER 50-fhemsync
STARTS 38
STATE stopped
TYPE FHEMSYNC
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state stopped
READINGS:
2020-03-21 10:56:22 fhemsync stopped
Attributes:
FHEMSync-selfsignedcert true
FHEMSync-server 192.168.2.219
FHEMSync-ssl trueevent-aggregator
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-port 8083
remote-selfsignedcert false
remote-server 192.168.2.161
remote-ssl false
room FHEMSync
stateFormat fhemsync
verbose 0
log
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.4","fhem":true,"device":true,"ssl":true,"selfSignedCert":true}
[MASTER ] executing: https://192.168.2.219:8083/fhem?XHR=1
[MASTER ] executing: https://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20fhemsync
[MASTER ] FHEMSYNC device detected: fhemsync
[SLAVE ] config: value for ssl has to be boolean.
That already looks good. The fhemsync device is being found!
Now please change the attribute
FHEMSync-ssl trueevent-aggregator
to
FHEMSync-ssl true
- something must have slipped in there.
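The "value for ssl has to be boolean" message suggests the attribute string is validated before use. A hypothetical sketch of a strict coercion that would also reject garbage like "trueevent-aggregator" - the helper name and behavior are assumptions, not the module's actual code:

```javascript
// Hypothetical strict coercion: only the exact strings "true" and
// "false" are accepted; anything else (e.g. "trueevent-aggregator")
// is reported as invalid instead of being passed on.
function parseBoolAttr(value) {
  if (value === 'true') return true;
  if (value === 'false') return false;
  throw new Error(`value "${value}" has to be boolean (true/false)`);
}

console.log(parseBoolAttr('true'));   // true
console.log(parseBoolAttr('false'));  // false
```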
I changed something again:
Internals:
FUUID 5e75cbac-f33f-55ca-70eb-f24fdf71b2b33530
LAST_START 2020-03-21 11:17:40
LAST_STOP 2020-03-21 11:17:43
NAME fhemsync
NR 636
NTFY_ORDER 50-fhemsync
STARTS 98
STATE stopped
TYPE FHEMSYNC
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state stopped
READINGS:
2020-03-21 11:17:43 fhemsync stopped
Attributes:
FHEMSync-filter room=FHEMSync
FHEMSync-port 8083
FHEMSync-selfsignedcert true
FHEMSync-server 192.168.2.219
FHEMSync-ssl trueevent-aggregator
FHEMSync-webname fhem
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-filter room=FHEMSync
remote-port 8083
remote-selfsignedcert false
remote-server 192.168.2.161
remote-ssl false
remote-webname fhem
room FHEMSync
stateFormat fhemsync
verbose 0
log
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.4","fhem":true,"port":true,"webname":true,"device":true,"ssl":true,"selfSignedCert":true}
[MASTER ] executing: https://192.168.2.219:8083/fhem?XHR=1
[MASTER ] executing: https://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20fhemsync
[MASTER ] FHEMSYNC device detected: fhemsync
[SLAVE ] config: value for ssl has to be boolean.
still doesn't work
Please don't change so much at once; let's do this step by step:
FHEMSync-ssl trueevent-aggregator
This is where the error is. I don't know how the trueevent-aggregator got in there; it should really only be settable to false/true. So please set it to true.
when I want to set the FHEMSync-ssl attribute, false and trueevent-aggregator are offered!
as soon as I set the FHEMSync-ssl attribute to true, fhem crashes!
Oh dear, yes, I see the bug on my side. Well, leave that as it is for now and just delete the remote-ssl attribute.
list
Internals:
FUUID 5e75f0d6-f33f-55ca-c4e9-4550cf7ac6a50294
LAST_START 2020-03-21 12:10:43
LAST_STOP 2020-03-21 12:10:46
NAME fhemsync
NR 635
NTFY_ORDER 50-fhemsync
STARTS 2
STATE stopped
TYPE FHEMSYNC
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state stopped
READINGS:
2020-03-21 12:10:46 fhemsync stopped
Attributes:
FHEMSync-server 192.168.2.219
FHEMSync-webname fhem
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-server 192.168.2.161
remote-webname fhem
room FHEMSync
stateFormat fhemsync
This way FHEM runs.
As soon as I set the attribute FHEMSync-selfsignedcert to true, FHEM crashes!
log
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.4","fhem":true,"webname":true,"device":true,"ssl":true}
[MASTER ] executing: https://192.168.2.219:8083/fhem?XHR=1
(node:7542) UnhandledPromiseRejectionWarning: RequestError: Error: self signed certificate
at new RequestError (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/errors.js:14:15)
at Request.plumbing.callback (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:87:29)
at Request.RP$callback [as _callback] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:46:31)
at self.callback (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:185:22)
at Request.emit (events.js:311:20)
at Request.onRequestError (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:877:8)
at ClientRequest.emit (events.js:311:20)
at TLSSocket.socketErrorListener (_http_client.js:426:9)
at TLSSocket.emit (events.js:311:20)
at emitErrorNT (internal/streams/destroy.js:92:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:60:3)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
(node:7542) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:7542) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
Thanks for all the tests. I will now set up HTTPS with basicAuth on my side to test this in detail. Unfortunately, FHEM is crashing on me right now when enabling HTTPS.
I will let you know as soon as you can test again.
OK, I have now switched both of my instances to HTTPS with auth.
Please download the two *.pm files from the first post again and reload them.
Update fhemsync via npm to version 1.0.5.
Here is my list:
Internals:
FD 55
FUUID 5e727ac6-f33f-faa4-6063-768b9f6e89c60206
LAST_START 2020-03-21 13:33:13
LAST_STOP 2020-03-21 13:33:09
NAME fhemsync
NR 51738
NTFY_ORDER 50-fhemsync
PID 24302
STARTS 878
STATE running /usr/bin/fhemsync
TYPE FHEMSYNC
currentlogfile ./log/fhemsync-2020-03-21.log
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state running /usr/bin/fhemsync
READINGS:
2020-03-21 13:33:13 fhemsync running /usr/bin/fhemsync
Attributes:
FHEMSync-auth crypt:.....
FHEMSync-selfsignedcert true
FHEMSync-ssl true
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-auth USER:PASSWORT
remote-selfsignedcert true
remote-server 192.168.86.39
remote-ssl true
room FHEMSync
stateFormat fhemsync
Remote IP: 192.168.86.39
Master IP: 192.168.86.150 (I did not set it as FHEMSync-server, since the default is 127.0.0.1 and I have not set up any access restrictions)
Well, for me it does not work!
As soon as I set FHEMSync-selfsignedcert to true, FHEM crashes!
Could you please check the FHEM log for the error there? Unfortunately I cannot reproduce it on my side.
There is no error in fhem.log; I would probably have to set verbose to 5 first?
But now I tried it the other way round, i.e. swapped the previous master server and the remote server, and that way it works!!!
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.5","fhem":true,"device":true}
[MASTER ] executing: http://192.168.2.161:8083/fhem?XHR=1
[MASTER ] executing: http://192.168.2.161:8083/fhem?XHR=1&cmd=jsonlist2%20fhemsync
[MASTER ] FHEMSYNC device detected: fhemsync
[SLAVE ] executing: https://192.168.2.219:8083/fhem?XHR=1
[MASTER ] starting longpoll: http://192.168.2.161:8083/fhem?XHR=1&inform=type=status;addglobal=1;filter=.*;since=null;fmt=JSON&timestamp=1584800228069
[SLAVE ] starting longpoll: https://192.168.2.219:8083/fhem?XHR=1&inform=type=status;addglobal=1;filter=.*;since=null;fmt=JSON&timestamp=1584800228504
[MASTER ] Fetching FHEM devices...
[MASTER ] executing: http://192.168.2.161:8083/fhem?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE
[MASTER ] got: 1 results
[SLAVE ] Fetching FHEM devices...
[SLAVE ] executing: https://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20room%3DFHEMSync
[SLAVE ] got: 1 results
[MASTER ] executing: http://192.168.2.161:8083/fhem?XHR=1&cmd=set%20TFS_Compizimmer%20x_json%20%7B%22Name%22%3A%22TFS_Compizimmer%22%2C%22PossibleSets%22%3A%22%22%2C%22PossibleAttrs%22%3A%22alias%20comment%3AtextField-long%20eventMap%3AtextField-long%20group%20room%20suppressReading%20userReadings%3AtextField-long%20verbose%3A0%2C1%2C2%2C3%2C4%2C5%20IODev%20do_not_notify%3A0%2C1%20ignore%3A0%2C1%20model%3AS300TH%2CKS300%2CASH2200%20showtime%3A0%2C1%20strangeTempDiff%20event-aggregator%20event-min-interval%20event-on-change-reading%20event-on-update-reading%20oldreadings%20stateFormat%3AtextField-long%20timestamp-on-change-reading%201%20DbLogExclude%20Haus%20Haus_map%20alexaName%20alexaProactiveEvents%3A1%2C0%20alexaRoom%20cmdIcon%20devStateIcon%20devStateIcon%3AtextField-long%20devStateStyle%20device_timeout%20fp_energie%20fp_garten%20fp_mein%20fp_system%20fp_system1%20fp_wohnung%20genericDeviceType%3Asecurity%2Cignore%2Cswitch%2Coutlet%2Clight%2ClightSceneParamsToSave%20homebridgeMapping%3AtextField-long%20icon%20ignore_battery%20lightSceneRestoreOnlyIfChanged%3A1%2C0%2Cscene%2Cspeaker%2Cblind%2Cthermometer%2Cthermostat%2Ccontact%2Cgarage%2Cwindow%2Clock%20raum%20raum_map%20room_map%20sortby%20structexclude%20webCmd%20webCmdLabel%3AtextField-long%20widgetOverride%20userattr%22%2C%22Internals%22%3A%7B%22CHANGED%22%3A%22null%22%2C%22CODE%22%3A%225%22%2C%22CUL868_MSGCNT%22%3A%224%22%2C%22CUL868_RAWMSG%22%3A%22K41044239%22%2C%22CUL868_RSSI%22%3A%22-78.5%22%2C%22CUL868_TIME%22%3A%222020-03-21%2015%3A12%3A55%22%2C%22DEF%22%3A%225%22%2C%22FUUID%22%3A%225c42d2ee-f33f-55ca-c362-cee9c16d75578b02%22%2C%22LASTInputDev%22%3A%22CUL868%22%2C%22MSGCNT%22%3A%224%22%2C%22NAME%22%3A%22TFS_Compizimmer%22%2C%22NR%22%3A%22152%22%2C%22STATE%22%3A%22T%3A%2020.4%20%20H%3A%2039.4%22%2C%22TYPE%22%3A%22CUL_WS%22%2C%22corr1%22%3A%220%22%2C%22corr2%22%3A%220%22%2C%22corr3%22%3A%220%22%2C%22corr4%22%3A%220%22%7D%2C%22Readings%22%3A%7B%22DEVFAMILY%22%3A%7B%22Value%22%3A%22WS300%22%2C%22Time%22%3A
%222020-03-21%2015%3A12%3A55%22%7D%2C%22DEVTYPE%22%3A%7B%22Value%22%3A%22S300TH%22%2C%22Time%22%3A%222020-03-21%2015%3A12%3A55%22%7D%2C%22humidity%22%3A%7B%22Value%22%3A%2239.4%22%2C%22Time%22%3A%222020-03-21%2015%3A12%3A55%22%7D%2C%22state%22%3A%7B%22Value%22%3A%22T%3A%2020.4%20%20H%3A%2039.4%22%2C%22Time%22%3A%222020-03-21%2015%3A12%3A55%22%7D%2C%22temperature%22%3A%7B%22Value%22%3A%2220.4%22%2C%22Time%22%3A%222020-03-21%2015%3A12%3A55%22%7D%7D%2C%22Attributes%22%3A%7B%22IODev%22%3A%22CUL868%22%2C%22event-on-change-reading%22%3A%22.*%22%2C%22fp_wohnung%22%3A%22368%2C471%2C4%2CCompizimmer%2C%22%2C%22genericDeviceType%22%3A%22thermometer%22%2C%22icon%22%3A%22temperature_humidity%22%2C%22model%22%3A%22S300TH%22%2C%22room%22%3A%22FHEMSync%2CCompizimmer%22%7D%7D
As a test, I put an S300TH thermometer into the FHEMSync room.
It then showed up on the other Raspi!
Since then, however, this Raspi has been crashing every minute!
Regarding the hardware: the master FHEM is a BananaPi Pro and the remote FHEM a Raspi 3B.
Hmm, that is strange. Does any error message show up in the FHEM log or the fhemsync log on that Raspi?
Well, it works for me now too, but with the following result:
[SLAVE ] Fetching FHEM devices...
[SLAVE ] executing: http://xxx:8083/fhem?XHR=1&cmd=jsonlist2%20room%3DFHEMSync
[SLAVE ] got: 0 results
It finds no devices in the FHEMSync room, even though there are two in it.
Calling the URL by hand gives no result either:
{
"Arg":"room=FHEMSync",
"Results": [
],
"totalResultsReturned":0
}
Also, if I run 'jsonlist2 room FHEMSync' in FHEM, whether on the master or the slave, I get no result.
Regards
Carlos
There has to be a = between room and FHEMSync. Please try that manually first; if it works, we'll take it from there.
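For reference, this is how the request URL for that filter is built: the = has to survive URL encoding as %3D (IP and port taken from the logs in this thread):

```javascript
// Build the jsonlist2 request URL; note "room=FHEMSync" with "=",
// which is percent-encoded as room%3DFHEMSync.
const base = 'http://192.168.2.161:8083/fhem'; // Remote FHEM from this thread
const cmd = 'jsonlist2 room=FHEMSync';
const url = `${base}?XHR=1&cmd=${encodeURIComponent(cmd)}`;
console.log(url);
// http://192.168.2.161:8083/fhem?XHR=1&cmd=jsonlist2%20room%3DFHEMSync
```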
Here is the fhem.log:
2020.03.21 15:29:58.147 3: fhemsync: starting
2020.03.21 15:29:58.157 3: fhemsync: using logfile: ./log/fhemsync-2020-03-21.log
2020.03.21 15:30:03.717 3: fhemsync: read: end of file reached while sysread
2020.03.21 15:30:03.717 3: fhemsync: stopped
2020.03.21 15:30:17.377 3: fhemsync: starting
2020.03.21 15:30:17.391 3: fhemsync: using logfile: ./log/fhemsync-2020-03-21.log
2020.03.21 15:30:18.661 3: fhemsync: read: end of file reached while sysread
2020.03.21 15:30:18.662 3: fhemsync: stopped
2020.03.21 15:30:33.217 3: fhemsync: starting
2020.03.21 15:30:33.232 3: fhemsync: using logfile: ./log/fhemsync-2020-03-21.log
2020.03.21 15:30:34.811 3: fhemsync: read: end of file reached while sysread
2020.03.21 15:30:34.812 3: fhemsync: stopped
2020.03.21 15:30:48.391 3: fhemsync: starting
2020.03.21 15:30:48.407 3: fhemsync: using logfile: ./log/fhemsync-2020-03-21.log
Can't use string ("null") as an ARRAY ref while "strict refs" in use at fhem.pl line 4582.
2020.03.21 15:30:52 2: [MaxScan] UtilsMaxScan_Initialize.68 MaxScan is starting
2020.03.21 15:30:53.111 1: Including fhem.cfg
2020.03.21 15:30:53.155 3: telnet: port 7072 opened
2020.03.21 15:30:53.453 3: WEB: port 8083 opened
2020.03.21 15:30:53.554 2: eventTypes: loaded 292 events from demolog/eventTypes.txt
2020.03.21 15:30:53.876 0: [echodevice] load ECHO Device ECHO_709418075143066P
2020.03.21 15:30:53.879 0: [echodevice] load ECHO Device ECHO_70941807515226UP
2020.03.21 15:30:53.882 0: [echodevice] load ECHO Device EchoDot_Compizimmer
2020.03.21 15:30:53.885 0: [echodevice] load ECHO Device EchoDot_WZDieter
2020.03.21 15:30:53.887 0: [echodevice] load ECHO Device EchoDot_SZDieter
2020.03.21 15:30:53.890 1: Including /var/log/fhem/fhem.save
2020.03.21 15:30:53.933 0: Featurelevel: 6
2020.03.21 15:30:53.933 0: Server started with 14 defined entities (fhem.pl:21337/2020-03-02 perl:5.028001 os:linux user:fhem pid:31568)
2020.03.21 15:33:30.665 0: Server shutdown
On my side it creates the detected device, but then keeps crashing!
I tried that too; with a different room it works.
Only with the room FHEMSync it doesn't. No idea why.
Oh no, now I see the problem.
When creating the room, a trailing space sneaked in through copy and paste.
Funny how it works once you do it right.
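For anyone hitting the same thing: the room attribute is a comma-separated list, and "FHEMSync " with a trailing space does not match the exact filter room=FHEMSync. A defensive matcher could trim each entry before comparing (a sketch with a made-up function name, not what FHEM itself does):

```javascript
// Trim each room entry before comparing, so a pasted trailing space
// ("FHEMSync ") no longer hides the device from the filter.
function roomMatches(roomAttr, wanted) {
  return roomAttr
    .split(',')
    .map((r) => r.trim())
    .includes(wanted);
}

console.log(roomMatches('Amazon,FHEMSync ', 'FHEMSync')); // true
console.log(roomMatches('Amazon,FHEMSync', 'FHEMSync'));  // true
```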
Quote from: punker on 21 March 2020, 15:43:11
[fhem.log snipped, same log as quoted above]
On my side it creates the detected device, but then keeps crashing!
Could you please test it with a different device in the FHEMSync room?
I already tested it with an IT power socket and it works.
However, FHEM is reacting very sluggishly; it takes about a minute just to display the log!
I've deleted it again here and will try it one last time the way round it was originally intended!
If it is sluggish, please look into the FHEM log and the fhemsync log to see whether anything shows up there.
I tried it again on the BananaPi and there is a log entry there:
unexpected end of string while parsing JSON string, at character offset 4239 (before "(end of string)") at ./FHEM/10_FHEMSYNC_DEVICE.pm line 84.
No idea whether that means anything?
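That error reads like the JSON arrived truncated (the string ends mid-way at offset 4239), possibly because the very long x_json set command got cut off somewhere. A guard like the following would at least keep the receiving side alive; this is a JavaScript sketch of the idea, while the real check would live in the Perl module 10_FHEMSYNC_DEVICE.pm:

```javascript
// Sketch: report a truncated/invalid payload instead of dying on it.
function safeParse(json) {
  try {
    return { ok: true, value: JSON.parse(json) };
  } catch (err) {
    return { ok: false, error: err.message };
  }
}

console.log(safeParse('{"Name":"TFS_Compizimmer"}').ok); // true
console.log(safeParse('{"Name":"TFS_Compi').ok);         // false
```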
But as before, when setting the attribute FHEMSync-selfsignedcert to true, FHEM crashes!
I give up!
Thanks for your support!
Here are the ideas I still have:
- The FHEMSync room on the remote Pi is empty
- FHEM returns, for whatever reason, malformed JSON
- The complete fhemsync log would be interesting now; something should definitely show up there. The device has probably already been created, only setting the device's JSON is still causing problems.
If you don't feel like testing any further, that is fine too. I would still like to track down the bug though :)
Here is the log:
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.5","fhem":true,"webname":true,"device":true}
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20fhemsync
[MASTER ] FHEMSYNC device detected: fhemsync
[SLAVE ] executing: http://192.168.2.161:8083/fhem?XHR=1
[MASTER ] starting longpoll: http://192.168.2.219:8083/fhem?XHR=1&inform=type=status;addglobal=1;filter=.*;since=null;fmt=JSON&timestamp=1584805255512
[SLAVE ] starting longpoll: http://192.168.2.161:8083/fhem?XHR=1&inform=type=status;addglobal=1;filter=.*;since=null;fmt=JSON&timestamp=1584805257121
[MASTER ] Fetching FHEM devices...
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE
[MASTER ] got: 0 results
[SLAVE ] Fetching FHEM devices...
[SLAVE ] executing: http://192.168.2.161:8083/fhem?XHR=1&cmd=jsonlist2%20room%3DFHEMSync
[SLAVE ] got: 3 results
[MAIN ] Create device: EchoDot_Compizimmer
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=define%20EchoDot_Compizimmer%20FHEMSYNC_DEVICE%20echodevice%20EchoDot_Compizimmer
[MAIN ] Create device: EchoDot_SZDieter
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=define%20EchoDot_SZDieter%20FHEMSYNC_DEVICE%20echodevice%20EchoDot_SZDieter
[MAIN ] Create device: myRaspiEcho
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=define%20myRaspiEcho%20FHEMSYNC_DEVICE%20echodevice%20myRaspiEcho
[MASTER ] Fetching FHEM devices...
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=set%20EchoDot_SZDieter%20x_json%20%7B%22Name%22%3A%22EchoDot_SZDieter%22%2C%22PossibleSets%22%3A%22volume%3Aslider%2C0%2C1%2C100%20play%3AnoArg%20pause%3AnoArg%20next%3AnoArg%20previous%3AnoArg%20forward%3AnoArg%20rewind%3AnoArg%20shuffle%3Aon%2Coff%20repeat%3Aon%2Coff%20dnd%3Aon%2Coff%20volume_alarm%3Aslider%2C0%2C1%2C100%20info%3ABeliebig_Auf_Wiedersehen%2CBeliebig_Bestaetigung%2CBeliebig_Geburtstag%2CBeliebig_Guten_Morgen%2CBeliebig_Gute_Nacht%2CBeliebig_Ich_Bin_Zuhause%2CBeliebig_Kompliment%2CErzaehle_Geschichte%2CErzaehle_Was_Neues%2CErzaehle_Witz%2CKalender_Heute%2CKalender_Morgen%2CKalender_Naechstes_Ereignis%2CNachrichten%2CSinge_Song%2CVerkehr%2CWetter%20sounds%3Aglocken%2Ckirchenglocke%2Csummer%2Ctuerklingel_1%2Ctuerklingel_2%2Ctuerklingel_3%2Cjubelnde_menschenmenge%2Cpublikumsapplaus%2Cflugzeug%2Ckatastrophenalarm%2Cmotoren_an%2Cschilde_hoch%2Csirenen%2Czappen%2Cboing_1%2Cboing_2%2Ckamera%2Clufthupe%2Cquitschende_tuer%2Ctickende_uhr%2Ctrompete%2Chahn%2Chundegebell%2Ckatzenmauzen%2Cloewengebruell%2Cwolfsgeheul%2Cgruselig_quitschende_tuer%2Cweihnachtsglocken%20tunein%20primeplaylist%20primeplaysender%20primeplayeigene%20primeplayeigeneplaylist%20alarm_normal%20alarm_repeat%20reminder_normal%20reminder_repeat%20speak%20speak_ssml%20tts%20tts_translate%3AtextField-long%20playownmusic%3AtextField-long%20saveownplaylist%3AtextField-long%20track%20bluetooth_connect%3A-%20bluetooth_disconnect%3A-%20routine_play%3A%40amzn1.alexa.behaviors.preconfigured%3Aalarm_dismissed_with_condition_preconfigured_routine%2CIch_bin_zuhause%40amzn1.alexa.behaviors.preconfigured%3Aim_home_custom_utt_trigger%2Cgute_Nacht%40amzn1.alexa.behaviors.preconfigured%3Agood_night_custom_utt_trigger%2Clichter_aus%40amzn1.alexa.automation.0a44666b-0c4a-4807-b121-04676b5ef6a2%2Clichter_ein%40amzn1.alexa.automation.4697b58e-f964-4891-b9ba-492eee2e2453%2Cstarte_meinen_Tag%40amzn1.alexa.behaviors.preconfigured%3Astart_my_day_custom_utt_
trigger%20%22%2C%22PossibleAttrs%22%3A%22alias%20comment%3AtextField-long%20eventMap%3AtextField-long%20group%20room%20suppressReading%20userReadings%3AtextField-long%20verbose%3A0%2C1%2C2%2C3%2C4%2C5%20disable%3A0%2C1%20IODev%20TTS_Voice%3AAustralianEnglish_Female_Nicole%2CAustralianEnglish_Male_Russell%2CBrazilianPortuguese_Female_Vitoria%2CBrazilianPortuguese_Male_Ricardo%2CBritishEnglish_Female_Amy%2CBritishEnglish_Female_Emma%2CBritishEnglish_Male_Brian%2CCanadianFrench_Female_Chantal%2CCastilianSpanish_Female_Conchita%2CCastilianSpanish_Male_Enrique%2CDanish_Female_Naja%2CDanish_Male_Mads%2CDutch_Female_Lotte%2CDutch_Male_Ruben%2CFrench_Female_Celine%2CFrench_Male_Mathieu%2CGerman_Female_Google%2CGerman_Female_Marlene%2CGerman_Female_Vicki%2CGerman_Male_Hans%2CIcelandic_Female_Dora%2CIcelandic_Male_Karl%2CIndianEnglish_Female_Aditi%2CIndianEnglish_Female_Raveena%2CItalian_Female_Carla%2CItalian_Male_Giorgio%2CJapanese_Female_Mizuki%2CJapanese_Male_Takumi%2CKorean_Female_Seoyeon%2CNorwegian_Female_Liv%2CPolish_Female_Ewa%2CPolish_Female_Maja%2CPolish_Male_Jacek%2CPolish_Male_Jan%2CPortuguese_Female_Ines%2CPortuguese_Male_Cristiano%2CRomanian_Female_Carmen%2CRussian_Female_Tatyana%2CRussian_Male_Maxim%2CSwedish_Female_Astrid%2CTurkish_Female_Filiz%2CUSEnglish_Female_Ivy%2CUSEnglish_Female_Joanna%2CUSEnglish_Female_Kendra%2CUSEnglish_Female_Kimberly%2CUSEnglish_Female_Salli%2CUSEnglish_Male_Joey%2CUSEnglish_Male_Justin%2CUSEnglish_Male_Matthew%2CUSSpanish_Female_Penelope%2CUSSpanish_Male_Miguel%2CWelshEnglish_Female_Gwyneth%2CWelshEnglish_Male_Geraint%20TTS_IgnorPlay%3A0%2C1%20TTS_normalize%3Aslider%2C5%2C1%2C40%20TTS_Translate_From%3Adutch%2Cenglish%2Cfrench%2Cgerman%2Citalian%2Cjapanese%2Ckorean%2Cportuguese%2Crussian%2Cspanish%2Cturkish%20intervalsettings%20intervallogin%20intervalvoice%3Aslider%2C0%2C1%2C100%20ignorevoicecommand%20speak_volume%3Aslider%2C0%2C1%2C100%20server%20cookie%20reminder_delay%20tunein_default%20autocreate_refresh%3A0%2C1%20browser_use
ragent%20browser_language%20browser_save_data%3A0%2C1%20browser_useragent_random%3A0%2C1%20npm_proxy_port%20npm_proxy_ip%20npm_proxy_listen_ip%20npm_refresh_intervall%20npm_bin%20npm_bin_node%20event-aggregator%20event-min-interval%20event-on-change-reading%20event-on-update-reading%20oldreadings%20stateFormat%3AtextField-long%20timestamp-on-change-reading%20cmdIcon%20devStateIcon%20devStateIcon%3AtextField-long%20devStateStyle%20icon%20lightSceneParamsToSave%20lightSceneRestoreOnlyIfChanged%3A1%2C0%20sortby%20structexclude%20webCmd%20webCmdLabel%3AtextField-long%20widgetOverride%20userattr%22%2C%22Internals%22%3A%7B%22DEF%22%3A%22A1RABVCI4QCIKC%20G090XG089433024F%22%2C%22FUUID%22%3A%225e625a38-f33f-55ca-f1d5-1ea8509b9996b3e9%22%2C%22LOGINMODE%22%3A%22IODEV%22%2C%22NAME%22%3A%22EchoDot_SZDieter%22%2C%22NR%22%3A%2225%22%2C%22NTFY_ORDER%22%3A%2250-EchoDot_SZDieter%22%2C%22STATE%22%3A%22connected%22%2C%22TYPE%22%3A%22echodevice%22%2C%22model%22%3A%22Echo%20Dot%20Gen3%22%7D%2C%22Readings%22%3A%7B%22BrowserLanguage%22%3A%7B%22Value%22%3A%22de%2Cen-US%3Bq%3D0.7%2Cen%3Bq%3D0.3%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22BrowserUserAgent%22%3A%7B%22Value%22%3A%22Mozilla%2F5.0%20(Windows%20NT%2010.0%3B%20Win64%3B%20x64%3B%20rv%3A62.0)%20Gecko%2F20100101%20Firefox%2F62.0%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22COOKIE_MODE%22%3A%7B%22Value%22%3A%22IODEV%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A02%22%7D%2C%22channel%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentAlbum%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentArtist%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentArtwork%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentTitle%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentTuneInID%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%2
2%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22deviceAddress%22%3A%7B%22Value%22%3A%22Gwendweg%206%2C%20Schwarzenfeld%2C%20Bavaria%2C%20DE%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A38%22%7D%2C%22dnd%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A50%22%7D%2C%22microphone%22%3A%7B%22Value%22%3A%22false%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A38%22%7D%2C%22model%22%3A%7B%22Value%22%3A%22Echo%20Dot%20Gen3%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A36%22%7D%2C%22mute%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22online%22%3A%7B%22Value%22%3A%22true%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A37%22%7D%2C%22playStatus%22%3A%7B%22Value%22%3A%22stopped%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22presence%22%3A%7B%22Value%22%3A%22present%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A36%22%7D%2C%22progress%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22progresslen%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22repeat%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22shuffle%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22state%22%3A%7B%22Value%22%3A%22connected%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A13%22%7D%2C%22timeZoneId%22%3A%7B%22Value%22%3A%22Europe%2FBerlin%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A38%22%7D%2C%22version%22%3A%7B%22Value%22%3A%223658075268%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A36%22%7D%2C%22voice%22%3A%7B%22Value%22%3A%22sprich%20mir%20nach%20die%20haust%C3%BCr%20wurde%20ge%C3%B6ffnet%22%2C%22Time%22%3A%222020-03-21%2015%3A04%3A57%22%7D%2C%22voice_timestamp%22%3A%7B%22Value%22%3A%221584797993170%22%2C%22Time%22%3A%222020-03-21%2015%3A04%3A57%22%7D%2C%22volume%22%3A%7B%22Value%22%3A%2225%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22volume_alarm%22%3A%7B%22Value%22%3A%2230%22%2C%22Time%22%3A%222020-0
3-21%2016%3A40%3A59%22%7D%2C%22wakeword%22%3A%7B%22Value%22%3A%22ALEXA%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A49%22%7D%7D%2C%22Attributes%22%3A%7B%22IODev%22%3A%22myRaspiEcho%22%2C%22alias%22%3A%22Echo%20Dot%20Schlafzimmer%20Dieter%22%2C%22devStateIcon%22%3A%22connected%3Arc_GREEN%3Aoff%20connected%20but%20loginerror%3Arc_RED%3Aon%22%2C%22icon%22%3A%22echo%22%2C%22room%22%3A%22Amazon%2CFHEMSync%22%7D%7D
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=set%20EchoDot_Compizimmer%20x_json%20%7B%22Name%22%3A%22EchoDot_Compizimmer%22%2C%22PossibleSets%22%3A%22volume%3Aslider%2C0%2C1%2C100%20play%3AnoArg%20pause%3AnoArg%20next%3AnoArg%20previous%3AnoArg%20forward%3AnoArg%20rewind%3AnoArg%20shuffle%3Aon%2Coff%20repeat%3Aon%2Coff%20dnd%3Aon%2Coff%20volume_alarm%3Aslider%2C0%2C1%2C100%20info%3ABeliebig_Auf_Wiedersehen%2CBeliebig_Bestaetigung%2CBeliebig_Geburtstag%2CBeliebig_Guten_Morgen%2CBeliebig_Gute_Nacht%2CBeliebig_Ich_Bin_Zuhause%2CBeliebig_Kompliment%2CErzaehle_Geschichte%2CErzaehle_Was_Neues%2CErzaehle_Witz%2CKalender_Heute%2CKalender_Morgen%2CKalender_Naechstes_Ereignis%2CNachrichten%2CSinge_Song%2CVerkehr%2CWetter%20sounds%3Aglocken%2Ckirchenglocke%2Csummer%2Ctuerklingel_1%2Ctuerklingel_2%2Ctuerklingel_3%2Cjubelnde_menschenmenge%2Cpublikumsapplaus%2Cflugzeug%2Ckatastrophenalarm%2Cmotoren_an%2Cschilde_hoch%2Csirenen%2Czappen%2Cboing_1%2Cboing_2%2Ckamera%2Clufthupe%2Cquitschende_tuer%2Ctickende_uhr%2Ctrompete%2Chahn%2Chundegebell%2Ckatzenmauzen%2Cloewengebruell%2Cwolfsgeheul%2Cgruselig_quitschende_tuer%2Cweihnachtsglocken%20tunein%20primeplaylist%20primeplaysender%20primeplayeigene%20primeplayeigeneplaylist%20alarm_normal%20alarm_repeat%20reminder_normal%20reminder_repeat%20speak%20speak_ssml%20tts%20tts_translate%3AtextField-long%20playownmusic%3AtextField-long%20saveownplaylist%3AtextField-long%20track%20bluetooth_connect%3A-%20bluetooth_disconnect%3A-%20routine_play%3A%40amzn1.alexa.behaviors.preconfigured%3Aalarm_dismissed_with_condition_preconfigured_routine%2CIch_bin_zuhause%40amzn1.alexa.behaviors.preconfigured%3Aim_home_custom_utt_trigger%2Cgute_Nacht%40amzn1.alexa.behaviors.preconfigured%3Agood_night_custom_utt_trigger%2Clichter_aus%40amzn1.alexa.automation.0a44666b-0c4a-4807-b121-04676b5ef6a2%2Clichter_ein%40amzn1.alexa.automation.4697b58e-f964-4891-b9ba-492eee2e2453%2Cstarte_meinen_Tag%40amzn1.alexa.behaviors.preconfigured%3Astart_my_day_custo
m_utt_trigger%20%22%2C%22PossibleAttrs%22%3A%22alias%20comment%3AtextField-long%20eventMap%3AtextField-long%20group%20room%20suppressReading%20userReadings%3AtextField-long%20verbose%3A0%2C1%2C2%2C3%2C4%2C5%20disable%3A0%2C1%20IODev%20TTS_Voice%3AAustralianEnglish_Female_Nicole%2CAustralianEnglish_Male_Russell%2CBrazilianPortuguese_Female_Vitoria%2CBrazilianPortuguese_Male_Ricardo%2CBritishEnglish_Female_Amy%2CBritishEnglish_Female_Emma%2CBritishEnglish_Male_Brian%2CCanadianFrench_Female_Chantal%2CCastilianSpanish_Female_Conchita%2CCastilianSpanish_Male_Enrique%2CDanish_Female_Naja%2CDanish_Male_Mads%2CDutch_Female_Lotte%2CDutch_Male_Ruben%2CFrench_Female_Celine%2CFrench_Male_Mathieu%2CGerman_Female_Google%2CGerman_Female_Marlene%2CGerman_Female_Vicki%2CGerman_Male_Hans%2CIcelandic_Female_Dora%2CIcelandic_Male_Karl%2CIndianEnglish_Female_Aditi%2CIndianEnglish_Female_Raveena%2CItalian_Female_Carla%2CItalian_Male_Giorgio%2CJapanese_Female_Mizuki%2CJapanese_Male_Takumi%2CKorean_Female_Seoyeon%2CNorwegian_Female_Liv%2CPolish_Female_Ewa%2CPolish_Female_Maja%2CPolish_Male_Jacek%2CPolish_Male_Jan%2CPortuguese_Female_Ines%2CPortuguese_Male_Cristiano%2CRomanian_Female_Carmen%2CRussian_Female_Tatyana%2CRussian_Male_Maxim%2CSwedish_Female_Astrid%2CTurkish_Female_Filiz%2CUSEnglish_Female_Ivy%2CUSEnglish_Female_Joanna%2CUSEnglish_Female_Kendra%2CUSEnglish_Female_Kimberly%2CUSEnglish_Female_Salli%2CUSEnglish_Male_Joey%2CUSEnglish_Male_Justin%2CUSEnglish_Male_Matthew%2CUSSpanish_Female_Penelope%2CUSSpanish_Male_Miguel%2CWelshEnglish_Female_Gwyneth%2CWelshEnglish_Male_Geraint%20TTS_IgnorPlay%3A0%2C1%20TTS_normalize%3Aslider%2C5%2C1%2C40%20TTS_Translate_From%3Adutch%2Cenglish%2Cfrench%2Cgerman%2Citalian%2Cjapanese%2Ckorean%2Cportuguese%2Crussian%2Cspanish%2Cturkish%20intervalsettings%20intervallogin%20intervalvoice%3Aslider%2C0%2C1%2C100%20ignorevoicecommand%20speak_volume%3Aslider%2C0%2C1%2C100%20server%20cookie%20reminder_delay%20tunein_default%20autocreate_refresh%3A0%2C1%20brows
er_useragent%20browser_language%20browser_save_data%3A0%2C1%20browser_useragent_random%3A0%2C1%20npm_proxy_port%20npm_proxy_ip%20npm_proxy_listen_ip%20npm_refresh_intervall%20npm_bin%20npm_bin_node%20event-aggregator%20event-min-interval%20event-on-change-reading%20event-on-update-reading%20oldreadings%20stateFormat%3AtextField-long%20timestamp-on-change-reading%20cmdIcon%20devStateIcon%20devStateIcon%3AtextField-long%20devStateStyle%20icon%20lightSceneParamsToSave%20lightSceneRestoreOnlyIfChanged%3A1%2C0%20sortby%20structexclude%20webCmd%20webCmdLabel%3AtextField-long%20widgetOverride%20userattr%22%2C%22Internals%22%3A%7B%22DEF%22%3A%22A3S5BH2HU6VAYF%20G090LF1182340FQB%22%2C%22FUUID%22%3A%225e625a38-f33f-55ca-8296-6aea7706b8c5e20f%22%2C%22LOGINMODE%22%3A%22IODEV%22%2C%22NAME%22%3A%22EchoDot_Compizimmer%22%2C%22NR%22%3A%2223%22%2C%22NTFY_ORDER%22%3A%2250-EchoDot_Compizimmer%22%2C%22STATE%22%3A%22connected%22%2C%22TYPE%22%3A%22echodevice%22%2C%22model%22%3A%22Echo%20Dot%22%7D%2C%22Readings%22%3A%7B%22BrowserLanguage%22%3A%7B%22Value%22%3A%22de%2Cen-US%3Bq%3D0.7%2Cen%3Bq%3D0.3%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22BrowserUserAgent%22%3A%7B%22Value%22%3A%22Mozilla%2F5.0%20(Windows%20NT%2010.0%3B%20Win64%3B%20x64%3B%20rv%3A62.0)%20Gecko%2F20100101%20Firefox%2F62.0%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22COOKIE_MODE%22%3A%7B%22Value%22%3A%22IODEV%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A02%22%7D%2C%22alarm_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22channel%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentAlbum%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentArtist%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentArtwork%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentTitle%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%
22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22currentTuneInID%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22deviceAddress%22%3A%7B%22Value%22%3A%22Gwendweg%206%2C%20Schwarzenfeld%2C%20Bayern%2C%20DE%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A38%22%7D%2C%22dnd%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A50%22%7D%2C%22microphone%22%3A%7B%22Value%22%3A%22false%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A38%22%7D%2C%22model%22%3A%7B%22Value%22%3A%22Echo%20Dot%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A36%22%7D%2C%22musicalarm_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22mute%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22online%22%3A%7B%22Value%22%3A%22true%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A37%22%7D%2C%22playStatus%22%3A%7B%22Value%22%3A%22stopped%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A19%22%7D%2C%22presence%22%3A%7B%22Value%22%3A%22present%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A36%22%7D%2C%22progress%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22progresslen%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22reminder_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22repeat%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22shuffle%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22state%22%3A%7B%22Value%22%3A%22connected%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A13%22%7D%2C%22timeZoneId%22%3A%7B%22Value%22%3A%22Europe%2FParis%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A38%22%7D%2C%22timer_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22timer_id%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22timer_remainingtime%22%3A%7B%22Value%22%3A%2
20%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22version%22%3A%7B%22Value%22%3A%22651614420%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A36%22%7D%2C%22voice%22%3A%7B%22Value%22%3A%22sprich%20mir%20nach%20die%20haust%C3%BCr%20wurde%20ge%C3%B6ffnet%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A14%22%7D%2C%22voice_timestamp%22%3A%7B%22Value%22%3A%221584804702714%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A14%22%7D%2C%22volume%22%3A%7B%22Value%22%3A%2220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A20%22%7D%2C%22volume_alarm%22%3A%7B%22Value%22%3A%2270%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A59%22%7D%2C%22wakeword%22%3A%7B%22Value%22%3A%22ALEXA%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A49%22%7D%7D%2C%22Attributes%22%3A%7B%22IODev%22%3A%22myRaspiEcho%22%2C%22alias%22%3A%22Echo%20Dot%20Compizimmer%22%2C%22devStateIcon%22%3A%22connected%3Arc_GREEN%3Aoff%20connected%20but%20loginerror%3Arc_RED%3Aon%22%2C%22icon%22%3A%22echo%22%2C%22room%22%3A%22Amazon%2CFHEMSync%22%7D%7D
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=set%20myRaspiEcho%20x_json%20%7B%22Name%22%3A%22myRaspiEcho%22%2C%22PossibleSets%22%3A%22autocreate_devices%3AnoArg%20item_shopping_add%20item_task_add%20AWS_Access_Key%20AWS_Secret_Key%20TTS_IPAddress%20TTS_Filename%20TTS_TuneIn%20POM_TuneIn%20POM_IPAddress%20POM_Filename%20AWS_OutputFormat%3Amp3%2Cogg_vorbis%2Cpcm%20textmessage%20config_address_from%20config_address_to%20config_address_between%20mobilmessage%20NPM_install%3AnoArg%20NPM_login%3Anew%2Crefresh%20%20item_shopping_delete%3A%20item_task_delete%3A%22%2C%22PossibleAttrs%22%3A%22alias%20comment%3AtextField-long%20eventMap%3AtextField-long%20group%20room%20suppressReading%20userReadings%3AtextField-long%20verbose%3A0%2C1%2C2%2C3%2C4%2C5%20disable%3A0%2C1%20IODev%20TTS_Voice%3AAustralianEnglish_Female_Nicole%2CAustralianEnglish_Male_Russell%2CBrazilianPortuguese_Female_Vitoria%2CBrazilianPortuguese_Male_Ricardo%2CBritishEnglish_Female_Amy%2CBritishEnglish_Female_Emma%2CBritishEnglish_Male_Brian%2CCanadianFrench_Female_Chantal%2CCastilianSpanish_Female_Conchita%2CCastilianSpanish_Male_Enrique%2CDanish_Female_Naja%2CDanish_Male_Mads%2CDutch_Female_Lotte%2CDutch_Male_Ruben%2CFrench_Female_Celine%2CFrench_Male_Mathieu%2CGerman_Female_Google%2CGerman_Female_Marlene%2CGerman_Female_Vicki%2CGerman_Male_Hans%2CIcelandic_Female_Dora%2CIcelandic_Male_Karl%2CIndianEnglish_Female_Aditi%2CIndianEnglish_Female_Raveena%2CItalian_Female_Carla%2CItalian_Male_Giorgio%2CJapanese_Female_Mizuki%2CJapanese_Male_Takumi%2CKorean_Female_Seoyeon%2CNorwegian_Female_Liv%2CPolish_Female_Ewa%2CPolish_Female_Maja%2CPolish_Male_Jacek%2CPolish_Male_Jan%2CPortuguese_Female_Ines%2CPortuguese_Male_Cristiano%2CRomanian_Female_Carmen%2CRussian_Female_Tatyana%2CRussian_Male_Maxim%2CSwedish_Female_Astrid%2CTurkish_Female_Filiz%2CUSEnglish_Female_Ivy%2CUSEnglish_Female_Joanna%2CUSEnglish_Female_Kendra%2CUSEnglish_Female_Kimberly%2CUSEnglish_Female_Salli%2CUSEnglish_Male_Joey%2CUSEnglish_
Male_Justin%2CUSEnglish_Male_Matthew%2CUSSpanish_Female_Penelope%2CUSSpanish_Male_Miguel%2CWelshEnglish_Female_Gwyneth%2CWelshEnglish_Male_Geraint%20TTS_IgnorPlay%3A0%2C1%20TTS_normalize%3Aslider%2C5%2C1%2C40%20TTS_Translate_From%3Adutch%2Cenglish%2Cfrench%2Cgerman%2Citalian%2Cjapanese%2Ckorean%2Cportuguese%2Crussian%2Cspanish%2Cturkish%20intervalsettings%20intervallogin%20intervalvoice%3Aslider%2C0%2C1%2C100%20ignorevoicecommand%20speak_volume%3Aslider%2C0%2C1%2C100%20server%20cookie%20reminder_delay%20tunein_default%20autocreate_refresh%3A0%2C1%20browser_useragent%20browser_language%20browser_save_data%3A0%2C1%20browser_useragent_random%3A0%2C1%20npm_proxy_port%20npm_proxy_ip%20npm_proxy_listen_ip%20npm_refresh_intervall%20npm_bin%20npm_bin_node%20event-aggregator%20event-min-interval%20event-on-change-reading%20event-on-update-reading%20oldreadings%20stateFormat%3AtextField-long%20timestamp-on-change-reading%20cmdIcon%20devStateIcon%20devStateIcon%3AtextField-long%20devStateStyle%20icon%20lightSceneParamsToSave%20lightSceneRestoreOnlyIfChanged%3A1%2C0%20sortby%20structexclude%20webCmd%20webCmdLabel%3AtextField-long%20widgetOverride%20userattr%22%2C%22Internals%22%3A%7B%22DEF%22%3A%22xxx%40xxx.xx%20xxx%22%2C%22FUUID%22%3A%225e6250bc-f33f-55ca-b2aa-3c0e8ae81d40cd37%22%2C%22LOGINMODE%22%3A%22NPM%22%2C%22NAME%22%3A%22myRaspiEcho%22%2C%22NR%22%3A%2219%22%2C%22NTFY_ORDER%22%3A%2250-myRaspiEcho%22%2C%22STATE%22%3A%22connected%22%2C%22TYPE%22%3A%22echodevice%22%2C%22model%22%3A%22ACCOUNT%22%7D%2C%22Readings%22%3A%7B%22BrowserLanguage%22%3A%7B%22Value%22%3A%22de%2Cen-US%3Bq%3D0.7%2Cen%3Bq%3D0.3%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A59%22%7D%2C%22BrowserUserAgent%22%3A%7B%22Value%22%3A%22Mozilla%2F5.0%20(Windows%20NT%2010.0%3B%20Win64%3B%20x64%3B%20rv%3A62.0)%20Gecko%2F20100101%20Firefox%2F62.0%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A59%22%7D%2C%22COOKIE_MODE%22%3A%7B%22Value%22%3A%22NPM%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A02%22%7D%2C%22COOKIE_STATE%22%3A
%7B%22Value%22%3A%22OK%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22COOKIE_TYPE%22%3A%7B%22Value%22%3A%22READING_NPM%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A07%22%7D%2C%22alarm_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22amazon_refreshtoken%22%3A%7B%22Value%22%3A%22vorhanden%22%2C%22Time%22%3A%222020-03-21%2007%3A06%3A52%22%7D%2C%22autocreate_devices%22%3A%7B%22Value%22%3A%22stop%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A07%22%7D%2C%22config_address_between%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22config_address_from%22%3A%7B%22Value%22%3A%22Gwendweg%206%2C%2092521%20Schwarzenfeld%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22config_address_to%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22list_SHOPPING_ITEM%22%3A%7B%22Value%22%3A%22%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A43%22%7D%2C%22list_TASK%22%3A%7B%22Value%22%3A%22%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A43%22%7D%2C%22musicalarm_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22reminder_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22state%22%3A%7B%22Value%22%3A%22connected%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22timer_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22timer_id%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22timer_remainingtime%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2016%3A40%3A35%22%7D%2C%22version%22%3A%7B%22Value%22%3A%220.1.0%22%2C%22Time%22%3A%222020-03-21%2016%3A37%3A07%22%7D%7D%2C%22Attributes%22%3A%7B%22devStateIcon%22%3A%22connected%3Arc_GREEN%3Aoff%20connected%20but%20loginerror%3Arc_RED%3Aon%22%2C%22icon%22%3A%22echo%22%2C%22npm_refresh_intervall%22%3A%2286400%22%2C%22room%22%3A%22Amazon%2CFHEMSync%22%2C%22verbose%22%3A%2
20%22%7D%7D
[MASTER ] got: 2 results
[MASTER ] longpoll end: retry in: 200msec
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.5","fhem":true,"webname":true,"device":true}
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1
I even removed https to test whether it works with plain http, but it kept crashing just the same!
Great, thanks for sending another log!
I found the bug :) Your readings contain ";" characters, and because of that the full JSON never reaches the device; FHEM interprets the part after the semicolon as a new command.
As a quick test, try a different device or delete the readings that contain a semicolon. In the meantime I will look for a proper fix.
Update: fhemsync version 1.0.7 now also handles ";" in the readings.
Please also update 10_FHEMSYNC_DEVICE from the first post; it prevents a crash in case anything goes wrong while decoding.
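A sketch of the idea behind the 1.0.7 fix (the function name is illustrative, not the module's actual code): double every ";" in the JSON payload so FHEMWEB does not split the `set` command at the semicolon, then URL-encode the whole command string.

```javascript
// Double the semicolons inside the JSON, then URL-encode the full command.
// Without the doubling, FHEM treats everything after ";" as a new command
// and the device receives a truncated JSON string.
function buildSetJsonCmd(device, obj) {
  const escaped = JSON.stringify(obj).replace(/;/g, ';;');
  return 'cmd=' + encodeURIComponent(`set ${device} x_json ${escaped}`);
}

// A reading value containing ";" like the BrowserLanguage reading in the logs:
const cmd = buildSetJsonCmd('EchoDot_Compizimmer', {
  Readings: { BrowserLanguage: { Value: 'de,en-US;q=0.7,en;q=0.3' } },
});
```

The encoded command then contains `%3B%3B` (doubled, escaped semicolons), which is exactly what the later log excerpts in this thread show.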
Haven't semicolons been forbidden for ages?
Are there no warnings in the log when FHEM starts?
Quote from: frank on 21 March 2020, 18:34:12
Haven't semicolons been forbidden for ages?
Are there no warnings in the log when FHEM starts?
I think they have always been allowed in readings, since a reading can contain arbitrary text. But I may be mistaken.
Quote from: dominik on 21 March 2020, 18:37:35
I think they have always been allowed in readings, since a reading can contain arbitrary text. But I may be mistaken.
Sorry, I had read that as: in the reading name.
Eureka, it works!!!
Even with SSL and FHEMSync-selfsignedcert true.
The device (Echo Dot) is created immediately.
Only this entry is still showing up in the log:
2020.03.21 19:54:05.200 3: EchoDot_Compizimmer: unknown attribute IODev. Type 'attr EchoDot_Compizimmer ?' for a detailed list.
2020.03.21 19:54:05.201 3: attr EchoDot_Compizimmer IODev myRaspiEcho : EchoDot_Compizimmer: unknown attribute IODev. Type 'attr EchoDot_Compizimmer ?' for a detailed list.
How can I avoid it?
That's it for me today, though. Thanks, Dominik!
Great!! Thanks for sticking with it :)
I will take a closer look at that bug; I need to figure out how to transfer attributes that do not exist in the master FHEM.
The update for attributes is here:
fhemsync 1.0.8
Also update 10_FHEMSYNC_DEVICE from the first post.
userattr is now extended with entries that are otherwise not available as attributes.
Important: attributes are NOT synchronized from master to remote, only from remote to master (except room, which is not synchronized at all).
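The userattr mechanism described above can be illustrated with a short, hypothetical FHEM snippet (device and attribute names are taken from the logs in this thread; the exact commands the module issues may differ):

```
# Register the foreign attribute via userattr first, then set it;
# without the userattr entry, FHEM rejects it as "unknown attribute".
attr EchoDot_Compizimmer userattr TTS_Voice
attr EchoDot_Compizimmer TTS_Voice German_Female_Marlene
```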
I spoke too soon: the server is crashing again!
Right after setting the attribute FHEMSync-selfsignedcert true!
It was still running shortly before 8 pm, and just now the server had crashed!
Is anything visible in the logs, either fhemsync or FHEM?
Did you delete the attribute and set it again? I still do not understand what is special about FHEMSync-selfsignedcert; it is handled exactly like the other attributes. Can you test the other attributes to see whether they also crash FHEM?
For now I have switched back from https to http.
Here is a list:
Internals:
CFGFN
FD 50
FUUID 5e768554-f33f-55ca-3e38-14e55e4cafca35c2
LAST_START 2020-03-21 22:21:26
LAST_STOP 2020-03-21 22:21:26
NAME fhemsync
NR 766
NTFY_ORDER 50-fhemsync
PID 16725
STARTS 5
STATE running /usr/bin/fhemsync
TYPE FHEMSYNC
currentlogfile ./log/fhemsync-2020-03-21.log
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state running /usr/bin/fhemsync
READINGS:
2020-03-21 22:21:26 fhemsync running /usr/bin/fhemsync
Attributes:
FHEMSync-server 192.168.2.219
FHEMSync-webname fhem
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
remote-filter room=FHEMSync
remote-server 192.168.2.161
remote-webname fhem
room FHEMSync
stateFormat fhemsync
verbose 0
Without a device in the FHEMSync room on the remote server it runs fine.
As soon as I put the device (Echo Dot) into the room, the server crashes!
Log:
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"1.0.8","fhem":true,"webname":true,"device":true}
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20fhemsync
[MASTER ] FHEMSYNC device detected: fhemsync
[SLAVE ] executing: http://192.168.2.161:8083/fhem?XHR=1
[MASTER ] starting longpoll: http://192.168.2.219:8083/fhem?XHR=1&inform=type=status;addglobal=1;filter=.*;since=null;fmt=JSON&timestamp=1584825694286
[SLAVE ] starting longpoll: http://192.168.2.161:8083/fhem?XHR=1&inform=type=status;addglobal=1;filter=.*;since=null;fmt=JSON&timestamp=1584825695676
[MASTER ] Fetching FHEM devices...
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE
[MASTER ] got: 0 results
[SLAVE ] Fetching FHEM devices...
[SLAVE ] executing: http://192.168.2.161:8083/fhem?XHR=1&cmd=jsonlist2%20room%3DFHEMSync
[SLAVE ] got: 0 results
[MASTER ] Fetching FHEM devices...
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE
[MASTER ] got: 0 results
[SLAVE ] Fetching FHEM devices...
[SLAVE ] executing: http://192.168.2.161:8083/fhem?XHR=1&cmd=jsonlist2%20room%3DFHEMSync
[SLAVE ] got: 1 results
[MAIN ] Create device: EchoDot_Compizimmer
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=define%20EchoDot_Compizimmer%20FHEMSYNC_DEVICE%20echodevice%20EchoDot_Compizimmer
[MASTER ] Fetching FHEM devices...
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE
[SLAVE ] ["EchoDot_Compizimmer","connected","<div id=\"EchoDot_Compizimmer\" title=\"connected\" class=\"col2\"><a href=\"/fhem?cmd.EchoDot_Compizimmer=set EchoDot_Compizimmer off&room=FHEMSync\"><svg class=\" rc_GREEN\" data-txt=\"connected\" xmlns:dc=\"http://purl.org/dc/elements/1.1/\" xmlns:cc=\"http://creativecommons.org/ns#\" xmlns:rdf=\"http://www.w3.org/1999/02/22-rdf-syntax-ns#\" xmlns:svg=\"http://www.w3.org/2000/svg\" xmlns=\"http://www.w3.org/2000/svg\" version=\"1.0\" width=\"468pt\" height=\"468pt\" viewBox=\"0 0 468 468\" id=\"svg2\"> <defs id=\"defs12\" /> <metadata id=\"metadata4\"> Created by potrace 1.8, written by Peter Selinger 2001-2007 <rdf:RDF> <cc:Work rdf:about=\"\"> <dc:format>image/svg+xml</dc:format> <dc:type rdf:resource=\"http://purl.org/dc/dcmitype/StillImage\" /> <dc:title></dc:title> </cc:Work> </rdf:RDF> </metadata> <g transform=\"matrix(0.189474,0,0,-0.189474,0,468)\" id=\"g6\"> <path d=\"M 395,2455 C 244,2422 112,2322 57,2200 5,2083 6,2112 2,1260 0,702 2,446 10,400 43,208 198,52 398,11 c 71,-15 1628,-15 1692,0 174,40 322,190 365,370 22,94 22,1579 0,1683 -21,98 -66,187 -127,252 -61,64 -113,98 -201,128 -61,21 -75,21 -867,23 -640,1 -817,-1 -865,-12 z m 1737,-163 c 74,-36 132,-95 170,-170 l 23,-47 0,-840 0,-840 -28,-57 c -37,-76 -96,-134 -171,-169 l -63,-29 -839,2 -839,3 -50,27 C 273,204 196,285 167,348 l -22,47 0,840 0,840 27,52 c 53,100 128,162 227,189 35,10 236,12 856,11 l 810,-2 67,-33 z\" id=\"path8\" /> </g> <rect width=\"257.7966\" height=\"257.7966\" x=\"103.3661\" y=\"106.81353\" id=\"rect2997\" style=\"fill:#008000;fill-opacity:1;stroke:none\" /> </svg></a></div>"]
[SLAVE ] ["EchoDot_Compizimmer-progress","0","0"]
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=setreading%20EchoDot_Compizimmer%20progress%200
[SLAVE ] ["EchoDot_Compizimmer-progresslen","0","0"]
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=setreading%20EchoDot_Compizimmer%20progresslen%200
[SLAVE ] ["EchoDot_Compizimmer-shuffle","off","off"]
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=setreading%20EchoDot_Compizimmer%20shuffle%20off
[SLAVE ] ["EchoDot_Compizimmer-repeat","off","off"]
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=setreading%20EchoDot_Compizimmer%20repeat%20off
[SLAVE ] ["EchoDot_Compizimmer-volume","20","20"]
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=setreading%20EchoDot_Compizimmer%20volume%2020
[SLAVE ] ["EchoDot_Compizimmer-mute","off","off"]
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=setreading%20EchoDot_Compizimmer%20mute%20off
[MASTER ] executing: http://192.168.2.219:8083/fhem?XHR=1&cmd=set%20EchoDot_Compizimmer%20x_json%20%7B%22Name%22%3A%22EchoDot_Compizimmer%22%2C%22PossibleSets%22%3A%22volume%3Aslider%2C0%2C1%2C100%20play%3AnoArg%20pause%3AnoArg%20next%3AnoArg%20previous%3AnoArg%20forward%3AnoArg%20rewind%3AnoArg%20shuffle%3Aon%2Coff%20repeat%3Aon%2Coff%20dnd%3Aon%2Coff%20volume_alarm%3Aslider%2C0%2C1%2C100%20info%3ABeliebig_Auf_Wiedersehen%2CBeliebig_Bestaetigung%2CBeliebig_Geburtstag%2CBeliebig_Guten_Morgen%2CBeliebig_Gute_Nacht%2CBeliebig_Ich_Bin_Zuhause%2CBeliebig_Kompliment%2CErzaehle_Geschichte%2CErzaehle_Was_Neues%2CErzaehle_Witz%2CKalender_Heute%2CKalender_Morgen%2CKalender_Naechstes_Ereignis%2CNachrichten%2CSinge_Song%2CVerkehr%2CWetter%20sounds%3Aglocken%2Ckirchenglocke%2Csummer%2Ctuerklingel_1%2Ctuerklingel_2%2Ctuerklingel_3%2Cjubelnde_menschenmenge%2Cpublikumsapplaus%2Cflugzeug%2Ckatastrophenalarm%2Cmotoren_an%2Cschilde_hoch%2Csirenen%2Czappen%2Cboing_1%2Cboing_2%2Ckamera%2Clufthupe%2Cquitschende_tuer%2Ctickende_uhr%2Ctrompete%2Chahn%2Chundegebell%2Ckatzenmauzen%2Cloewengebruell%2Cwolfsgeheul%2Cgruselig_quitschende_tuer%2Cweihnachtsglocken%20tunein%20primeplaylist%20primeplaysender%20primeplayeigene%20primeplayeigeneplaylist%20alarm_normal%20alarm_repeat%20reminder_normal%20reminder_repeat%20speak%20speak_ssml%20tts%20tts_translate%3AtextField-long%20playownmusic%3AtextField-long%20saveownplaylist%3AtextField-long%20track%20bluetooth_connect%3A-%20bluetooth_disconnect%3A-%20routine_play%3A%40amzn1.alexa.behaviors.preconfigured%3Aalarm_dismissed_with_condition_preconfigured_routine%2CIch_bin_zuhause%40amzn1.alexa.behaviors.preconfigured%3Aim_home_custom_utt_trigger%2Cgute_Nacht%40amzn1.alexa.behaviors.preconfigured%3Agood_night_custom_utt_trigger%2Clichter_aus%40amzn1.alexa.automation.0a44666b-0c4a-4807-b121-04676b5ef6a2%2Clichter_ein%40amzn1.alexa.automation.4697b58e-f964-4891-b9ba-492eee2e2453%2Cstarte_meinen_Tag%40amzn1.alexa.behaviors.preconfigured%3Astart_my_day_custo
m_utt_trigger%20%22%2C%22PossibleAttrs%22%3A%22alias%20comment%3AtextField-long%20eventMap%3AtextField-long%20group%20room%20suppressReading%20userReadings%3AtextField-long%20verbose%3A0%2C1%2C2%2C3%2C4%2C5%20disable%3A0%2C1%20IODev%20TTS_Voice%3AAustralianEnglish_Female_Nicole%2CAustralianEnglish_Male_Russell%2CBrazilianPortuguese_Female_Vitoria%2CBrazilianPortuguese_Male_Ricardo%2CBritishEnglish_Female_Amy%2CBritishEnglish_Female_Emma%2CBritishEnglish_Male_Brian%2CCanadianFrench_Female_Chantal%2CCastilianSpanish_Female_Conchita%2CCastilianSpanish_Male_Enrique%2CDanish_Female_Naja%2CDanish_Male_Mads%2CDutch_Female_Lotte%2CDutch_Male_Ruben%2CFrench_Female_Celine%2CFrench_Male_Mathieu%2CGerman_Female_Google%2CGerman_Female_Marlene%2CGerman_Female_Vicki%2CGerman_Male_Hans%2CIcelandic_Female_Dora%2CIcelandic_Male_Karl%2CIndianEnglish_Female_Aditi%2CIndianEnglish_Female_Raveena%2CItalian_Female_Carla%2CItalian_Male_Giorgio%2CJapanese_Female_Mizuki%2CJapanese_Male_Takumi%2CKorean_Female_Seoyeon%2CNorwegian_Female_Liv%2CPolish_Female_Ewa%2CPolish_Female_Maja%2CPolish_Male_Jacek%2CPolish_Male_Jan%2CPortuguese_Female_Ines%2CPortuguese_Male_Cristiano%2CRomanian_Female_Carmen%2CRussian_Female_Tatyana%2CRussian_Male_Maxim%2CSwedish_Female_Astrid%2CTurkish_Female_Filiz%2CUSEnglish_Female_Ivy%2CUSEnglish_Female_Joanna%2CUSEnglish_Female_Kendra%2CUSEnglish_Female_Kimberly%2CUSEnglish_Female_Salli%2CUSEnglish_Male_Joey%2CUSEnglish_Male_Justin%2CUSEnglish_Male_Matthew%2CUSSpanish_Female_Penelope%2CUSSpanish_Male_Miguel%2CWelshEnglish_Female_Gwyneth%2CWelshEnglish_Male_Geraint%20TTS_IgnorPlay%3A0%2C1%20TTS_normalize%3Aslider%2C5%2C1%2C40%20TTS_Translate_From%3Adutch%2Cenglish%2Cfrench%2Cgerman%2Citalian%2Cjapanese%2Ckorean%2Cportuguese%2Crussian%2Cspanish%2Cturkish%20intervalsettings%20intervallogin%20intervalvoice%3Aslider%2C0%2C1%2C100%20ignorevoicecommand%20speak_volume%3Aslider%2C0%2C1%2C100%20server%20cookie%20reminder_delay%20tunein_default%20autocreate_refresh%3A0%2C1%20brows
er_useragent%20browser_language%20browser_save_data%3A0%2C1%20browser_useragent_random%3A0%2C1%20npm_proxy_port%20npm_proxy_ip%20npm_proxy_listen_ip%20npm_refresh_intervall%20npm_bin%20npm_bin_node%20event-aggregator%20event-min-interval%20event-on-change-reading%20event-on-update-reading%20oldreadings%20stateFormat%3AtextField-long%20timestamp-on-change-reading%20cmdIcon%20devStateIcon%20devStateIcon%3AtextField-long%20devStateStyle%20icon%20lightSceneParamsToSave%20lightSceneRestoreOnlyIfChanged%3A1%2C0%20sortby%20structexclude%20webCmd%20webCmdLabel%3AtextField-long%20widgetOverride%20userattr%22%2C%22Internals%22%3A%7B%22DEF%22%3A%22A3S5BH2HU6VAYF%20G090LF1182340FQB%22%2C%22FUUID%22%3A%225e625a38-f33f-55ca-8296-6aea7706b8c5e20f%22%2C%22LOGINMODE%22%3A%22IODEV%22%2C%22NAME%22%3A%22EchoDot_Compizimmer%22%2C%22NR%22%3A%2223%22%2C%22NTFY_ORDER%22%3A%2250-EchoDot_Compizimmer%22%2C%22STATE%22%3A%22connected%22%2C%22TYPE%22%3A%22echodevice%22%2C%22model%22%3A%22Echo%20Dot%22%7D%2C%22Readings%22%3A%7B%22BrowserLanguage%22%3A%7B%22Value%22%3A%22de%2Cen-US%3B%3Bq%3D0.7%2Cen%3B%3Bq%3D0.3%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22BrowserUserAgent%22%3A%7B%22Value%22%3A%22Mozilla%2F5.0%20(Windows%20NT%2010.0%3B%3B%20Win64%3B%3B%20x64%3B%3B%20rv%3A62.0)%20Gecko%2F20100101%20Firefox%2F62.0%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22COOKIE_MODE%22%3A%7B%22Value%22%3A%22IODEV%22%2C%22Time%22%3A%222020-03-21%2022%3A09%3A40%22%7D%2C%22alarm_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A53%22%7D%2C%22channel%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22currentAlbum%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22currentArtist%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22currentArtwork%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22currentTitle%22%3A%7B%22Value%22%3A%22
-%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22currentTuneInID%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22deviceAddress%22%3A%7B%22Value%22%3A%22Gwendweg%206%2C%20Schwarzenfeld%2C%20Bayern%2C%20DE%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A56%22%7D%2C%22dnd%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A04%22%7D%2C%22microphone%22%3A%7B%22Value%22%3A%22false%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A56%22%7D%2C%22model%22%3A%7B%22Value%22%3A%22Echo%20Dot%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A55%22%7D%2C%22musicalarm_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A53%22%7D%2C%22mute%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A35%22%7D%2C%22online%22%3A%7B%22Value%22%3A%22true%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A55%22%7D%2C%22playStatus%22%3A%7B%22Value%22%3A%22stopped%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22presence%22%3A%7B%22Value%22%3A%22present%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A55%22%7D%2C%22progress%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22progresslen%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A39%22%7D%2C%22reminder_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A53%22%7D%2C%22repeat%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A35%22%7D%2C%22shuffle%22%3A%7B%22Value%22%3A%22off%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A35%22%7D%2C%22state%22%3A%7B%22Value%22%3A%22connected%22%2C%22Time%22%3A%222020-03-21%2022%3A14%3A13%22%7D%2C%22timeZoneId%22%3A%7B%22Value%22%3A%22Europe%2FParis%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A56%22%7D%2C%22timer_count%22%3A%7B%22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A53%22%7D%2C%22timer_id%22%3A%7B%22Value%22%3A%22-%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A53%22%7D%2C%22timer_remainingtime%22%3A%7B%
22Value%22%3A%220%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A53%22%7D%2C%22version%22%3A%7B%22Value%22%3A%22651614420%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A55%22%7D%2C%22voice%22%3A%7B%22Value%22%3A%22sprich%20mir%20nach%20die%20haust%C3%BCr%20wurde%20ge%C3%B6ffnet%22%2C%22Time%22%3A%222020-03-21%2016%3A52%3A45%22%7D%2C%22voice_timestamp%22%3A%7B%22Value%22%3A%221584805920884%22%2C%22Time%22%3A%222020-03-21%2016%3A52%3A45%22%7D%2C%22volume%22%3A%7B%22Value%22%3A%2220%22%2C%22Time%22%3A%222020-03-21%2022%3A25%3A35%22%7D%2C%22volume_alarm%22%3A%7B%22Value%22%3A%2270%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A07%22%7D%2C%22wakeword%22%3A%7B%22Value%22%3A%22ALEXA%22%2C%22Time%22%3A%222020-03-21%2022%3A26%3A02%22%7D%7D%2C%22Attributes%22%3A%7B%22IODev%22%3A%22myRaspiEcho%22%2C%22alias%22%3A%22Echo%20Dot%20Compizimmer%22%2C%22devStateIcon%22%3A%22connected%3Arc_GREEN%3Aoff%20connected%20but%20loginerror%3Arc_RED%3Aon%22%2C%22icon%22%3A%22echo%22%2C%22room%22%3A%22FHEMSync%2CAmazon%22%7D%7D
[MASTER ] got: 1 results
[MASTER ] longpoll end: retry in: 200msec
The FHEM log contains the following lines:
2020.03.21 22:21:26.862 3: fhemsync: starting
2020.03.21 22:21:26.904 3: fhemsync: using logfile: ./log/fhemsync-2020-03-21.log
Can't use string ("EchoDot_Compizimmer") as a HASH ref while "strict refs" in use at ./FHEM/10_FHEMSYNC_DEVICE.pm line 121.
Thanks for the log, it made the bug immediately obvious. Update in the first post (FHEMSYNC_DEVICE).
It was my mistake when adding the attribute sync.
There, now it works again!
I will also test whether it works with SSL.
How can I change the location of the logfile? All my logs live under /var/log/fhem.
At what interval does fhemsync look for new devices, and can that be triggered manually without restarting fhemsync?
The following message is showing up in the FHEM log again:
2020.03.21 23:17:44.347 3: EchoDot_Compizimmer: unknown attribute IODev. Type 'attr EchoDot_Compizimmer ?' for a detailed list.
2020.03.21 23:17:44.348 3: attr EchoDot_Compizimmer IODev myRaspiEcho : EchoDot_Compizimmer: unknown attribute IODev. Type 'attr EchoDot_Compizimmer ?' for a detailed list.
The logfile path is still hardcoded. An update to make it configurable will follow.
Currently it looks for new devices every 5 minutes. For now the only way to trigger that manually is a restart.
I will take a closer look at the error message tomorrow. Until then it will appear every 5 minutes on your system.
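The 5-minute device discovery can be sketched as follows (the structure and names are assumptions; `fetchJson` is a stand-in for the HTTP request fhemsync actually performs). The generated URL matches the repeated "Fetching FHEM devices" lines in the log above.

```javascript
// Poll the remote FHEM instance every 5 minutes for devices matching the
// configured filter (e.g. room=FHEMSync), using FHEMWEB's jsonlist2 command.
const POLL_INTERVAL_MS = 5 * 60 * 1000;

function buildDeviceListUrl(server, port, webname, filter) {
  return `http://${server}:${port}/${webname}?XHR=1&cmd=` +
    encodeURIComponent(`jsonlist2 ${filter}`);
}

function startDevicePolling(fetchJson, onDevices) {
  const url = buildDeviceListUrl('192.168.2.161', 8083, 'fhem', 'room=FHEMSync');
  const poll = () => fetchJson(url).then(onDevices).catch(console.error);
  poll();                                     // fetch once at startup
  return setInterval(poll, POLL_INTERVAL_MS); // then every 5 minutes
}
```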
OK, thanks again and good night!
The bug would not let me rest after all. Update 10_FHEMSYNC_DEVICE from the first post, then restart fhemsync. Please let me know whether it is fixed.
Good night!
I have an MQTT2 device that still causes problems.
Here is an excerpt from the log:
[MASTER ] ["MQTT2_Hubert","Name: Hubert Device: Samsung S9Plus Position: 49.2529924,9.127676","<div id=\"MQTT2_Hubert\" title=\"Name: Hubert Device: Samsung S9Plus Position: 49.2529924,9.127676\" class=\"col2\">Name: Hubert Device: Samsung S9Plus Position: 49.2529924,9.127676</div>"]
[MASTER ] ["MQTT2_Hubert","COMMAND,?"]
[SLAVE ] executing: http://xxx:8083/fhem?XHR=1&cmd=set%20MQTT2_Hubert%20%3F
[MASTER ] ["MQTT2_Hubert","COMMAND,?"]
[SLAVE ] executing: http://xxx:8083/fhem?XHR=1&cmd=set%20MQTT2_Hubert%20%3F
[MASTER ] ["MQTT2_Hubert","COMMAND,?"]
[SLAVE ] executing: http://xxx:8083/fhem?XHR=1&cmd=set%20MQTT2_Hubert%20%3F
[MASTER ] ["MQTT2_Hubert","COMMAND,?"]
[SLAVE ] executing: http://xxx:8083/fhem?XHR=1&cmd=set%20MQTT2_Hubert%20%3F
(node:24067) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at getEncodingOps (buffer.js:664:24)
at fromString (buffer.js:432:11)
at Function.from (buffer.js:288:12)
at toBase64 (/usr/lib/node_modules/fhemsync/node_modules/request/lib/helpers.js:39:17)
at Auth.basic (/usr/lib/node_modules/fhemsync/node_modules/request/lib/auth.js:30:33)
at Auth.onRequest (/usr/lib/node_modules/fhemsync/node_modules/request/lib/auth.js:136:23)
at Request.auth (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:1341:14)
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:378:10)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
(node:24067) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
Good morning!
Unfortunately the error is still there!
What worries me more is the fhemsync log, which has grown to 1.5 MB within 20 minutes (the Echo Dots are probably very chatty)!
And so far I only have one of my three Dots in the FHEMSync room!
@carlos, does the error occur when you open the device on the master FHEM? I will filter out the "?" command.
@punker, the logging will be reduced; during the test phase it is helpful to keep it fairly verbose, though.
Does the error only happen with IODev? Are the other attributes transferred? I just tested it again here, and it looks like IODev is a special attribute, so I will exclude it from the sync.
fhemsync 1.0.9
- The ? command is now filtered out. carlos, please test whether that fixes your problem.
- The IODev attribute is now filtered out. punker, please test whether the error is gone from your log.
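The two 1.0.9 filters can be sketched like this (function names are illustrative, not the module's actual code). FHEMWEB emits `set <device> ?` to query the available set commands, and forwarding that back to the remote instance caused the endless loop visible in the MQTT2 log above.

```javascript
// Drop "?" set commands before forwarding them to the remote instance.
// args is everything after "set <device>", e.g. "?" or "volume 20".
function shouldForwardSet(args) {
  return args.trim() !== '?';
}

// IODev refers to an IO device that only exists on the remote side, so it
// is skipped along with room (which was never synchronized).
const SKIPPED_ATTRIBUTES = ['room', 'IODev'];
function shouldSyncAttribute(name) {
  return !SKIPPED_ATTRIBUTES.includes(name);
}
```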
The error is gone!
As far as I can see, all the other attributes are transferred!
I will just have to clear the log from time to time while the test phase lasts.
Not a problem.
Thanks again, I had almost stopped believing you would pull this off!
Hello dominik,
when I try to enter user and password, I get this message:
Quote: "stored obfuscated auth data"
I think it is caused by the special characters in my password. Is there a solution?
And why are user and password stored in plain text in remote-auth?
Regards, Jens
That message is fine; it just means the data was stored in "encrypted" (obfuscated) form.
The remote password is not encrypted yet; that will come in one of the next updates.
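Purely for illustration, "obfuscated" storage could look like the sketch below. The `crypt:` prefix matches the `crypt:......` value mentioned later in the thread, but the module's real scheme is not shown here; base64 is trivially reversible, so this is obfuscation, not encryption.

```javascript
// Obfuscate "user:password" as crypt:<base64>. Special characters such as
// ";" survive the round trip because base64 encodes the raw bytes.
function obfuscateAuth(user, password) {
  return 'crypt:' + Buffer.from(`${user}:${password}`, 'utf8').toString('base64');
}

function deobfuscateAuth(stored) {
  return Buffer.from(stored.replace(/^crypt:/, ''), 'base64').toString('utf8');
}
```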
Hello Dominik,
the MQTT2 device is now transferred without errors.
There is one problem with the userreadings: they use a sub from myutils that does not exist on the master.
But that is not a problem caused by FHEMSYNC, nor one that could be solved there.
Other than that I am happy with the functionality.
One more suggestion:
I do not find the approach of putting devices into a dedicated room ideal; perhaps this could be solved better with a user attribute.
It is probably a matter of taste.
Regards,
Carlos
Quote from: carlos on 22 March 2020, 10:56:47
One more suggestion:
I do not find the approach of putting devices into a dedicated room ideal; perhaps this could be solved better with a user attribute.
It is probably a matter of taste.
On the one hand that is true, an attribute would work; on the other hand, the FHEMSync room shows me at a glance which devices I am syncing!
Quote from: carlos on 22 March 2020, 10:56:47
Hello Dominik,
the MQTT2 device is now transferred without errors.
There is one problem with the userreadings: they use a sub from myutils that does not exist on the master.
But that is not a problem caused by FHEMSYNC, nor one that could be solved there.
Other than that I am happy with the functionality.
One more suggestion:
I do not find the approach of putting devices into a dedicated room ideal; perhaps this could be solved better with a user attribute.
It is probably a matter of taste.
Regards,
Carlos
Good point about the userReadings. Actually, I can exclude userReadings from the sync entirely, because the resulting readings are transferred anyway, so you don't need the userReadings themselves on the master. Am I seeing that right? If so, I'll remove them in the next update.
Your suggestion is already possible today: you can adjust remote-filter and point it at an attribute instead of a room.
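For instance, a hypothetical attribute-based setup might look like this (the user attribute name `syncme` is purely illustrative; remote-filter takes a FHEM devspec expression, like the default `room=FHEMSync`):

```
attr global userattr syncme:0,1
attr MyRemoteDevice syncme 1
attr fhemsync remote-filter syncme=1
```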
Quote: That message is fine; it just means the data was stored in "obfuscated" form.
But I cannot create the
FHEMSync-auth attribute
at all when that message appears.
Hello Dominik,
you're right, the readings are synchronized anyway, so the userReadings aren't needed.
From my point of view you can remove them.
Regarding remote-filter: good idea, I'll give that a try.
Regards,
Carlos
Quote from: Newbie on 22 March 2020, 11:40:37
But I cannot create the FHEMSync-auth attribute
when that error message appears
Does FHEMSync-auth then show a value starting with crypt:......? If so, everything was saved.
fhemsync 1.0.10
- the userReadings attribute is no longer synchronized
I defined a global user attribute FHEMSync for myself; if you set it to true and adjust the filter accordingly, it also works without the room.
Very nice.
However, with version 1.0.10 the userReadings are still being synchronized for me.
Quote from: carlos on 22 March 2020, 14:29:26
I defined a global user attribute FHEMSync for myself; if you set it to true and adjust the filter accordingly, it also works without the room.
Very nice.
However, with version 1.0.10 the userReadings are still being synchronized for me.
Are they called userreadings or userReadings? According to the docs it's userReadings, and that's what I filtered on.
They are called userReadings, as you write, and yes, they are still being synchronized.
Hello dominik,
Quote: Does FHEMSync-auth then show a value starting with crypt:......? If so, everything was saved.
no
Quote from: carlos on 22 March 2020, 15:08:07
They are called userReadings, as you write, and yes, they are still being synchronized.
Can you delete the device once and then restart fhemsync?
Quote from: Newbie on 22 March 2020, 16:18:43
Hello dominik,
no
Can you post the log output from fhemsync, and check whether anything shows up in the FHEM log? How are you setting the attribute?
The command should look like this: attr fhemsync FHEMSync-auth user:password
Update fhemsync 2.0.1
ATTENTION: Requires an update of 10_FHEMSYNC and 10_FHEMSYNC_DEVICE from the 1st post!
Changes:
- logging reduced to errors only
- FHEMSync-log lets you change the log file location
- nrarchive defines the maximum number of log files
- remote-auth is now stored encrypted; old versions are therefore no longer compatible
Well, unfortunately my FHEMSync no longer starts with the new files!
I also ran the update with sudo npm install -g fhemsync.
Log:
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"2.0.1","fhem":true,"webname":true,"device":true,"ssl":true,"selfSignedCert":true}
[MASTER ] FHEMSYNC device detected: fhemsync
(node:30690) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'substr' of undefined
at main (/usr/lib/node_modules/fhemsync/fhemsync.js:410:13)
(node:30690) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:30690) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
Please update to 2.0.2. This time I had only tested with auth enabled; now it works without it too :)
Thanks, it works again!
It's nice when you have nothing better to do all day than sit in front of the PC, right ;)
Hello Dominik,
these are my userReadings, and they are still being synchronized:
attr MQTT2_Hubert userReadings latitude {my $latlon=ReadingsVal($NAME,"latlon",0);;;;my @spl = split(',',$latlon);;;;return $spl[0]},\
longitude {my $latlon=ReadingsVal($NAME,"latlon",0);;;;my @spl = split(',',$latlon);;;;return $spl[1]},\
position {mqtt2position($NAME,"")}
My main FHEM instance crashed several times again overnight! :(
I then deleted FHEMsync, and there have been no crashes since!
I reconnected my 3 Echo Dots directly in the main instance.
I'll keep experimenting a bit with FHEMsync.
Maybe the Echo Dots are to blame for the crashes?
Please also check the FHEM log for related errors; that would make debugging easier for me. Thanks.
Hello,
I can't get it running. The FHEMSync device is defined:
[MAIN ] Starting FHEMSync...
[MAIN ] Options: {"version":"2.0.2","fhem":true,"port":true,"webname":true,"auth":true,"device":true}
[MASTER ] Please define FHEMSYNC device in FHEM: define fhemsync FHEMSYNC
2020.03.23 17:18:53.917 4: Connection closed for WEB_X.X.X.X_53260: EOF
2020.03.23 17:18:53.931 4: Connection accepted from WEB_X.X.X.X_53262
2020.03.23 17:18:53.932 4: WEB_X.X.X.X_53262 GET /WEB?XHR=1&cmd=jsonlist2%20fhemsync&fwcsrf=csrf_481264602983146; BUFLEN:0
2020.03.23 17:18:53.932 4: WEB: redirecting /WEB?XHR=1&cmd=jsonlist2%20fhemsync&fwcsrf=csrf_481264602983146 to /fhem
2020.03.23 17:18:53.935 4: Connection accepted from WEB_X.X.X.X_53264
2020.03.23 17:18:53.936 4: WEB_X.X.X.X_53264 GET /fhem; BUFLEN:0
2020.03.23 17:18:53.938 4: WEB: /fhem / RL:1804 / text/html; charset=UTF-8 / Content-Encoding: gzip
/ Cache-Control: no-cache, no-store, must-revalidate
2020.03.23 17:18:53.938 4: Connection closed for WEB_X.X.X.X_53262: EOF
2020.03.23 17:18:53.941 4: Connection closed for WEB_X.X.X.X_53264: EOF
2020.03.23 17:18:53.968 3: fhemsync: read: end of file reached while sysread
2020.03.23 17:18:53.970 3: fhemsync: stopped
2020.03.23 17:18:53.972 4: fhemsync: last run duration was only 0 sec, restarting with delay
2020.03.23 17:19:01.751 4: Connection closed for WEB_X.X.X.X_40746: EOF
2020.03.23 17:19:01.755 4: Connection accepted from WEB_X.X.X.X_40748
2020.03.23 17:19:01.756 4: WEB_X.X.X.X_40748 GET /fhem/FileLog_logWrapper?dev=Logfile&type=text&file=fhem-2020-03.log; BUFLEN:0
defmod fhemsync FHEMSYNC
attr fhemsync FHEMSync-auth crypt:...
attr fhemsync FHEMSync-log ./log/fhemsync-%Y-%m-%d.log
attr fhemsync FHEMSync-port 8083
attr fhemsync FHEMSync-server X.X.X.X
attr fhemsync FHEMSync-ssl true
attr fhemsync FHEMSync-webname WEB
attr fhemsync devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
attr fhemsync nrarchive 10
attr fhemsync remote-auth crypt:...
attr fhemsync remote-port 8083
attr fhemsync remote-server X.X.X.X
attr fhemsync remote-ssl true
attr fhemsync remote-webname WEB
attr fhemsync room FHEMSync
attr fhemsync stateFormat fhemsync
setstate fhemsync stopped
setstate fhemsync 2020-03-23 17:20:54 fhemsync stopped
I wanted to test a bit more today, but it didn't work at all anymore, so I deleted it!
There were no log entries left!
According to the log, FHEMsync started correctly and without errors, but it didn't receive any devices from the remote server!
@Newbie, please try changing
attr fhemsync FHEMSync-webname WEB => attr fhemsync FHEMSync-webname fhem
and
attr fhemsync FHEMSync-selfsignedcert true
@punker, that is strange, especially since it had already worked before. Later today I'll provide a version with a verbose=5 option (i.e. plenty of log output), which should make the error easier to pin down.
The FHEM crash can really only be caused by some readings that were sent overnight. I'll also go over the code again so I can handle that more defensively.
@Newbie, please try changing
attr fhemsync FHEMSync-webname WEB => attr fhemsync FHEMSync-webname fhem
and
attr fhemsync FHEMSync-selfsignedcert true
Thanks, that was it!
Update fhemsync 2.0.3 with verbose levels for logging:
0...no logging
1...errors
2,3...startup info
4...info on every reading/attribute update
5...debug
The two module files from the 1st post need to be updated as well.
@punker, please run it overnight with verbose=5 (set on the fhemsync device) and then send me the log via PM so I can take a closer look at the crashes. Thanks!
Hello dominik,
somehow the umlauts get lost in transfer (e.g. Küche becomes K�che).
Any idea?
Best regards, Jens
Thanks for the info; I'll look into it, and it should be fixable.
I just tested this and created a reading with umlauts via setreading; for me they are transferred correctly. Can you also test on your end with setreading?
Hello dominik,
I take it all back, it has nothing to do with your module; I just hadn't noticed it before. Sorry.
Hello, my FHEMSync is stopped; the log outputs the following:
[MAIN ] Starting FHEMSync version 2.0.3...
[MAIN ] Options: {"version":"2.0.3","fhem":true,"port":true,"webname":true,"device":true,"ssl":true,"selfSignedCert":true}
(node:17373) UnhandledPromiseRejectionWarning: StatusCodeError: 401 - undefined
at new StatusCodeError (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/errors.js:32:15)
at Request.plumbing.callback (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:104:33)
at Request.RP$callback [as _callback] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:46:31)
at Request.self.callback (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:185:22)
at Request.emit (events.js:315:20)
at Request.<anonymous> (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:1154:10)
at Request.emit (events.js:315:20)
at IncomingMessage.<anonymous> (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:1076:12)
at Object.onceWrapper (events.js:421:28)
at IncomingMessage.emit (events.js:327:22)
at endReadableNT (_stream_readable.js:1201:12)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
(node:17373) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:17373) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
Where is my mistake? Both of my Raspberry Pis are reachable via https!
Here are the attributes that are set:
FHEMSync-filter: room=FHEMSync
FHEMSync-log: ./log/fhemsync-%Y-%m-%d.log
FHEMSync-port: 8083
FHEMSync-selfsignedcert: false
FHEMSync-server: 127.0.0.1
FHEMSync-ssl: false
FHEMSync-webname: fhem
devStateIcon: stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
nrarchive: 10
remote-server: xxx.xxx.xxx.xxx
room: FHEMSync
stateFormat: fhemsync
Hi,
401 actually means "Unauthorized". Were the username and password perhaps stored incorrectly?
Hello, to be honest I haven't assigned a username or password at all. I didn't set the auth attribute.
A question about your setup instructions...
- In the main FHEM instance:
- define fhemsync FHEMSYNC
- attr fhemsync remote-server IP-OF-REMOTE-FHEM
- The following additional values can be set if they differ from the defaults; defaults:
server: MUST BE SET
port: 8083
webname: fhem
filter: room=FHEMSync
auth: ""
ssl: false
selfsignedcert: false
- Set FHEMSync-* values if they differ from the defaults; defaults:
server: 127.0.0.1
port: 8083
webname: fhem
auth: ""
ssl: false
selfsignedcert: false
...what is meant by FHEMSync-*? Is that something on the remote instance, or on the main instance? You already wrote briefly about the main instance's attributes above, so it isn't clear what exactly is supposed to be configured here!
On the remote instance, I only put the devices into the FHEMSync room, and beforehand copied the two xxx.pm files into the FHEM directory and reloaded them!
Hi,
all FHEMSync-* attributes refer to the FHEMSync server, i.e. the host FHEMSync is running on. remote-* then refers to the remote instance.
Are you perhaps using SSL? If so, FHEMSync-ssl should be true, and probably -selfsignedcert true as well.
I set SSL and cert to true on the main instance -> no success.
About the setup again: I copied and loaded 10_FHEMSYNC and 10_FHEMSYNC_DEVICE on both instances.
Was that correct?
Also, how do I set the FHEMSync-auth and remote-auth attributes correctly?
<Username> <Password>
You only need to install FHEMSync on the master; nothing has to be installed on the remote FHEM.
Normally nothing needs to be configured via attributes on the master either, as long as you stick to the defaults and don't use https.
Format of the *-auth attributes: user:password
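As a concrete example with placeholder credentials (replace `myuser:mysecret` with your own):

```
attr fhemsync remote-auth myuser:mysecret
```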
Thanks, it works now!
Here's what I did: I deleted the *.pm files from the remote's FHEM directory and restarted.
On the main instance I set remote-ssl and FHEMSync-ssl to true, and set remote-auth and FHEMSync-auth.
I also set remote-selfsignedcert and FHEMSync-selfsignedcert.
Regards
Hello @dominik,
I tested your module today. Whether it runs stably remains to be seen ;)
The interplay wasn't entirely clear to me at first either.
Luckily I then read that FHEMSync only needs to run on the master.
On my ancient RPi (client), the npm package wouldn't even install anymore.
I only took in the hint that the remote FHEM credentials have to be written as user:password on the third read ;=)
What about SVG graphs?
The devices are created, but without the log files nothing gets plotted. The FileLog devices are created, but the log files don't exist on the master system.
So there's no data for the SVGs.
The "°" of degrees Celsius is displayed as "�" on the master. The other umlauts are shown as "�" too.
Is that a configuration/UTF-8 problem?
Thanks in the meantime for your module.
Regards, Gerd
Hi Gerd,
thanks for the feedback!
I noticed the umlaut issue yesterday too; I'll fix it, and I'll also extend the instructions and add them to the commandref.
Regarding FileLog: you can create the FileLog on the master device instead of on the remote device, since all events of the device are triggered on the master as well. Let me know whether that's a workable solution for you.
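A hypothetical sketch of that suggestion, with purely illustrative names (assuming the synced device on the master is called MySyncedDevice and emits temperature events):

```
define FileLog_MySyncedDevice FileLog ./log/MySyncedDevice-%Y-%m.log MySyncedDevice:temperature.*
```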
Regards, Dominik
Umlauts fixed, update in the 1st post.
Morning Dominik,
I may get around to it this evening.
Thanks!
Regards, Gerd
Quote from: dominik on 04 April 2020, 09:34:39
Umlauts fixed, update in the 1st post.
Hello Dominik,
umlauts and special characters are OK now.
Thanks
@dominik
your modules are working so far.
One thing I've now noticed:
I have moved my device definitions out into various cfg files, which I load from fhem.cfg with e.g.
include /opt/fhem/FHEM/Firmata_Arduino-Mega_USB.cfg
This configuration info is also stored in the device on the slave.
Now the master FHEM throws messages like the following on every restart:
Quote: Messages collected while initializing FHEM:
./log/fhem.save: Please define FA_26_A2D984000007 first
Please define FA_26_A2D984000007 first (repeated several times)
Please define FIRMATA first
Please define FI_26_A3D984001605 first (repeated several times)
Please define HI_28_A2D984001677 first (repeated several times)
etc....
These are 1-Wire sensors on the Firmata master.
Is this behavior normal/known/intended? ;)
If I add
include /opt/fhem/FHEM/Firmata_Arduino-Mega_USB.cfg
to the master's fhem.cfg, it naturally tries to open the USB port on the master.
The synced device on the master carries the reference to the cfg file:
Quote: CFGFN /opt/fhem/FHEM/Firmata_Arduino-Mega_USB.cfg
After a restart, this information naturally isn't there yet; the synced device is only fetched from the slave after the master has started.
Have a nice weekend.
Regards, Gerd
Hi,
is it the CFGFN internal that causes this? If so, I can exclude it from synchronization, which should fix it.
Morning,
I'd assume so.
Could it perhaps be made configurable via an attribute?
Please try the attached file; first delete the device that has CFGFN set, then restart FHEMSync.
Hey Dominik,
great. When I click the FHEM house icon, no messages appear anymore!
I stopped FHEMSync and deleted all the sync devices on the master.
After starting FHEMSync, all devices were back and no messages were shown.
Deleted everything again and did a shutdown restart.
Again no messages!
Great. Now I can carefully continue experimenting ;)
Thanks and regards,
Gerd
PS: Maybe you could note on the start page which version is currently available for download, or even keep the version + date current in the module file headers ;)
Or even show the version in the device.
Perfect, thanks for the quick test!
Good point about versioning; I'll add that with the next update. Thanks!
Hello Dominik,
log files and plots also work, as far as I can tell.
How often does the master synchronize with the slave? Is that device-dependent or fixed?
What happens if the slave doesn't respond in FHEMWEB for the duration of the 1-Wire polling?
Is there a timeout? I haven't converted my Firmata OWX yet,
and as long as the queries on the six 1-Wire buses are running, FHEM doesn't respond.
Tomorrow I'll see whether the SVGs look the same on both FHEM instances ;)
Regards,
Gerd
Readings and attributes are updated on the master immediately. Every 5 minutes (currently hardcoded) it checks whether
- there are new devices on the slave
- attributes were deleted on the slave
- readings were deleted on the slave
You can also run any command on the master, and it is sent to the slave immediately. There you can see nicely that the readings update right away.
If the connection hangs, a reconnect is first attempted after 200 ms; if that fails repeatedly, the delay is increased. I believe the maximum is 1 minute, but I'm not in front of the code right now.
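The retry behavior described here can be sketched as a simple exponential backoff. This is an illustration only, not the actual fhemsync code: the doubling factor and the names BASE_MS, MAX_MS, and nextDelay are assumptions.

```javascript
// Illustrative backoff sketch (not the real fhemsync implementation):
// first retry after 200 ms, delay grows on repeated failures, capped at 1 min.
const BASE_MS = 200;       // initial retry delay
const MAX_MS = 60 * 1000;  // assumed 1 minute maximum

function nextDelay(failedAttempts) {
  // 0 failures -> 200 ms, 1 -> 400 ms, 2 -> 800 ms, ... capped at 60 s
  return Math.min(BASE_MS * 2 ** failedAttempts, MAX_MS);
}
```

With this scheme the delay reaches the 1-minute cap after nine consecutive failures.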
Okay... thanks.
But the "get" is now missing, e.g. on the OWTHERM devices ;)
See the two screenshots, one from the slave and one from the master. It's not important for me in this case, but get exists in other modules too.
In this case, get can be used to trigger a new query (alarm, id, temperature, version).
But now, off to bed ;)
Regards, Gerd
True, get isn't integrated yet; I can add that later :)
Hello Dominik,
could you extend the module to support multiple remote Raspberry Pis? I'd like to collect data from several Pis around the house on my main Pi.
Regards, Maik
@Gerd, can you please try the current version 0.9.0 (1st post) and fhemsync 2.1.0 (npm install -g fhemsync)? get should work with that now.
@Maik, I'll gladly look into it over the next few days. It shouldn't be a problem.
Morning Dominik,
I'll do it when I find the time ;)
Thanks, and a nice Sunday to everyone.
Gerd
@Dominik
Quote from: dominik on 05 April 2020, 09:13:09
@Gerd, can you please try the current version 0.9.0 (1st post) and fhemsync 2.1.0 (npm install -g fhemsync)? get should work with that now.
I have updated fhemsync.
Quote: root@rpi0:~# npm install -g fhemsync
npm WARN deprecated request@2.88.2: request has been deprecated, see https://github.com/request/request/issues/3142
/usr/bin/fhemsync -> /usr/lib/node_modules/fhemsync/fhemsync.js
+ fhemsync@2.1.0
updated 1 package in 8.937s
What does that message tell me? Is my npm too old?
As a test, I deleted all 5 OWX devices on the master, including Firmata, and restarted FHEM. But I don't see a GET in any of the devices?!
verbose 4 is enabled.
Regards, Gerd
Hello dominik,
if I install FHEMSync on two machines so that each fetches data from the other, one of the connections (sometimes one, sometimes the other) drops after about 1 minute.
The FHEMSync log then shows:
...
(node:9191) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at nodeRNG (/usr/local/lib/node_modules/fhemsync/node_modules/uuid/lib/rng.js:7:17)
at v4 (/usr/local/lib/node_modules/fhemsync/node_modules/uuid/v4.js:13:52)
at new Multipart (/usr/local/lib/node_modules/fhemsync/node_modules/request/lib/multipart.js:10:19)
at new Request (/usr/local/lib/node_modules/fhemsync/node_modules/request/request.js:124:21)
at request (/usr/local/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/local/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/local/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
...
FHEMSync is the current version, 2.1.0.
Best regards, Jens
@Gerd, please install fhemsync 2.3.0 and the update from the 1st post. Let me know whether it works; unfortunately I don't have a device to test with, and a dummy doesn't give me a get.
@Maik, please test with the new version from the 1st post and fhemsync 2.3.0. It lets you synchronize up to 5 remote FHEM instances.
@Jens, that looks like an infinite loop. How did you set up the installation? The filters should point to different rooms.
Hello dominik,
Quote: @Jens, that looks like an infinite loop. How did you set up the installation? The filters should point to different rooms.
I had set that, just unfortunately wrong on one machine (room:HomeCloud) >:(
Of course it runs now. Sorry.
Best regards, Jens
Thanks, Dominik.
I'll try it before going to bed.
Regards, Gerd
Hello Dominik,
unfortunately, negative :(
I deleted the five devices plus Firmata on the master,
shut down FHEM,
and ran npm install -g fhemsync:
Quote: root@rpi0:~# npm install -g fhemsync
npm WARN deprecated request@2.88.2: request has been deprecated, see https://github.com/request/request/issues/3142
/usr/bin/fhemsync -> /usr/lib/node_modules/fhemsync/fhemsync.js
+ fhemsync@2.3.0
updated 1 package in 7.531s
I started FHEM via bash, but I don't see a GET in the devices.
Do I have to delete the devices after every npm install, or should the GET appear automatically?
Did I shut something down wrongly or install in the wrong order?
You still don't write the version number into the modules ;)
Regards, Gerd
PS: I'm on short-time work this week, so I can also test in between...
OK... I'll have to take a closer look tomorrow.
Deleting the devices is no longer necessary; that was only needed in the first GET test version.
Can you please send me the log with verbose=5 (set on the FHEMSYNC device) via PM? Thanks!
By the way, the WARN during installation is fine. It only means the 'request' module is no longer maintained. I still have to switch to a different module; quite a few projects are in the same situation right now.
The version number can be found in the log; I'll now also add it to the FHEMSYNC device as a VERSION internal. I can't add it to the FHEMSYNC_DEVICEs, because there the version may well come from the remote device.
Hello Dominik,
I enabled verbose 5.
I deleted one device and then triggered a restart.
List of the deleted device:
Internals:
ALARM 0
ASYNC 0
CFGFN
DEF OWTHERM KH_28_FFA45D811604
ERRCOUNT 0
FUUID 5e8af07e-f33f-d1a8-16d8-a48a5771e24db898
INTERVAL 300
NAME KH_28_FFA45D811604
NOTIFYDEV global
NR 9709
NTFY_ORDER 50-KH_28_FFA45D811604
OW_FAMILY 28
OW_ID FFA45D811604
PRESENT 1
REMOTENAME KH_28_FFA45D811604
REMOTETYPE OWTHERM
ROM_ID 28.FFA45D811604.CF
STATE T: 26.88 °C
TYPE FHEMSYNC_DEVICE
owg_temp 26.875
owg_th 75
owg_tl -25
READINGS:
2020-04-06 11:04:00 state T: 26.88 °C
2020-04-06 11:04:00 temperature 26.875
helper:
getlist
setlist interval tempHigh tempLow
json:
Name KH_28_FFA45D811604
PossibleAttrs alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 IODev model:DS1820,DS18B20,DS1822 stateAL stateAH tempOffset tempUnit:Celsius,Fahrenheit,Kelvin tempConv:onkick,onread tempLow tempHigh resolution:9,10,11,12 interval event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading cmdIcon devStateIcon devStateIcon:textField-long devStateStyle icon sortby webCmd webCmdLabel:textField-long widgetOverride userattr
PossibleSets interval tempHigh tempLow
Attributes:
IODev OWio6
alias Kaltwasser
comment D_Kaltwasser
group Heizung
icon sani_water_cold
model DS18B20
room FHEMSync,Keller,OWX
tempHigh 75
tempLow -25
Internals:
ALARM 0
ASYNC 0
CFGFN /opt/fhem/FHEM/Keller_Heizung.cfg
DEF DS18B20 FFA45D811604 300
ERRCOUNT 0
FUUID 5c4ab7dc-f33f-7a7a-27ba-27e17fa0160df822
INTERVAL 300
NAME KH_28_FFA45D811604
NOTIFYDEV global
NR 612
NTFY_ORDER 50-KH_28_FFA45D811604
OW_FAMILY 28
OW_ID FFA45D811604
PRESENT 1
ROM_ID 28.FFA45D811604.CF
STATE T: 26.88 �C
TYPE OWTHERM
owg_temp 26.875
owg_th 75
owg_tl -25
Readings:
state:
Time 2020-04-06 11:02:43
Value T: 26.88 �C
temperature:
Time 2020-04-06 11:02:43
Value 26.875
Attributes:
alias Kaltwasser
comment D_Kaltwasser
group Heizung
icon sani_water_cold
model DS18B20
room FHEMSync
tempHigh 75
tempLow -25
userattr comment icon model tempHigh tempLow
Where should I send the log? It won't go through the forum.
Regards, Gerd
PS: I'm out in the carport now, running cable for a socket ;) I'll check in sporadically, and once I have your email I'll send you the log.
Hello dominik,
after updating to 2.3.0 I see this in the FHEM log:
2020.04.06 12:12:32.207 0: syntax error at ./FHEM/10_FHEMSYNC_DEVICE.pm line 124, near "else if"
Can't use global $@ in "my" at ./FHEM/10_FHEMSYNC_DEVICE.pm line 126, near "($@"
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 129.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 129.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 131.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 135.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 135.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 138.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 139.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 140.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 140.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 142.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 145.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 147.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 153.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 158.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 162.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 163.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 165.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 165.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 169.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 174.
Global symbol "$name" requires explicit package name (did you forget to declare "my $name"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 180.
syntax error at ./FHEM/10_FHEMSYNC_DEVICE.pm line 185, near "}"
./FHEM/10_FHEMSYNC_DEVICE.pm has too many errors.
Best regards, Jens
Thanks for the hint; I'll look into it this evening, along with the get.
Zitat von: Newbie am 06 April 2020, 12:15:36
Hallo dominik,
nach Update auf 2.3.0 hab ich das im FHEM-Log:
2020.04.06 12:12:32.207 0: syntax error at ./FHEM/10_FHEMSYNC_DEVICE.pm line 124, near "else if"
Can't use global $@ in "my" at ./FHEM/10_FHEMSYNC_DEVICE.pm line 126, near "($@"
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 129.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 129.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 131.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 135.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 135.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 138.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 139.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 140.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 140.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 142.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 145.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 147.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 153.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 158.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 162.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 163.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 165.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 165.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 169.
Global symbol "$hash" requires explicit package name (did you forget to declare "my $hash"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 174.
Global symbol "$name" requires explicit package name (did you forget to declare "my $name"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 180.
syntax error at ./FHEM/10_FHEMSYNC_DEVICE.pm line 185, near "}"
./FHEM/10_FHEMSYNC_DEVICE.pm has too many errors.
Regards, Jens
Fixed in the first post.
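For the record, the log above boils down to three plain Perl problems: Perl spells the keyword `elsif`, not `else if`; `$@` is a global and cannot be declared with `my`; and `$hash` was used without a `my` declaration, which `use strict` rejects. An illustrative sub showing the corrected patterns (hypothetical, not the actual module code):

```perl
use strict;
use warnings;

sub FHEMSYNC_DEVICE_Example {
    my ($hash) = @_;              # declare $hash; fixes the "Global symbol" errors
    my $name   = $hash->{NAME};

    my $result = eval { $hash->{helper}{lastValue} // die "no value\n" };
    if ($@) {                     # test the global $@; never "my ($@ ...)"
        return "$name: $@";
    }
    elsif (!defined $result) {    # "elsif", not "else if"
        return "$name: empty result";
    }
    return $result;
}
```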
@Gerd, please via PM, just paste it in with copy & paste.
@dominik
OK. If it all fits in.
Still busy at the moment, though.
Fixed in the first post.
Thanks, it's working again.
@dominik
That's 700 KB. I'm attaching it here as a ZIP.
Once you have it, I'll delete the attachment again.
Attachment deleted
Regards
Gerd
Thanks, it's downloaded and can be deleted.
Found the bug, please test with fhemsync 2.4.0, then we should be one step further again.
Hello Dominik,
the result is unfortunately negative :(
- Stopped FHEM
- Ran the update in bash
- Deleted the device (Kaltwasser) on the master again
But GET does not show up.
Attached is the current ZIP.
Regards, Gerd
Hi,
unfortunately the log still shows version 2.3.0. Please check again whether 2.4.0 was installed:
fhemsync -V
in the Linux console.
Oh... did you update the modules as well?
I'm still watching Kitchen Impossible,
I'll check afterwards.
Regards
No, actually only fhemsync. I hadn't updated the modules.
The update probably didn't go through completely:
sudo npm install -g fhemsync
fhemsync -V
That should then show 2.4.0.
All right, thanks in advance for testing!
Hm, but the first line of the log does say "
[MAIN ] Starting FHEMSync version 2.4.0..."
The query confirms that too...
Quote: root@rpi0:~# fhemsync -V
2.4.0
The file from your ZIP should be called
fhemsync-2020-04-06_2_4_0.log.
Quote from: Maista on 06 April 2020, 22:19:13
Hm, but the first line of the log does say "[MAIN ] Starting FHEMSync version 2.4.0..."
The query confirms that too...
The file from your ZIP should be called fhemsync-2020-04-06_2_4_0.log.
Cross-checked: the ZIP I posted matches the log with 2.4.0.
You're right, the text editor on my Chromebook was still showing me the old file. Sorry!
I'll keep looking...
So, I've now found a device for testing :)
Update in the first post and fhemsync 2.5.0 - it works now!
If a GET/SET returns text, it is now shown in the reading "FHEMSYNC-lastGetResponse" or "FHEMSYNC-lastSetResponse".
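In FHEM, writing such a response into a reading is typically a one-liner with the standard readings API. A sketch under that assumption (the helper name is hypothetical, and the code relies on fhem.pl being loaded):

```perl
# Hypothetical sketch: store a get/set response text in a reading.
sub FHEMSYNC_storeResponse {
    my ($hash, $cmdType, $response) = @_;   # $cmdType is "Get" or "Set"
    return unless defined $response && $response ne "";
    # last argument 1 = generate an event for this reading update
    readingsSingleUpdate($hash, "FHEMSYNC-last${cmdType}Response", $response, 1);
}
```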
Morning Dominik,
Great, the menu now has a get ;D and the response appears in the master reading
FHEMSYNC-lastGetResponse (visible after a refresh).
Oops, now I've changed an ATTRIBUTE on the master.
Quote: 2020-04-07 09:29:47 Global global ATTR FIRMATA verbose 0
But it doesn't get transferred.
Is that still coming, or is that too much of a good thing ;=)?
Thanks for now.
Regards, Gerd
Great :)
The attribute on the master even gets deleted after a while, since it doesn't exist on the slave.
I assumed the attribute configuration is always done on the slave. I can gladly change that so that attributes are transferred from the master to the slave. I'll include it in the next update.
Quote from: dominik on 07 April 2020, 09:38:04
Great :)
The attribute on the master even gets deleted after a while, since it doesn't exist on the slave.
I assumed the attribute configuration is always done on the slave. I can gladly change that so that attributes are transferred from the master to the slave. I'll include it in the next update.
Ah... secret functions 8)
I already thought get wasn't working, until I realized you had written about a reading.
If sync, then everything ;) or at least whatever makes sense.
There was data in the FHEMSync log even though verbose was 0. Is that intended?
Quote from: Maista on 07 April 2020, 09:49:10
Ah... secret functions 8)
I already thought get wasn't working, until I realized you had written about a reading.
If sync, then everything ;) or at least whatever makes sense.
There was data in the FHEMSync log even though verbose was 0. Is that intended?
Attribute sync is now implemented.
Module: v0.9.6
fhemsync: v2.6.0
The only catch: when attributes are deleted on the slave, it takes 5 minutes until that is synced; I haven't found another way to handle that yet.
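The 5-minute delay suggests the deletion case is handled by a periodic full comparison rather than by an event, since deleting an attribute remotely does not produce an event the module could react to. A sketch of such a poll using FHEM's `InternalTimer`, assuming that mechanism (function name and the reconciliation step are hypothetical, not the actual module code):

```perl
# Hypothetical sketch: periodically re-compare all attributes, because a
# deleted attribute produces no event that could trigger an immediate sync.
sub FHEMSYNC_attrPoll {
    my ($hash) = @_;
    # avoid duplicate timers for this device
    RemoveInternalTimer($hash, "FHEMSYNC_attrPoll");

    # ... fetch the remote attribute list and reconcile it with the local
    #     copy here; anything missing remotely gets deleted locally ...

    # run again in 5 minutes (300 seconds)
    InternalTimer(gettimeofday() + 300, "FHEMSYNC_attrPoll", $hash);
}
```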
Hello Dominik,
Great, changing it on the master changes the attributes on the slave. Tried it on three devices.
However, I now have the problem that my Firmata device initializes its ports every 5 minutes?!
And via Telegram I get the message that FHEM was reloaded (notify on "Initialized")!
That wasn't the case before the version with attribute sync.
The event monitor on the slave looks like this:
2020-04-07 22:52:59 OWMULTI NAFT.002 relHumidity: 38.65
2020-04-07 22:52:59 OWMULTI NAFT.002 vsense: 0.187
2020-04-07 22:52:59 OWMULTI NAFT.002 vsense.t: 0
2020-04-07 22:52:59 OWMULTI NAFT.002 time: 0
2020-04-07 22:52:59 OWMULTI NAFT.002 VDD: 5.00
2020-04-07 22:52:59 OWMULTI NAFT.002 temperature: 20.7
2020-04-07 22:52:59 OWMULTI NAFT.002 Feuchtigkeit: 38.65 percent|% (T: 20.7 °C vs: 0.19 V vs.t: 0.00 Vs)
2020-04-07 22:52:59 Global global ATTR KH_28_FF715C811603 alias Zirkulation
2020-04-07 22:52:59 Global global ATTR KH_28_FF715C811603 group Heizung
2020-04-07 22:52:59 Global global ATTR KH_28_FF715C811603 comment E_Zirkulation
2020-04-07 22:52:59 Global global ATTR KH_28_FF715C811603 model DS18B20
2020-04-07 22:52:59 Global global ATTR KH_28_FF715C811603 icon sani_pump
2020.04.07 22:52:59 5 : FIRMATA FRM:>f0730106f7
2020.04.07 22:52:59 5 : SW: f0730106f7
2020.04.07 22:52:59 5 : FIRMATA FRM:>f0732c06287e4763155045014f06003000405325197f01f7
2020.04.07 22:52:59 5 : SW: f0732c06287e4763155045014f06003000405325197f01f7
2020.04.07 22:52:59 5 : FIRMATA FRM:<f073430606007c7f7f1ff7
2020-04-07 22:52:59 Global global ATTR KH_28_FF715C811603 tempLow -25
2020-04-07 22:52:59 Global global ATTR KH_28_FF8E8C811603 comment A_Vorlauf
2020-04-07 22:52:59 Global global ATTR KH_28_FF8E8C811603 group Heizung
2020.04.07 22:52:59 5 : FIRMATA FRM:>f0730106f7
2020.04.07 22:52:59 5 : SW: f0730106f7
2020.04.07 22:52:59 5 : FIRMATA FRM:>f0732c06287e3b64185045010707003800405325197f01f7
2020.04.07 22:52:59 5 : SW: f0732c06287e3b64185045010707003800405325197f01f7
2020.04.07 22:52:59 5 : FIRMATA FRM:<f073430607007c7f7f1ff7
2020-04-07 22:52:59 Global global ATTR KH_28_FF8E8C811603 tempLow -25
2020-04-07 22:52:59 Global global ATTR KH_28_FF8E8C811603 icon sani_heating
2020-04-07 22:52:59 Global global ATTR KH_28_FF8E8C811603 model DS18B20
2020.04.07 22:52:59 5 : FIRMATA FRM:>f0730106f7
2020.04.07 22:52:59 5 : SW: f0730106f7
2020.04.07 22:52:59 5 : FIRMATA FRM:>f0732c06287e3b64185045010707004000405325197f01f7
2020.04.07 22:52:59 5 : SW: f0732c06287e3b64185045010707004000405325197f01f7
2020.04.07 22:52:59 5 : FIRMATA FRM:<f073430608007c7f7f1ff7
2020-04-07 22:52:59 Global global ATTR KH_28_FF8E8C811603 tempHigh 75
2020-04-07 22:52:59 Global global ATTR KH_28_FF8E8C811603 alias Vorlauf
2020.04.07 22:52:59 5 : FIRMATA FRM:>f0730105f7
2020.04.07 22:52:59 5 : SW: f0730105f7
2020.04.07 22:52:59 5 : FIRMATA FRM:>f0732c05284466260840453b6b06004800405325367f01f7
2020.04.07 22:52:59 5 : SW: f0732c05284466260840453b6b06004800405325367f01f7
2020.04.07 22:52:59 5 : FIRMATA FRM:<f073430509007c7f7f1ff7
2020-04-07 22:52:59 Global global ATTR HI_28_A2D984001677 tempLow -54
2020-04-07 22:52:59 Global global ATTR HI_28_A2D984001677 interval 300
2020-04-07 22:52:59 Global global ATTR HI_28_A2D984001677 icon weather_light_meter
2020-04-07 22:52:59 Global global ATTR HI_28_A2D984001677 model DS18B20
2020-04-07 22:52:59 Global global ATTR HI_28_A2D984001677 resolution 12
2020-04-07 22:52:59 Global global ATTR HI_28_A2D984001677 group Klima
2020-04-07 22:53:00 Global global ATTR HI_28_A2D984001677 stateFormat Sonne: Helligkeit
2020-04-07 22:53:00 Global global ATTR HI_28_A2D984001677 alias Büro
2020.04.07 22:53:00 5 : FIRMATA FRM:>f0730105f7
2020.04.07 22:53:00 5 : SW: f0730105f7
2020.04.07 22:53:00 5 : FIRMATA FRM:>f0732c05284466260840453b6b06005000405325367f01f7
2020.04.07 22:53:00 5 : SW: f0732c05284466260840453b6b06005000405325367f01f7
2020.04.07 22:53:00 5 : FIRMATA FRM:<f07343050a007c7f7f1ff7
2020-04-07 22:53:00 Global global ATTR HI_28_A2D984001677 tempHigh 75
2020-04-07 22:53:00 Global global ATTR FA_26_A2D984000007 VUnit percent|%
2020-04-07 22:53:00 Global global ATTR FA_26_A2D984000007 alias Aussen-Norden
2020-04-07 22:53:00 Global global ATTR FA_26_A2D984000007 VFunction (161.29 * V / VDD - 25.8065)/(1.0546 - 0.00216 * T)
2020-04-07 22:53:00 Global global ATTR FA_26_A2D984000007 model DS2438
2020-04-07 22:53:00 Global global ATTR FA_26_A2D984000007 icon temperature_humidity
2020-04-07 22:53:00 Global global ATTR FA_26_A2D984000007 group Klima
2020-04-07 22:53:00 Global global ATTR FA_26_A2D984000007 VName relHumidity|Feuchtigkeit
2020-04-07 22:53:00 Global global ATTR KH_28_FF5A50811605 comment B_Rücklauf
2020-04-07 22:53:00 Global global ATTR KH_28_FF5A50811605 group Heizung
2020.04.07 22:53:00 5 : FIRMATA FRM:>f0730106f7
2020.04.07 22:53:00 5 : SW: f0730106f7
2020.04.07 22:53:00 5 : FIRMATA FRM:>f0732c06287e6b02155045020806004800405325197f01f7
2020.04.07 22:53:00 5 : SW: f0732c06287e6b02155045020806004800405325197f01f7
2020.04.07 22:53:00 5 : FIRMATA FRM:<f073430609007c7f7f1ff7
2020-04-07 22:53:00 Global global ATTR KH_28_FF5A50811605 tempLow -25
2020-04-07 22:53:00 Global global ATTR TA_28_FF313C4E0400 group Klima
2020-04-07 22:53:00 Global global ATTR KH_28_FF5A50811605 icon sani_heating
2020-04-07 22:53:00 Global global ATTR FI_26_A3D984001605 VUnit percent|%
2020-04-07 22:53:00 Global global ATTR FI_26_A3D984001605 VFunction (161.29 * V / 5 - 25.8065)/(1.0546 - 0.00216 * T)
2020-04-07 22:53:08 Global global ATTR KH_28_FF5A50811605 model DS18B20
2020-04-07 22:53:10 Global global ATTR TA_28_FF313C4E0400 stateFormat Aussen: Temperatur_r °C
2020-04-07 22:53:10 Global global ATTR TA_28_FF313C4E0400 icon temp_temperature
2020-04-07 22:53:10 Global global ATTR TA_28_FF313C4E0400 interval 300
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0730103f7
2020.04.07 22:53:10 5 : SW: f0730103f7
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0732c03287e476163090100250700400040532a007e01f7
2020.04.07 22:53:10 5 : SW: f0732c03287e476163090100250700400040532a007e01f7
2020.04.07 22:53:10 5 : FIRMATA FRM:<f073430308007c7f7f1ff7
2020-04-07 22:53:10 Global global ATTR TA_28_FF313C4E0400 tempLow 0
2020-04-07 22:53:10 Global global ATTR TA_28_FF313C4E0400 model DS18B20
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0730103f7
2020.04.07 22:53:10 5 : SW: f0730103f7
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0732c03287e476163090100250700480040532a007e01f7
2020.04.07 22:53:10 5 : SW: f0732c03287e476163090100250700480040532a007e01f7
2020.04.07 22:53:10 5 : FIRMATA FRM:<f073430309007c7f7f1ff7
2020-04-07 22:53:10 Global global ATTR TA_28_FF313C4E0400 tempHigh 85
2020-04-07 22:53:10 Global global ATTR TA_28_FF313C4E0400 alias Aussen-Norden
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0730106f7
2020.04.07 22:53:10 5 : SW: f0730106f7
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0732c06287e5f64165045021c07005000405325197f01f7
2020.04.07 22:53:10 5 : SW: f0732c06287e5f64165045021c07005000405325197f01f7
2020.04.07 22:53:10 5 : FIRMATA FRM:<f07343060a007c7f7f1ff7
2020-04-07 22:53:10 Global global ATTR KH_28_FF976C811605 tempLow -25
2020-04-07 22:53:10 Global global ATTR KH_28_FF976C811605 icon sani_water_hot
2020-04-07 22:53:10 Global global ATTR KH_28_FF976C811605 model DS18B20
2020-04-07 22:53:10 Global global ATTR KH_28_FF976C811605 comment C_Warmwasser
2020-04-07 22:53:10 Global global ATTR KH_28_FF976C811605 group Heizung
2020-04-07 22:53:10 Global global ATTR KH_28_FF976C811605 alias Warmwasser
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0730106f7
2020.04.07 22:53:10 5 : SW: f0730106f7
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0732c06287e5f64165045021c07005800405325197f01f7
2020.04.07 22:53:10 5 : SW: f0732c06287e5f64165045021c07005800405325197f01f7
2020.04.07 22:53:10 5 : FIRMATA FRM:<f07343060b007c7f
2020.04.07 22:53:10 5 : FIRMATA FRM:<7f1ff7
2020-04-07 22:53:10 Global global ATTR KH_28_FF976C811605 tempHigh 75
2020-04-07 22:53:10 Global global ATTR KH_28_FFA45D811604 alias Kaltwasser
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0730106f7
2020.04.07 22:53:10 5 : SW: f0730106f7
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0732c06287e136d155005024f07006000405325197f01f7
2020.04.07 22:53:10 5 : SW: f0732c06287e136d155005024f07006000405325197f01f7
2020.04.07 22:53:10 5 : FIRMATA FRM:<f0734306
2020.04.07 22:53:10 5 : FIRMATA FRM:<0c007c7f7f1ff7
2020-04-07 22:53:10 Global global ATTR KH_28_FFA45D811604 tempHigh 75
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0730106f7
2020.04.07 22:53:10 5 : SW: f0730106f7
2020.04.07 22:53:10 5 : FIRMATA FRM:>f0732c06287e6b02155045020806006800405325197f01f7
2020.04.07 22:53:10 5 : SW: f0732c06287e6b02155045020806006800405325197f01f7
2020.04.07 22:53:10 5 : FIRMATA FRM:<f07343
2020.04.07 22:53:10 5 : FIRMATA FRM:<060d007c7f7f1ff7
2020-04-07 22:53:10 Global global ATTR KH_28_FF5A50811605 tempHigh 75
2020-04-07 22:53:10 Global global ATTR KH_28_FF5A50811605 alias Rücklauf
2020-04-07 22:53:11 Global global ATTR KH_28_FFA45D811604 icon sani_water_cold
2020.04.07 22:53:11 5 : FIRMATA FRM:>f0730106f7
2020.04.07 22:53:11 5 : SW: f0730106f7
2020.04.07 22:53:11 5 : FIRMATA FRM:>f0732c06287e136d155005024f07007000405325197f01f7
2020.04.07 22:53:11 5 : SW: f0732c06287e136d155005024f07007000405325197f01f7
2020.04.07 22:53:11 5 : FIRMATA FRM:<f07343060e007c7f7f1ff7
2020-04-07 22:53:11 Global global ATTR KH_28_FFA45D811604 tempLow -25
2020-04-07 22:53:11 Global global ATTR KH_28_FFA45D811604 model DS18B20
2020-04-07 22:53:11 Global global ATTR KH_28_FFA45D811604 group Heizung
2020-04-07 22:53:14 Global global ATTR LI_28_A3D984001605 stateFormat Luftdruck: Luftdruck_tm3 hPa
2020-04-07 22:53:14 Global global ATTR LI_28_A3D984001605 group Klima
2020-04-07 22:53:14 Global global ATTR LI_28_A3D984001605 comment DS18B20 > BMP280 mit ATTINY84 Average soll Tendenz berechnen
2020-04-07 22:53:14 Global global ATTR LI_28_A3D984001605 model DS18B20
2020-04-07 22:53:14 Global global ATTR LI_28_A3D984001605 icon weather_barometric_pressure
2020.04.07 22:53:14 5 : FIRMATA FRM:>f0730105f7
2020.04.07 22:53:14 5 : SW: f0730105f7
2020.04.07 22:53:14 5 : FIRMATA FRM:>f0732c0528466626084045021806005800405325467e01f7
2020.04.07 22:53:14 5 : SW: f0732c0528466626084045021806005800405325467e01f7
2020.04.07 22:53:14 5 : FIRMATA FRM:<f07343050b007c7f
2020.04.07 22:53:14 5 : FIRMATA FRM:<7f1ff7
2020-04-07 22:53:14 Global global ATTR LI_28_A3D984001605 tempLow 70
2020.04.07 22:53:14 5 : FIRMATA FRM:>f0730105f7
2020.04.07 22:53:14 5 : SW: f0730105f7
2020.04.07 22:53:14 5 : FIRMATA FRM:>f0732c0528466626084045021806006000405325467e01f7
2020.04.07 22:53:14 5 : SW: f0732c0528466626084045021806006000405325467e01f7
2020.04.07 22:53:14 5 : FIRMATA FRM:<f0734305
2020.04.07 22:53:14 5 : FIRMATA FRM:<0c007c7f7f1ff7
2020-04-07 22:53:14 Global global ATTR LI_28_A3D984001605 tempHigh 75
2020-04-07 22:53:14 Global global ATTR LI_28_A3D984001605 alias Büro
2020-04-07 22:53:14 Global global ATTR NAFT.002 alias Schlafzimmer
2020-04-07 22:53:14 Global global ATTR NAFT.002 VUnit percent|%
2020-04-07 22:53:14 Global global ATTR NAFT.002 VFunction (161.29 * V / 5 - 25.8065)/(1.0546 - 0.00216 * T)
2020-04-07 22:53:14 Global global ATTR NAFT.002 icon temperature_humidity
2020-04-07 22:53:14 Global global ATTR NAFT.002 model DS2438
2020-04-07 22:53:14 Global global ATTR NAFT.002 comment Norden AussenFeuchte SHT23 & Temperatur via RPi1
2020-04-07 22:53:14 Global global ATTR NAFT.002 VName relHumidity|Feuchtigkeit
2020-04-07 22:53:14 Global global ATTR NAFT.002 group Klima
2020-04-07 22:53:14 Global global ATTR NAVOC.002 model DS18B20
2020-04-07 22:53:14 Global global ATTR NAVOC.002 icon scene_toilet_alternat
2020-04-07 22:53:14 Global global ATTR NAVOC.002 interval 300
2020-04-07 22:53:14 Global global ATTR NAVOC.002 stateFormat Luftguete: VOC
2020-04-07 22:53:14 Global global ATTR NAVOC.002 group Klima
2020-04-07 22:53:14 Global global ATTR NAVOC.002 comment Schlafzimmer Luftgüte via RPi1
2020-04-07 22:53:14 Global global ATTR NAVOC.002 alias Schlafzimmer Luftgüte
2020-04-07 22:53:14 Global global ATTR TA_28_736020050000 alias Arduino
2020.04.07 22:53:14 5 : FIRMATA FRM:>f0730102f7
2020.04.07 22:53:14 5 : SW: f0730102f7
2020.04.07 22:53:14 5 : FIRMATA FRM:>f0732c0228660103520000000b07000000405325463e00f7
2020.04.07 22:53:14 5 : SW: f0732c0228660103520000000b07000000405325463e00f7
2020.04.07 22:53:14 5 : FIRMATA FRM:<f073430200007c7f
2020.04.07 22:53:14 5 : FIRMATA FRM:<7f1ff7
2020-04-07 22:53:14 Global global ATTR TA_28_736020050000 tempHigh 75
2020-04-07 22:53:14 Global global ATTR TA_28_736020050000 icon temp_temperature
2020-04-07 22:53:14 Global global ATTR TA_28_736020050000 interval 300
2020.04.07 22:53:14 5 : FIRMATA FRM:>f0730102f7
2020.04.07 22:53:14 5 : SW: f0730102f7
2020.04.07 22:53:14 5 : FIRMATA FRM:>f0732c0228660103520000000b07000800405325463e00f7
2020.04.07 22:53:14 5 : SW: f0732c0228660103520000000b07000800405325463e00f7
2020.04.07 22:53:14 5 : FIRMATA FRM:<f073430201007c7f7f1ff7
2020-04-07 22:53:14 Global global ATTR TA_28_736020050000 tempLow 70
2020-04-07 22:53:14 Global global ATTR TA_28_736020050000 model DS18B20
2020-04-07 22:53:14 Global global ATTR TA_28_736020050000 group Klima
2020-04-07 22:53:14 Global global ATTR TA_28_736020050000 resolution 9
2020-04-07 22:53:14 Global global ATTR FI_26_A3D984001605 alias Büro
2020-04-07 22:53:14 Global global ATTR FI_26_A3D984001605 VName relHumidity|Feuchtigkeit
2020-04-07 22:53:14 Global global ATTR FI_26_A3D984001605 comment DS2438-Emulation mit ATTINY84 via RPi1
2020-04-07 22:53:14 Global global ATTR FI_26_A3D984001605 group Klima
2020-04-07 22:53:14 Global global ATTR FI_26_A3D984001605 model DS2438
2020-04-07 22:53:14 Global global ATTR FI_26_A3D984001605 icon temperature_humidity
2020-04-07 22:53:14 FRM FIRMATA defined
2020.04.07 22:53:14 3 : Opening FIRMATA device /dev/ttyACM0
2020.04.07 22:53:14 3 : Setting FIRMATA serial parameters to 57600,8,N,1
2020.04.07 22:53:14 5 : FIRMATA FRM_DoInit
2020-04-07 22:53:14 FRM FIRMATA connected
2020.04.07 22:53:15 5 : FIRMATA FRM:>ff
2020.04.07 22:53:15 5 : SW: ff
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:15 3 : FIRMATA device opened
2020-04-07 22:53:15 FRM FIRMATA CONNECTED
2020-04-07 22:53:15 Global global ATTR FIRMATA disable 0
2020-04-07 22:53:15 Global global ATTR FIRMATA verbose 5
2020-04-07 22:53:15 Global global ATTR TA_28_736020050000 stateFormat Arduino: Temperatur_r °C
2020-04-07 22:53:15 Global global ATTR FIRMATA comment Arduino-Mega via RPi1
2020-04-07 22:53:15 Global global ATTR FIRMATA icon DIN_rail_firmata
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:15 5 : FIRMATA setup stage 1
2020.04.07 22:53:16 5 : FIRMATA setup stage 1
2020.04.07 22:53:17 5 : FIRMATA setup stage 1
2020.04.07 22:53:18 5 : FIRMATA setup stage 1
2020.04.07 22:53:18 3 : FIRMATA querying Firmata versions
2020.04.07 22:53:18 5 : FIRMATA FRM:>f90000
2020.04.07 22:53:18 5 : SW: f90000
2020.04.07 22:53:18 5 : FIRMATA FRM:>f079f7
2020.04.07 22:53:18 5 : SW: f079f7
2020.04.07 22:53:18 5 : FIRMATA FRM:<f90206f079
2020.04.07 22:53:18 5 : FIRMATA setup stage 1
2020.04.07 22:53:18 5 : FIRMATA FRM:<020643006f006e0066006900670075007200610062006c00
2020.04.07 22:53:18 5 : FIRMATA setup stage 1
2020.04.07 22:53:18 5 : FIRMATA FRM:<65004600690072006d006100740061005f00440053003200
2020.04.07 22:53:18 5 : FIRMATA setup stage 1
2020.04.07 22:53:18 5 : FIRMATA FRM:<3400380032005f004d004500470041005f005500530042002e
2020.04.07 22:53:18 5 : FIRMATA setup stage 1
2020.04.07 22:53:18 5 : FIRMATA FRM:<0069006e006f00f7e05503e17f07e27b04e37303e47f07e5
2020.04.07 22:53:18 5 : FIRMATA setup stage 1
2020.04.07 22:53:18 3 : FIRMATA Firmata Firmware Version: ConfigurableFirmata_DS2482_MEGA_USB.ino V_2_06 (using Protocol Version: V_2_06)
2020.04.07 22:53:18 5 : FIRMATA FRM:>f069f7
2020.04.07 22:53:18 5 : SW: f069f7
2020.04.07 22:53:18 5 : FIRMATA FRM:>f06bf7
2020.04.07 22:53:18 5 : SW: f06bf7
2020.04.07 22:53:18 5 : FIRMATA FRM:<7f07e67f05e76804e86003e92403ea0c03eb7902ec2303ed0903ee0b03ef3903f90206f079020643006f006e00660069
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<00670075007200610062006c0065004600690072006d006100740061005f004400530032003400380032005f004d0045
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<00470041005f005500530042002e0069006e006f00f7f06a
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f7f
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<7f7f7f7f7f7f000102030405060708090a0b0c0d0e0ff7f0
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<6c7f7f0001010103080701091c7f0001010103080701091c7f00010101030807017f00010101030807017f000101010308
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<07017f00010101030807017f00010101030807017f000101
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<01030807017f00010101030807017f00010101030807017f
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<00010101030807017f00010101030807017f000101010701
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<7f0001010107017f0001010107017f0001010107017f0001
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<01010701091c7f000101010701091c7f0001010106010701
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<0702091c7f00010101060107010702091c7f000101010701
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<7f0001010107017f0001010107017f0001010107017f0001
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<010107017f0001010107017f0001010107017f0001010107
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<017f0001010107017f0001010107017f0001010107017f00
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<01010107017f0001010107017f0001010107017f0001010107
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<017f0001010107017f0001010107017f0001010107017f00
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<01010107017f0001010107017f0001010107017f00010101
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<07017f00010101030807017f00010101030807017f000101
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<01030807017f0001010107017f0001010107017f00010101
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<07017f0001010107017f0001010107017f0001010107017f
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<0001010107017f00010101020a07017f00010101020a0701
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<7f00010101020a07017f00010101020a07017f0001010102
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<0a07017f00010101020a07017f00010101020a07017f0001
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<0101020a07017f00010101020a07017f00010101020a0701
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<7f00010101020a07017f00010101020a07017f0001010102
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<0a07017f00010101020a07017f00010101020a07017f000101
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:<01020a07017ff7
2020.04.07 22:53:18 5 : FIRMATA setup stage 2
2020.04.07 22:53:18 5 : FIRMATA FRM:>f07a6807f7
2020.04.07 22:53:18 5 : SW: f07a6807f7
2020.04.07 22:53:18 5 : FIRMATA setup stage 3
2020.04.07 22:53:18 5 : FIRMATA initializing 'LED13'
2020.04.07 22:53:18 5 : FIRMATA FRM:>f40d01
2020.04.07 22:53:18 5 : SW: f40d01
2020.04.07 22:53:18 5 : FIRMATA FRM:>d101
2020.04.07 22:53:18 5 : SW: d101
2020.04.07 22:53:18 5 : FIRMATA FRM:>910000
2020.04.07 22:53:18 5 : SW: 910000
2020-04-07 22:53:18 TelegramBot myfhemBot message @Maista FHEM - ⚠CFG neu geladen⚠ value: off
2020-04-07 22:53:18 FRM_OUT LED13 value: off
2020-04-07 22:53:18 FRM_OUT LED13 Initialized
2020.04.07 22:53:18 5 : FIRMATA initializing 'Luefter'
2020.04.07 22:53:18 5 : FIRMATA FRM:>f40c01
2020.04.07 22:53:18 5 : SW: f40c01
2020.04.07 22:53:18 5 : FIRMATA FRM:>d101
2020.04.07 22:53:18 5 : SW: d101
2020.04.07 22:53:18 5 : FIRMATA FRM:>910000
2020.04.07 22:53:18 5 : SW: 910000
2020-04-07 22:53:18 FRM_OUT Luefter value: off
2020-04-07 22:53:18 FRM_OUT Luefter Initialized
2020.04.07 22:53:18 5 : FIRMATA initializing 'OWio2'
2020.04.07 22:53:18 5 : FIRMATA FRM:>f40207
2020.04.07 22:53:18 5 : SW: f40207
2020.04.07 22:53:18 5 : FIRMATA initializing 'OWio3'
2020.04.07 22:53:18 5 : FIRMATA FRM:>f40307
2020.04.07 22:53:18 5 : SW: f40307
2020.04.07 22:53:18 5 : FIRMATA initializing 'OWio5'
2020.04.07 22:53:18 5 : FIRMATA FRM:>f40507
2020.04.07 22:53:18 5 : SW: f40507
2020.04.07 22:53:18 5 : FIRMATA initializing 'OWio6'
2020.04.07 22:53:18 5 : FIRMATA FRM:>f40607
2020.04.07 22:53:18 5 : SW: f40607
2020.04.07 22:53:18 5 : FIRMATA initializing 'OWio7'
2020.04.07 22:53:18 5 : FIRMATA FRM:>f40707
2020.04.07 22:53:18 5 : SW: f40707
2020.04.07 22:53:18 5 : FIRMATA initializing 'UKW_12V'
2020.04.07 22:53:18 5 : FIRMATA FRM:>f40801
2020.04.07 22:53:18 5 : SW: f40801
2020.04.07 22:53:18 5 : FIRMATA FRM:>d101
2020.04.07 22:53:18 5 : SW: d101
2020.04.07 22:53:18 5 : FIRMATA FRM:>910000
2020.04.07 22:53:18 5 : SW: 910000
2020-04-07 22:53:18 FRM_OUT UKW_12V value: off
2020-04-07 22:53:18 FRM_OUT UKW_12V Initialized
2020.04.07 22:53:18 5 : FIRMATA initializing 'UKW_ON'
2020.04.07 22:53:18 5 : FIRMATA FRM:>f40901
2020.04.07 22:53:18 5 : SW: f40901
2020.04.07 22:53:18 5 : FIRMATA FRM:>d101
2020.04.07 22:53:18 5 : SW: d101
2020.04.07 22:53:18 5 : FIRMATA FRM:>910000
2020.04.07 22:53:18 5 : SW: 910000
2020-04-07 22:53:18 FRM_OUT UKW_ON value: off
2020-04-07 22:53:18 FRM_OUT UKW_ON Initialized
2020-04-07 22:53:18 FRM FIRMATA Initialized
2020.04.07 22:53:18 5 : FIRMATA setup stage 5
2020.04.07 22:53:18 5 : FIRMATA FRM:<910000910000910000910000
2020-04-07 22:53:18 TelegramBot myfhemBot sentMsgResult: SUCCESS
2020-04-07 22:53:18 TelegramBot myfhemBot sentMsgId: 59509
2020-04-07 22:53:18 TelegramBot myfhemBot sentMsgPeerId: 13363079
2020-04-07 22:53:25 Global global ATTR FIRMATA verbose 0
2020.04.07 22:53:28 1 : OWX_Discover: 1-Wire devices found on bus OWio2 (TA_28_736020050000)
2020.04.07 22:53:28 1 : OWX_Discover: 1-Wire devices found on bus OWio3 (TA_28_FF313C4E0400,FA_26_A2D984000007)
2020.04.07 22:53:28 1 : OWX_Discover: Warning, FI_26_A3D984001605 on bus OWio5 is defined with duplicate ROM ID
2020.04.07 22:53:28 1 : OWX_Discover: 1-Wire devices found on bus OWio5 (HI_28_A2D984001677,LI_28_A3D984001605,FI_26_A3D984001605)
2020.04.07 22:53:28 1 : OWX_Discover: 1-Wire devices found on bus OWio6 (KH_28_FFA45D811604,KH_28_FF5A50811605,KH_28_FF8E8C811603,KH_28_FF715C811603,KH_28_FF976C811605)
2020.04.07 22:53:28 1 : OWX_Discover: 1-Wire devices found on bus OWio7 (NAVOC.002,NAFT.002)
Is it correct that "Global global" shows up here? All attributes are being set now, which presumably makes Firmata restart.
All attributes are re-set every 5 minutes even when there was no change, can that be?
I see the same with the 1-Wire sensors. ALL attributes are changed there every 5 minutes.
Probably not what the inventor intended ;)
Can I change the attributes, switch that off somehow?
For now I've taken Firmata out of the FHEMSync room.
Maybe one solution would be a button on the FHEMSYNC device as well, which pushes the attribute changes once on demand?
Good night for now.
Regards, Gerd
Please test the attached version tomorrow. With it, only changed attributes should be transferred.
gn8
Now nothing happens at all?!
2020.04.07 23:22:40 1 : reload: Error:Modul 10_FHEMSYNC_DEVICE deactivated: Global symbol "$NAME" requires explicit package name (did you forget to declare "my $NAME"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 171.
2020.04.07 23:22:40 0 : Global symbol "$NAME" requires explicit package name (did you forget to declare "my $NAME"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 171.
2020.04.07 23:22:40 1 : reload: Error:Modul 10_FHEMSYNC_DEVICE deactivated: Global symbol "$NAME" requires explicit package name (did you forget to declare "my $NAME"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 171.
2020.04.07 23:22:40 0 : Global symbol "$NAME" requires explicit package name (did you forget to declare "my $NAME"?) at ./FHEM/10_FHEMSYNC_DEVICE.pm line 171.
All devices were flagged on FHEM restart and deleted / no longer created.
Dominik,
I've deactivated it. My own attempt at placing the "my" correctly didn't help either.
The module sets "my $name" in several places. It seems to be misplaced somewhere ;)
See you tomorrow.
There was one $ too many :) Attached is the fix; with it the attribute should no longer be updated every 5 minutes. Let me know tomorrow whether it works, then I'll update the version in the first post. Thank you!
I've got one more:
Quote: 2020.04.08 09:06:40 1: PERL WARNING: Use of uninitialized value in string ne at ./FHEM/10_FHEMSYNC_DEVICE.pm line 171.
>After another shutdown restart with stacktrace 1, no error message is visible anymore?!
Works without problems so far.
Regards, Gerd
Hello @Dominik,
I'm currently looking at the slave's graph via VPN.
This morning there was no sync between master and slave for a few hours (between 3 and 9 o'clock).
The data continued to be logged on the slave, though.
Here is the master's log:
[MAIN ] Monitoring remote device: NAVOC.002
[MAIN ] Monitoring remote device: TA_28_736020050000
[MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[SLAVE1 ] longpoll end: retry in: 60000msec
[SLAVE1 ] longpoll end: retry in: 60000msec
[SLAVE1 ] longpoll end: retry in: 60000msec
[SLAVE1 ] longpoll end: retry in: 60000msec
[SLAVE1 ] longpoll end: retry in: 60000msec
(node:6572) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
[the frame above repeated ~20 more times until the stack limit was hit]
(node:6572) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 2)
(node:6572) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
[SLAVE1 ] longpoll end: retry in: 60000msec
[SLAVE1 ] longpoll end: retry in: 60000msec
[SLAVE1 ] longpoll end: retry in: 60000msec
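The UnhandledPromiseRejectionWarning and the ever-deeper stack of identical request/index.js frames above suggest the retry was chained recursively on the same call stack. A hypothetical sketch (an assumption about the failure mode, not the actual fhemsync or request code) of an iterative retry that keeps the stack flat and handles the rejection:

```javascript
// Hypothetical sketch, not the actual fhemsync/request code: retry a failing
// request iteratively instead of recursively, so no stack frames pile up,
// and catch the rejection so Node never reports it as unhandled.
async function retryRequest(fetchOnce, maxAttempts, retryMs) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fetchOnce(); // success: return the response
    } catch (err) {
      if (attempt === maxAttempts) throw err; // give up after the last try
      console.log(`[SLAVE1 ] longpoll end: retry in: ${retryMs}msec`);
      // await a timer instead of calling ourselves again directly,
      // so the call stack unwinds between attempts
      await new Promise((resolve) => setTimeout(resolve, retryMs));
    }
  }
}
```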
It works again without any intervention on my part.
Regards, Gerd
Hi,
the "uninitialized" error message should be fixed with version 0.9.7 (first post).
The 2nd error (connection problem) should also be fixed with the new fhemsync version 2.7.0. Although it looks as if your RPi was no longer reachable, given the "longpoll end" messages.
Morning Dominik,
I won't get home until Monday or Tuesday.
The slave has kept collecting data the whole time, though.
It also sends me a status message via Telegram every hour.
The only thing is that it stalls every 5 minutes when the 1-Wire bus is queried.
Well, let's see ;D
Thanks for the update.
At 8 pm today there was another hang, with a different message.
But nothing more since then.
Regards, Gerd
Quote from: dominik on 11 April 2020, 22:08:29
Hi,
the "uninitialized" error message should be fixed with version 0.9.7 (first post).
The 2nd error (connection problem) should also be fixed with the new fhemsync version 2.7.0. Although it looks as if your RPi was no longer reachable, given the "longpoll end" messages.
Hello Dominik,
I got back home on Sunday evening. Today I found the time to update both.
Let's see what happens...
Regards, Gerd
@dominik
A small note:
I have defined the following on the slave:
defmod Luefter FRM_OUT 12
attr Luefter IODev FIRMATA
attr Luefter comment Lüfter Ein/Aus
attr Luefter icon vent_ventilation
attr Luefter room Aussen,Ereignisse,FHEMSync,OWX
attr Luefter stateFormat value
On the master,
defmod Luefter FHEMSYNC_DEVICE FRM_OUT Luefter
attr Luefter userattr comment icon stateFormat
attr Luefter room FHEMSync
was created.
What's missing are the attributes comment, icon and stateFormat.
These attributes were written into userattr without any content.
I only just noticed this. FHEMSync should therefore also carry over comment, icon and stateFormat.
Basically all attributes ;=)
Except, of course, "room" and "IODev".
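The attribute sync requested above could be sketched like this (a hypothetical filter, not the module's actual code, assuming the remote attributes arrive as a plain object from jsonlist2):

```javascript
// Hypothetical sketch of the requested attribute sync (not the module's
// actual code): copy every remote attribute to the master device, skipping
// only the instance-specific ones "room" and "IODev".
const SKIP_ATTRS = new Set(["room", "IODev"]);

function attrsToSync(remoteAttributes) {
  const result = {};
  for (const [name, value] of Object.entries(remoteAttributes)) {
    if (!SKIP_ATTRS.has(name)) {
      result[name] = value; // keeps comment, icon, stateFormat, ...
    }
  }
  return result;
}
```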
Enjoy the rest of the holiday.
Regards, Gerd
@dominik
It also works with a second RPi and querying an I2C BMP180.
Regards, Gerd
Hello Dominik,
I'm also reading readings from the DLNARenderer module.
On the slave, the reading shows
currentTitle NDR 2 - Der Norden h�lt zusammen - ndr.de/ndr2 2020-04-14 18:02:56
on the master it then usually arrives as
currentTitle <BINARY> 2020-04-14 18:02:56
Any idea?
Regards, Jens
@Gerd, I'll look into the attribute sync later today and get back to you.
@Jens, does it already arrive in the DLNARenderer with the "special characters"? In that case I'd actually have to update the DLNARenderer, since I built that one as well. It should never arrive as <BINARY>, though; I'll look into that too.
Hello dominik,
...does it already arrive in the DLNARenderer with the "special characters"? ...
yes, it already arrives there like that.
<BINARY> is also not shown every time; sometimes the original text with special characters comes through.
@Gerd, the problem should now be solved.
Update to 0.9.8 and fhemsync 2.7.1. Thanks for the info that it also works with 2 RPis :)
@Jens, unfortunately I haven't been able to reproduce the problem yet. Could you please post the log excerpt with verbose=5 while the data is being transferred? I'm still wondering how the <BINARY> gets in there.
without special characters
[2020-4-14 22:40:21] [SLAVE1 ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=odroid&fwcsrf=csrf_111645292585337\">playing</a></div>"]
[2020-4-14 22:40:21] [SLAVE1 ] ["DLNA_a307ba9db333-currentTitle","Tom Gregory - Losing Sleep","Tom Gregory - Losing Sleep"]
[2020-4-14 22:40:21] [SLAVE1 ] update reading: DLNA_a307ba9db333-currentTitle => Tom Gregory - Losing Sleep
[2020-4-14 22:40:21] [MAIN ] starting FHEM_execute_await
[2020-4-14 22:40:21] [MASTER ] executing: http://x.x.x.x:8083/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20Tom%20Gregory%20-%20Losing%20Sleep&fwcsrf=csrf_154782731900610
[2020-4-14 22:40:21] [SLAVE1 ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=odroid&fwcsrf=csrf_111645292585337\">playing</a></div>"]
[2020-4-14 22:40:21] [SLAVE1 ] ["DLNA_a307ba9db333-state","playing","playing"]
[2020-4-14 22:40:21] [SLAVE1 ] update reading: DLNA_a307ba9db333-state => playing
[2020-4-14 22:40:21] [MAIN ] starting FHEM_execute_await
[2020-4-14 22:40:21] [MASTER ] executing: http://x.x.x.x:8083/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610
[2020-4-14 22:40:21] [MASTER ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=FHEMSync&fwcsrf=csrf_154782731900610\">playing</a></div>"]
[2020-4-14 22:40:21] [MASTER ] ["DLNA_a307ba9db333-currentTitle","Tom Gregory - Losing Sleep","Tom Gregory - Losing Sleep"]
[2020-4-14 22:40:21] [MASTER ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=FHEMSync&fwcsrf=csrf_154782731900610\">playing</a></div>"]
[2020-4-14 22:40:21] [MASTER ] ["DLNA_a307ba9db333-state","playing","playing"]
[2020-4-14 22:40:21] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_154782731900610","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"http:","slashes":true,"auth":null,"host":"x.x.x.x:8083","port":"8083","hostname":"x.x.x.x","hash":null,"search":"?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20Tom%20Gregory%20-%20Losing%20Sleep&fwcsrf=csrf_154782731900610","query":"XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20Tom%20Gregory%20-%20Losing%20Sleep&fwcsrf=csrf_154782731900610","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20Tom%20Gregory%20-%20Losing%20Sleep&fwcsrf=csrf_154782731900610","href":"http://x.x.x.x:8083/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20Tom%20Gregory%20-%20Losing%20Sleep&fwcsrf=csrf_154782731900610"},"method":"GET","headers":{"authorization":"Basic RkhFTTowMkhKVVdIS0owNQ==","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[2020-4-14 22:40:21] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_154782731900610","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"http:","slashes":true,"auth":null,"host":"x.x.x.x:8083","port":"8083","hostname":"x.x.x.x","hash":null,"search":"?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610","query":"XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610","href":"http://x.x.x.x:8083/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610"},"method":"GET","headers":{"authorization":"Basic RkhFTTowMkhKVVdIS0owNQ==","accept-encoding":"gzip, deflate","accept":"application/json"}}}
with special characters
[2020-4-14 22:43:51] [SLAVE1 ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=odroid&fwcsrf=csrf_111645292585337\">playing</a></div>"]
[2020-4-14 22:43:51] [SLAVE1 ] ["DLNA_a307ba9db333-currentTitle","NDR 2 - Der Norden h�lt zusammen - ndr.de/ndr2","NDR 2 - Der Norden h�lt zusammen - ndr.de/ndr2"]
[2020-4-14 22:43:51] [SLAVE1 ] update reading: DLNA_a307ba9db333-currentTitle => NDR 2 - Der Norden h�lt zusammen - ndr.de/ndr2
[2020-4-14 22:43:51] [MAIN ] starting FHEM_execute_await
[2020-4-14 22:43:51] [MASTER ] executing: http://x.x.x.x:8083/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20NDR%202%20-%20Der%20Norden%20h%EF%BF%BDlt%20zusammen%20-%20ndr.de%2Fndr2&fwcsrf=csrf_154782731900610
[2020-4-14 22:43:51] [SLAVE1 ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=odroid&fwcsrf=csrf_111645292585337\">playing</a></div>"]
[2020-4-14 22:43:51] [SLAVE1 ] ["DLNA_a307ba9db333-state","playing","playing"]
[2020-4-14 22:43:51] [SLAVE1 ] update reading: DLNA_a307ba9db333-state => playing
[2020-4-14 22:43:51] [MAIN ] starting FHEM_execute_await
[2020-4-14 22:43:51] [MASTER ] executing: http://x.x.x.x:8083/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610
[2020-4-14 22:43:51] [MASTER ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=FHEMSync&fwcsrf=csrf_154782731900610\">playing</a></div>"]
[2020-4-14 22:43:51] [MASTER ] ["DLNA_a307ba9db333-currentTitle","NDR 2 - Der Norden h�lt zusammen - ndr.de/ndr2","NDR 2 - Der Norden h�lt zusammen - ndr.de/ndr2"]
[2020-4-14 22:43:51] [MASTER ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=FHEMSync&fwcsrf=csrf_154782731900610\">playing</a></div>"]
[2020-4-14 22:43:51] [MASTER ] ["DLNA_a307ba9db333-state","playing","playing"]
[2020-4-14 22:43:52] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_154782731900610","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"http:","slashes":true,"auth":null,"host":"x.x.x.x:8083","port":"8083","hostname":"x.x.x.x","hash":null,"search":"?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20NDR%202%20-%20Der%20Norden%20h%EF%BF%BDlt%20zusammen%20-%20ndr.de%2Fndr2&fwcsrf=csrf_154782731900610","query":"XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20NDR%202%20-%20Der%20Norden%20h%EF%BF%BDlt%20zusammen%20-%20ndr.de%2Fndr2&fwcsrf=csrf_154782731900610","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20NDR%202%20-%20Der%20Norden%20h%EF%BF%BDlt%20zusammen%20-%20ndr.de%2Fndr2&fwcsrf=csrf_154782731900610","href":"http://x.x.x.x:8083/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20NDR%202%20-%20Der%20Norden%20h%EF%BF%BDlt%20zusammen%20-%20ndr.de%2Fndr2&fwcsrf=csrf_154782731900610"},"method":"GET","headers":{"authorization":"Basic RkhFTTowMkhKVVdIS0owNQ==","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[2020-4-14 22:43:52] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_154782731900610","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"http:","slashes":true,"auth":null,"host":"x.x.x.x:8083","port":"8083","hostname":"x.x.x.x","hash":null,"search":"?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610","query":"XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610","href":"http://x.x.x.x:8083/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20state%20playing&fwcsrf=csrf_154782731900610"},"method":"GET","headers":{"authorization":"Basic RkhFTTowMkhKVVdIS0owNQ==","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[2020-4-14 22:44:51] [SLAVE1 ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=odroid&fwcsrf=csrf_111645292585337\">playing</a></div>"]
[2020-4-14 22:44:51] [SLAVE1 ] ["DLNA_a307ba9db333-currentTitle","Tom Gregory - Run To You - Live in G�ttingen 2018","Tom Gregory - Run To You - Live in G�ttingen 2018"]
[2020-4-14 22:44:51] [SLAVE1 ] update reading: DLNA_a307ba9db333-currentTitle => Tom Gregory - Run To You - Live in G�ttingen 2018
[2020-4-14 22:44:51] [MAIN ] starting FHEM_execute_await
[2020-4-14 22:44:51] [MASTER ] executing: http://x.x.x.x:8083/fhem?XHR=1&cmd=setreading%20DLNA_a307ba9db333%20currentTitle%20Tom%20Gregory%20-%20Run%20To%20You%20-%20Live%20in%20G%EF%BF%BDttingen%202018&fwcsrf=csrf_154782731900610
[2020-4-14 22:44:51] [SLAVE1 ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=odroid&fwcsrf=csrf_111645292585337\">playing</a></div>"]
[2020-4-14 22:44:51] [SLAVE1 ] ["DLNA_a307ba9db333-state","playing","playing"]
[2020-4-14 22:44:51] [SLAVE1 ] update reading: DLNA_a307ba9db333-state => playing
[2020-4-14 22:44:51] [MAIN ] starting FHEM_execute_await
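One detail visible in the "with special characters" URLs above: `%EF%BF%BD` is the UTF-8 percent-encoding of U+FFFD, the Unicode replacement character rendered as `�`, meaning the non-UTF-8 byte was already replaced on the slave side before fhemsync encoded the reading. A small check of that fact, plus a hypothetical guard (my assumption, not fhemsync code) for spotting such readings:

```javascript
// The "�" in the log is U+FFFD, the Unicode replacement character inserted
// when a non-UTF-8 byte is decoded as UTF-8. Percent-encoding it yields
// exactly the %EF%BF%BD sequence seen in the setreading URLs above.
const encoded = encodeURIComponent("h\uFFFDlt"); // "h%EF%BF%BDlt"

// Hypothetical guard (not fhemsync code): flag readings whose value already
// contains the replacement character before forwarding them to the master.
function hasReplacementChar(value) {
  return value.includes("\uFFFD");
}
```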
here the <BINARY> display
[2020-4-14 23:03:04] [MASTER ] Fetching FHEM devices...
[2020-4-14 23:03:04] [MAIN ] starting FHEM_execute_await
[2020-4-14 23:03:04] [MASTER ] executing: http://x.x.x.x:8083/fhem?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE&fwcsrf=csrf_154782731900610
[2020-4-14 23:03:04] [MASTER ] ["DLNA_a307ba9db333","playing","<div id=\"DLNA_a307ba9db333\" title=\"playing\" class=\"col2\"><a href=\"/fhem?cmd.DLNA_a307ba9db333=set DLNA_a307ba9db333 on&room=FHEMSync&fwcsrf=csrf_154782731900610\">playing</a></div>"]
[2020-4-14 23:03:04] [MASTER ] ["DLNA_a307ba9db333-currentTitle","<BINARY>","<BINARY>"]
[2020-4-14 23:03:04] [MASTER ] response: {"statusCode":200,"body":{"Arg":"TYPE=FHEMSYNC_DEVICE","Results":[{"Name":"DLNA_a307ba9db333","PossibleSets":"pause:noArg on:noArg channel:1,2,3,4,5,6,7,8,9,10 next:noArg stop:noArg stream previous:noArg mute:on,off play:noArg off:noArg pauseToggle:noArg seek speak volume:slider,0,1,100 off-till on-for-timer toggle off-for-timer off-till-overnight blink on-till intervals on-till-overnight ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading DbLogExclude DbLogInclude DbLogValueFn:textField-long alexaName alexaProactiveEvents:1,0 alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle genericDeviceType:security,ignore,switch,outlet,light,blind,thermometer,thermostat,contact,garage,window,lock,scene homebridgeMapping:textField-long icon room_map sortby structexclude webCmd webCmdLabel:textField-long widgetOverride channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups oldreadings ttsLanguage webCmd userattr","Internals":{"DEF":"DLNARenderer DLNA_a307ba9db333","FUUID":"5e8c1e88-f33f-df36-04d3-670017854d5c5bd1","FVERSION":"98_DLNARenderer.pm:v2.0.7-s15836/2018-01-09","NAME":"DLNA_a307ba9db333","NR":"207","REMOTENAME":"DLNA_a307ba9db333","REMOTETYPE":"DLNARenderer","STATE":"playing","TYPE":"FHEMSYNC_DEVICE","UDN":"uuid:34153de0-fe3b-4693-9103-a307ba9db333"},"Readings":{"channel":{"Value":"2","Time":"2020-04-07 08:32:40"},"currentTitle":{"Value":"NDR 2 - Der Norden h�lt zusammen - ndr.de/ndr2","Time":"2020-04-14 23:01:54"},"currentTrackURI":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-14 06:31:47"},"friendlyName":{"Value":"Küche","Time":"2020-04-11 03:27:26"},"manufacturer":{"Value":"Lautsprecher 
Teufel GmbH","Time":"2020-04-11 03:27:26"},"manufacturerURL":{"Value":"https://www.teufel.de/","Time":"2020-04-11 03:27:26"},"modelDescription":{"Value":"Virtual Media Player","Time":"2020-04-11 03:27:26"},"modelName":{"Value":"Teufel One M","Time":"2020-04-11 03:27:26"},"modelNumber":{"Value":"1","Time":"2020-04-11 03:27:26"},"multiRoomSupport":{"Value":"0","Time":"2020-04-11 03:27:26"},"multiRoomVolume":{"Value":"19","Time":"2020-04-13 07:31:50"},"mute":{"Value":"0","Time":"2020-04-07 08:32:40"},"presence":{"Value":"online","Time":"2020-04-11 03:27:26"},"state":{"Value":"playing","Time":"2020-04-14 23:01:54"},"stream":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-07 08:32:40"},"transportState":{"Value":"PLAYING","Time":"2020-04-14 06:31:47"},"transportStatus":{"Value":"OK","Time":"2020-04-07 08:32:40"},"volume":{"Value":"19","Time":"2020-04-13 07:33:22"}},"Attributes":{"alias":"Küche","channel_01":"http://91.250.82.237:80/berlin.mp3","channel_02":"http://addrad.io/4WRJh9","oldreadings":"currentTrackURI,channel","room":"FHEMSync","ttsLanguage":"de","userattr":"channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups oldreadings ttsLanguage webCmd","webCmd":"volume"}},{"Name":"DLNA_cd1784188004","PossibleSets":"play:noArg mute:on,off previous:noArg seek volume:slider,0,1,100 speak pauseToggle:noArg off:noArg on:noArg next:noArg channel:1,2,3,4,5,6,7,8,9,10 pause:noArg stop:noArg stream off-till on-for-timer toggle off-for-timer off-till-overnight blink intervals on-till on-till-overnight ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading DbLogExclude DbLogInclude DbLogValueFn:textField-long alexaName alexaProactiveEvents:1,0 alexaRoom cmdIcon 
devStateIcon devStateIcon:textField-long devStateStyle genericDeviceType:security,ignore,switch,outlet,light,blind,thermometer,thermostat,contact,garage,window,lock,scene homebridgeMapping:textField-long icon room_map sortby structexclude webCmd webCmdLabel:textField-long widgetOverride channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups ttsLanguage webCmd userattr","Internals":{"DEF":"DLNARenderer DLNA_cd1784188004","FUUID":"5e8c1e88-f33f-df36-a919-573740df645aecd4","FVERSION":"98_DLNARenderer.pm:v2.0.7-s15836/2018-01-09","NAME":"DLNA_cd1784188004","NR":"208","REMOTENAME":"DLNA_cd1784188004","REMOTETYPE":"DLNARenderer","STATE":"stopped","TYPE":"FHEMSYNC_DEVICE","UDN":"uuid:ff9fb204-1b09-4d02-ab92-cd1784188004"},"Readings":{"channel":{"Value":"2","Time":"2020-04-14 06:25:07"},"currentAlbum":{"Value":"Natural History: The Very Best of Talk Talk","Time":"2020-04-13 11:17:53"},"currentAlbumArtURI":{"Value":"","Time":"2020-04-13 11:21:54"},"currentArtist":{"Value":"Talk Talk","Time":"2020-04-13 11:12:51"},"currentDuration":{"Value":"","Time":"2020-04-14 06:30:09"},"currentOriginalTrackNumber":{"Value":"12","Time":"2020-04-13 11:32:36"},"currentTitle":{"Value":"Nelly Furtado - Say it right","Time":"2020-04-14 06:24:29"},"currentTrackURI":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-13 11:32:40"},"friendlyName":{"Value":"Schlafzimmer","Time":"2020-04-12 12:36:32"},"manufacturer":{"Value":"Lautsprecher Teufel GmbH","Time":"2020-04-12 12:36:32"},"manufacturerURL":{"Value":"https://www.teufel.de/","Time":"2020-04-12 12:36:32"},"modelDescription":{"Value":"Virtual Media Player","Time":"2020-04-12 12:36:32"},"modelName":{"Value":"Teufel One M","Time":"2020-04-12 12:36:32"},"modelNumber":{"Value":"1","Time":"2020-04-12 12:36:32"},"multiRoomSupport":{"Value":"0","Time":"2020-04-12 12:36:32"},"multiRoomVolume":{"Value":"25","Time":"2020-04-14 06:25:03"},"mute":{"Value":"0","Time":"2020-04-10 
15:17:07"},"presence":{"Value":"online","Time":"2020-04-12 12:36:32"},"state":{"Value":"stopped","Time":"2020-04-14 06:50:03"},"stream":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-14 06:25:07"},"transportState":{"Value":"STOPPED","Time":"2020-04-14 06:51:05"},"transportStatus":{"Value":"OK","Time":"2020-04-07 08:32:40"},"volume":{"Value":"25","Time":"2020-04-14 06:25:03"}},"Attributes":{"alias":"Schlafzimmer","channel_01":"http://91.250.82.237:80/berlin.mp3","channel_02":"http://addrad.io/4WRJh9","room":"FHEMSync","userattr":"channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups ttsLanguage webCmd","webCmd":"volume"}},{"Name":"Wecker","PossibleSets":"disable:noArg enable:noArg ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading DbLogExclude DbLogInclude DbLogValueFn:textField-long alexaName alexaProactiveEvents:1,0 alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle genericDeviceType:security,ignore,switch,outlet,light,blind,thermometer,thermostat,contact,garage,window,lock,scene homebridgeMapping:textField-long icon room_map sortby structexclude webCmd webCmdLabel:textField-long widgetOverride 1 userattr","Internals":{"DEF":"DOIF Wecker","FUUID":"5e8c1e88-f33f-df36-1f14-7c6d46dabd22258a","FVERSION":"98_DOIF.pm:0.212240/2020-02-18","MODEL":"Perl","NAME":"Wecker","NOTIFYDEV":"global","NR":"209","NTFY_ORDER":"50-Wecker","REMOTENAME":"Wecker","REMOTETYPE":"DOIF","STATE":"off","TYPE":"FHEMSYNC_DEVICE","VERSION":"21224 2020-02-18 18:45:49"},"Readings":{"block_01":{"Value":"executed","Time":"2020-04-07 08:32:40"},"mode":{"Value":"enabled","Time":"2020-04-07 08:32:40"},"state":{"Value":"off","Time":"2020-04-14 
06:50:03"},"timer_01_c01":{"Value":"15.04.2020 06:25:00|8","Time":"2020-04-14 06:30:09"},"timer_02_c01":{"Value":"15.04.2020 08:00:00|7","Time":"2020-04-14 08:00:01"},"timer_03_c01":{"Value":"15.04.2020 06:50:00|8","Time":"2020-04-14 06:51:05"},"timer_04_c01":{"Value":"15.04.2020 22:00:00|7","Time":"2020-04-14 22:03:11"}},"Attributes":{"room":"FHEMSync","userattr":"1"}}],"totalResultsReturned":3},"headers":{"content-length":"2060","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_154782731900610","content-type":"application/json; charset=utf-8"},"request":{"uri":{"protocol":"http:","slashes":true,"auth":null,"host":"x.x.x.x:8083","port":"8083","hostname":"x.x.x.x","hash":null,"search":"?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE&fwcsrf=csrf_154782731900610","query":"XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE&fwcsrf=csrf_154782731900610","pathname":"/fhem","path":"/fhem?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE&fwcsrf=csrf_154782731900610","href":"http://x.x.x.x:8083/fhem?XHR=1&cmd=jsonlist2%20TYPE%3DFHEMSYNC_DEVICE&fwcsrf=csrf_154782731900610"},"method":"GET","headers":{"authorization":"Basic RkhFTTowMkhKVVdIS0owNQ==","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[2020-4-14 23:03:04] [MASTER ] response: {"Arg":"TYPE=FHEMSYNC_DEVICE","Results":[{"Name":"DLNA_a307ba9db333","PossibleSets":"pause:noArg on:noArg channel:1,2,3,4,5,6,7,8,9,10 next:noArg stop:noArg stream previous:noArg mute:on,off play:noArg off:noArg pauseToggle:noArg seek speak volume:slider,0,1,100 off-till on-for-timer toggle off-for-timer off-till-overnight blink on-till intervals on-till-overnight ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading DbLogExclude DbLogInclude DbLogValueFn:textField-long alexaName alexaProactiveEvents:1,0 alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle genericDeviceType:security,ignore,switch,outlet,light,blind,thermometer,thermostat,contact,garage,window,lock,scene homebridgeMapping:textField-long icon room_map sortby structexclude webCmd webCmdLabel:textField-long widgetOverride channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups oldreadings ttsLanguage webCmd userattr","Internals":{"DEF":"DLNARenderer DLNA_a307ba9db333","FUUID":"5e8c1e88-f33f-df36-04d3-670017854d5c5bd1","FVERSION":"98_DLNARenderer.pm:v2.0.7-s15836/2018-01-09","NAME":"DLNA_a307ba9db333","NR":"207","REMOTENAME":"DLNA_a307ba9db333","REMOTETYPE":"DLNARenderer","STATE":"playing","TYPE":"FHEMSYNC_DEVICE","UDN":"uuid:34153de0-fe3b-4693-9103-a307ba9db333"},"Readings":{"channel":{"Value":"2","Time":"2020-04-07 08:32:40"},"currentTitle":{"Value":"NDR 2 - Der Norden h�lt zusammen - ndr.de/ndr2","Time":"2020-04-14 23:01:54"},"currentTrackURI":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-14 06:31:47"},"friendlyName":{"Value":"Küche","Time":"2020-04-11 03:27:26"},"manufacturer":{"Value":"Lautsprecher Teufel 
GmbH","Time":"2020-04-11 03:27:26"},"manufacturerURL":{"Value":"https://www.teufel.de/","Time":"2020-04-11 03:27:26"},"modelDescription":{"Value":"Virtual Media Player","Time":"2020-04-11 03:27:26"},"modelName":{"Value":"Teufel One M","Time":"2020-04-11 03:27:26"},"modelNumber":{"Value":"1","Time":"2020-04-11 03:27:26"},"multiRoomSupport":{"Value":"0","Time":"2020-04-11 03:27:26"},"multiRoomVolume":{"Value":"19","Time":"2020-04-13 07:31:50"},"mute":{"Value":"0","Time":"2020-04-07 08:32:40"},"presence":{"Value":"online","Time":"2020-04-11 03:27:26"},"state":{"Value":"playing","Time":"2020-04-14 23:01:54"},"stream":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-07 08:32:40"},"transportState":{"Value":"PLAYING","Time":"2020-04-14 06:31:47"},"transportStatus":{"Value":"OK","Time":"2020-04-07 08:32:40"},"volume":{"Value":"19","Time":"2020-04-13 07:33:22"}},"Attributes":{"alias":"Küche","channel_01":"http://91.250.82.237:80/berlin.mp3","channel_02":"http://addrad.io/4WRJh9","oldreadings":"currentTrackURI,channel","room":"FHEMSync","ttsLanguage":"de","userattr":"channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups oldreadings ttsLanguage webCmd","webCmd":"volume"}},{"Name":"DLNA_cd1784188004","PossibleSets":"play:noArg mute:on,off previous:noArg seek volume:slider,0,1,100 speak pauseToggle:noArg off:noArg on:noArg next:noArg channel:1,2,3,4,5,6,7,8,9,10 pause:noArg stop:noArg stream off-till on-for-timer toggle off-for-timer off-till-overnight blink intervals on-till on-till-overnight ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading DbLogExclude DbLogInclude DbLogValueFn:textField-long alexaName alexaProactiveEvents:1,0 alexaRoom cmdIcon devStateIcon 
devStateIcon:textField-long devStateStyle genericDeviceType:security,ignore,switch,outlet,light,blind,thermometer,thermostat,contact,garage,window,lock,scene homebridgeMapping:textField-long icon room_map sortby structexclude webCmd webCmdLabel:textField-long widgetOverride channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups ttsLanguage webCmd userattr","Internals":{"DEF":"DLNARenderer DLNA_cd1784188004","FUUID":"5e8c1e88-f33f-df36-a919-573740df645aecd4","FVERSION":"98_DLNARenderer.pm:v2.0.7-s15836/2018-01-09","NAME":"DLNA_cd1784188004","NR":"208","REMOTENAME":"DLNA_cd1784188004","REMOTETYPE":"DLNARenderer","STATE":"stopped","TYPE":"FHEMSYNC_DEVICE","UDN":"uuid:ff9fb204-1b09-4d02-ab92-cd1784188004"},"Readings":{"channel":{"Value":"2","Time":"2020-04-14 06:25:07"},"currentAlbum":{"Value":"Natural History: The Very Best of Talk Talk","Time":"2020-04-13 11:17:53"},"currentAlbumArtURI":{"Value":"","Time":"2020-04-13 11:21:54"},"currentArtist":{"Value":"Talk Talk","Time":"2020-04-13 11:12:51"},"currentDuration":{"Value":"","Time":"2020-04-14 06:30:09"},"currentOriginalTrackNumber":{"Value":"12","Time":"2020-04-13 11:32:36"},"currentTitle":{"Value":"Nelly Furtado - Say it right","Time":"2020-04-14 06:24:29"},"currentTrackURI":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-13 11:32:40"},"friendlyName":{"Value":"Schlafzimmer","Time":"2020-04-12 12:36:32"},"manufacturer":{"Value":"Lautsprecher Teufel GmbH","Time":"2020-04-12 12:36:32"},"manufacturerURL":{"Value":"https://www.teufel.de/","Time":"2020-04-12 12:36:32"},"modelDescription":{"Value":"Virtual Media Player","Time":"2020-04-12 12:36:32"},"modelName":{"Value":"Teufel One M","Time":"2020-04-12 12:36:32"},"modelNumber":{"Value":"1","Time":"2020-04-12 12:36:32"},"multiRoomSupport":{"Value":"0","Time":"2020-04-12 12:36:32"},"multiRoomVolume":{"Value":"25","Time":"2020-04-14 06:25:03"},"mute":{"Value":"0","Time":"2020-04-10 
15:17:07"},"presence":{"Value":"online","Time":"2020-04-12 12:36:32"},"state":{"Value":"stopped","Time":"2020-04-14 06:50:03"},"stream":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-14 06:25:07"},"transportState":{"Value":"STOPPED","Time":"2020-04-14 06:51:05"},"transportStatus":{"Value":"OK","Time":"2020-04-07 08:32:40"},"volume":{"Value":"25","Time":"2020-04-14 06:25:03"}},"Attributes":{"alias":"Schlafzimmer","channel_01":"http://91.250.82.237:80/berlin.mp3","channel_02":"http://addrad.io/4WRJh9","room":"FHEMSync","userattr":"channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups ttsLanguage webCmd","webCmd":"volume"}},{"Name":"Wecker","PossibleSets":"disable:noArg enable:noArg ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading DbLogExclude DbLogInclude DbLogValueFn:textField-long alexaName alexaProactiveEvents:1,0 alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle genericDeviceType:security,ignore,switch,outlet,light,blind,thermometer,thermostat,contact,garage,window,lock,scene homebridgeMapping:textField-long icon room_map sortby structexclude webCmd webCmdLabel:textField-long widgetOverride 1 userattr","Internals":{"DEF":"DOIF Wecker","FUUID":"5e8c1e88-f33f-df36-1f14-7c6d46dabd22258a","FVERSION":"98_DOIF.pm:0.212240/2020-02-18","MODEL":"Perl","NAME":"Wecker","NOTIFYDEV":"global","NR":"209","NTFY_ORDER":"50-Wecker","REMOTENAME":"Wecker","REMOTETYPE":"DOIF","STATE":"off","TYPE":"FHEMSYNC_DEVICE","VERSION":"21224 2020-02-18 18:45:49"},"Readings":{"block_01":{"Value":"executed","Time":"2020-04-07 08:32:40"},"mode":{"Value":"enabled","Time":"2020-04-07 08:32:40"},"state":{"Value":"off","Time":"2020-04-14 
06:50:03"},"timer_01_c01":{"Value":"15.04.2020 06:25:00|8","Time":"2020-04-14 06:30:09"},"timer_02_c01":{"Value":"15.04.2020 08:00:00|7","Time":"2020-04-14 08:00:01"},"timer_03_c01":{"Value":"15.04.2020 06:50:00|8","Time":"2020-04-14 06:51:05"},"timer_04_c01":{"Value":"15.04.2020 22:00:00|7","Time":"2020-04-14 22:03:11"}},"Attributes":{"room":"FHEMSync","userattr":"1"}}],"totalResultsReturned":3}
[2020-4-14 23:03:04] [MASTER ] got: 3 devices
[2020-4-14 23:03:04] [SLAVE1 ] Fetching FHEM devices...
[2020-4-14 23:03:04] [MAIN ] starting FHEM_execute_await
[2020-4-14 23:03:04] [SLAVE1 ] executing: http://x.x.x.x:8083/fhem?XHR=1&cmd=jsonlist2%20room%3Dodroid&fwcsrf=csrf_111645292585337
[2020-4-14 23:03:04] [SLAVE1 ] response: {"statusCode":200,"body":{"Arg":"room=odroid","Results":[{"Name":"DLNA_a307ba9db333","PossibleSets":"mute:on,off play:noArg previous:noArg speak volume:slider,0,1,100 seek pauseToggle:noArg off:noArg channel:1,2,3,4,5,6,7,8,9,10 next:noArg on:noArg pause:noArg stream stop:noArg on-till-overnight intervals on-till blink off-till-overnight off-for-timer toggle on-for-timer off-till ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 ignoredIPs usedonlyIPs event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading alexaName alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle icon sortby webCmd webCmdLabel:textField-long widgetOverride channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups oldreadings ttsLanguage webCmd userattr","Internals":{"DEF":"uuid:34153de0-fe3b-4693-9103-a307ba9db333","FUUID":"5e761f4d-f33f-0d83-284a-46aeffbeb4bf37dc","FVERSION":"98_DLNARenderer.pm:v2.0.7-s15836/2018-01-09","NAME":"DLNA_a307ba9db333","NR":"27","STATE":"playing","TYPE":"DLNARenderer","UDN":"uuid:34153de0-fe3b-4693-9103-a307ba9db333"},"Readings":{"channel":{"Value":"2","Time":"2020-04-05 13:05:05"},"currentAlbumArtURI":{"Value":"","Time":"2020-04-05 11:17:24"},"currentDuration":{"Value":"","Time":"2020-04-05 11:17:24"},"currentTitle":{"Value":"<BINARY>","Time":"2020-04-14 23:01:54"},"currentTrackURI":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-14 06:31:43"},"friendlyName":{"Value":"Küche","Time":"2020-04-12 12:36:30"},"manufacturer":{"Value":"Lautsprecher Teufel GmbH","Time":"2020-04-12 12:36:30"},"manufacturerURL":{"Value":"https://www.teufel.de/","Time":"2020-04-12 12:36:30"},"modelDescription":{"Value":"Virtual Media Player","Time":"2020-04-12 
12:36:30"},"modelName":{"Value":"Teufel One M","Time":"2020-04-12 12:36:30"},"modelNumber":{"Value":"1","Time":"2020-04-12 12:36:30"},"multiRoomSupport":{"Value":"0","Time":"2020-04-12 12:36:30"},"multiRoomVolume":{"Value":"19","Time":"2020-04-13 07:31:47"},"mute":{"Value":"0","Time":"2020-04-05 13:05:04"},"presence":{"Value":"online","Time":"2020-04-12 12:36:30"},"state":{"Value":"playing","Time":"2020-04-14 23:01:54"},"stream":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-05 13:05:04"},"transportState":{"Value":"PLAYING","Time":"2020-04-14 06:31:45"},"transportStatus":{"Value":"OK","Time":"2020-04-09 07:38:07"},"volume":{"Value":"19","Time":"2020-04-13 07:31:47"}},"Attributes":{"alias":"Küche","channel_01":"http://91.250.82.237:80/berlin.mp3","channel_02":"http://addrad.io/4WRJh9","oldreadings":"currentTrackURI,channel","room":"odroid","ttsLanguage":"de","userattr":"channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups oldreadings ttsLanguage webCmd","webCmd":"volume"}},{"Name":"DLNA_cd1784188004","PossibleSets":"stream stop:noArg pause:noArg next:noArg channel:1,2,3,4,5,6,7,8,9,10 on:noArg pauseToggle:noArg off:noArg volume:slider,0,1,100 speak seek previous:noArg play:noArg mute:on,off on-till-overnight intervals on-till off-till-overnight blink toggle on-for-timer off-for-timer off-till ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 ignoredIPs usedonlyIPs event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading alexaName alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle icon sortby webCmd webCmdLabel:textField-long widgetOverride channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups ttsLanguage webCmd 
userattr","Internals":{"DEF":"uuid:ff9fb204-1b09-4d02-ab92-cd1784188004","FUUID":"5e8221bb-f33f-0d83-a6a3-72436b6aa2c4ab4e","FVERSION":"98_DLNARenderer.pm:v2.0.7-s15836/2018-01-09","NAME":"DLNA_cd1784188004","NR":"32","STATE":"stopped","TYPE":"DLNARenderer","UDN":"uuid:ff9fb204-1b09-4d02-ab92-cd1784188004"},"Readings":{"channel":{"Value":"2","Time":"2020-04-14 06:25:05"},"currentAlbum":{"Value":"Natural History: The Very Best of Talk Talk","Time":"2020-04-13 11:17:52"},"currentAlbumArtURI":{"Value":"","Time":"2020-04-14 06:25:05"},"currentArtist":{"Value":"Talk Talk","Time":"2020-04-13 11:12:51"},"currentDuration":{"Value":"","Time":"2020-04-14 06:25:05"},"currentOriginalTrackNumber":{"Value":"12","Time":"2020-04-13 11:32:36"},"currentTitle":{"Value":"Nelly Furtado - Say it right","Time":"2020-04-14 06:24:24"},"currentTrackURI":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-13 11:32:40"},"friendlyName":{"Value":"Schlafzimmer","Time":"2020-04-12 12:36:32"},"manufacturer":{"Value":"Lautsprecher Teufel GmbH","Time":"2020-04-12 12:36:32"},"manufacturerURL":{"Value":"https://www.teufel.de/","Time":"2020-04-12 12:36:32"},"modelDescription":{"Value":"Virtual Media Player","Time":"2020-04-12 12:36:32"},"modelName":{"Value":"Teufel One M","Time":"2020-04-12 12:36:32"},"modelNumber":{"Value":"1","Time":"2020-04-12 12:36:32"},"multiRoomSupport":{"Value":"0","Time":"2020-04-12 12:36:32"},"multiRoomVolume":{"Value":"25","Time":"2020-04-14 06:25:00"},"mute":{"Value":"0","Time":"2020-04-10 15:17:06"},"presence":{"Value":"online","Time":"2020-04-12 12:36:32"},"state":{"Value":"stopped","Time":"2020-04-14 06:50:00"},"stream":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-14 06:25:05"},"transportState":{"Value":"STOPPED","Time":"2020-04-14 06:50:00"},"transportStatus":{"Value":"OK","Time":"2020-04-05 13:05:04"},"volume":{"Value":"25","Time":"2020-04-14 
06:25:00"}},"Attributes":{"alias":"Schlafzimmer","channel_01":"http://91.250.82.237:80/berlin.mp3","channel_02":"http://addrad.io/4WRJh9","room":"odroid","userattr":"channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups ttsLanguage webCmd","webCmd":"volume"}},{"Name":"Wecker","PossibleSets":"disable:noArg enable:noArg ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 disable:0,1 loglevel:0,1,2,3,4,5,6 notexist checkReadingEvent:0,1 addStateEvent:1,0 weekdays setList:textField-long readingList DOIF_Readings:textField-long event_Readings:textField-long uiTable:textField-long event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading alexaName alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle icon sortby webCmd webCmdLabel:textField-long widgetOverride 1 userattr","Internals":{"DEF":"{if ([06:25|8] or [08:00|7]) {fhem_set (\"DLNA_cd1784188004 play\"); \nfhem_set (\"DLNA_cd1784188004 volume 25\"); set_Exec(\"Sender\", 5,'fhem_set (\"DLNA_cd1784188004 channel 2\")');\nset_State(\"on\")}\nelsif ([06:50|8] or [22:00|7]) {fhem_set (\"DLNA_cd1784188004 off\"); set_State(\"off\")}}","FUUID":"5c4c1350-f33f-0d83-23f1-5c3bb732c9bda914","FVERSION":"98_DOIF.pm:0.212240/2020-02-18","MODEL":"Perl","NAME":"Wecker","NOTIFYDEV":"global","NR":"17","NTFY_ORDER":"50-Wecker","STATE":"off","TYPE":"DOIF","VERSION":"21224 2020-02-18 18:45:49"},"Readings":{"block_01":{"Value":"executed","Time":"2020-04-14 06:50:00"},"mode":{"Value":"enabled","Time":"2020-04-05 13:05:05"},"state":{"Value":"off","Time":"2020-04-14 06:50:00"},"timer_01_c01":{"Value":"15.04.2020 06:25:00|8","Time":"2020-04-14 06:25:00"},"timer_02_c01":{"Value":"15.04.2020 08:00:00|7","Time":"2020-04-14 08:00:00"},"timer_03_c01":{"Value":"15.04.2020 
06:50:00|8","Time":"2020-04-14 06:50:00"},"timer_04_c01":{"Value":"15.04.2020 22:00:00|7","Time":"2020-04-14 22:00:00"}},"Attributes":{"room":"Teufel,odroid","userattr":"1"}}],"totalResultsReturned":3},"headers":{"content-length":"2060","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_111645292585337","content-type":"application/json; charset=utf-8"},"request":{"uri":{"protocol":"http:","slashes":true,"auth":null,"host":"x.x.x.x:8083","port":"8083","hostname":"x.x.x.x","hash":null,"search":"?XHR=1&cmd=jsonlist2%20room%3Dodroid&fwcsrf=csrf_111645292585337","query":"XHR=1&cmd=jsonlist2%20room%3Dodroid&fwcsrf=csrf_111645292585337","pathname":"/fhem","path":"/fhem?XHR=1&cmd=jsonlist2%20room%3Dodroid&fwcsrf=csrf_111645292585337","href":"http://x.x.x.x:8083/fhem?XHR=1&cmd=jsonlist2%20room%3Dodroid&fwcsrf=csrf_111645292585337"},"method":"GET","headers":{"authorization":"Basic ZmhlbToqMDJrbGVpbmVyVGV1ZmVsMDUj","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[2020-4-14 23:03:04] [SLAVE1 ] response: {"Arg":"room=odroid","Results":[{"Name":"DLNA_a307ba9db333","PossibleSets":"mute:on,off play:noArg previous:noArg speak volume:slider,0,1,100 seek pauseToggle:noArg off:noArg channel:1,2,3,4,5,6,7,8,9,10 next:noArg on:noArg pause:noArg stream stop:noArg on-till-overnight intervals on-till blink off-till-overnight off-for-timer toggle on-for-timer off-till ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 ignoredIPs usedonlyIPs event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading alexaName alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle icon sortby webCmd webCmdLabel:textField-long widgetOverride channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups oldreadings ttsLanguage webCmd userattr","Internals":{"DEF":"uuid:34153de0-fe3b-4693-9103-a307ba9db333","FUUID":"5e761f4d-f33f-0d83-284a-46aeffbeb4bf37dc","FVERSION":"98_DLNARenderer.pm:v2.0.7-s15836/2018-01-09","NAME":"DLNA_a307ba9db333","NR":"27","STATE":"playing","TYPE":"DLNARenderer","UDN":"uuid:34153de0-fe3b-4693-9103-a307ba9db333"},"Readings":{"channel":{"Value":"2","Time":"2020-04-05 13:05:05"},"currentAlbumArtURI":{"Value":"","Time":"2020-04-05 11:17:24"},"currentDuration":{"Value":"","Time":"2020-04-05 11:17:24"},"currentTitle":{"Value":"<BINARY>","Time":"2020-04-14 23:01:54"},"currentTrackURI":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-14 06:31:43"},"friendlyName":{"Value":"Küche","Time":"2020-04-12 12:36:30"},"manufacturer":{"Value":"Lautsprecher Teufel GmbH","Time":"2020-04-12 12:36:30"},"manufacturerURL":{"Value":"https://www.teufel.de/","Time":"2020-04-12 12:36:30"},"modelDescription":{"Value":"Virtual Media Player","Time":"2020-04-12 
12:36:30"},"modelName":{"Value":"Teufel One M","Time":"2020-04-12 12:36:30"},"modelNumber":{"Value":"1","Time":"2020-04-12 12:36:30"},"multiRoomSupport":{"Value":"0","Time":"2020-04-12 12:36:30"},"multiRoomVolume":{"Value":"19","Time":"2020-04-13 07:31:47"},"mute":{"Value":"0","Time":"2020-04-05 13:05:04"},"presence":{"Value":"online","Time":"2020-04-12 12:36:30"},"state":{"Value":"playing","Time":"2020-04-14 23:01:54"},"stream":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-05 13:05:04"},"transportState":{"Value":"PLAYING","Time":"2020-04-14 06:31:45"},"transportStatus":{"Value":"OK","Time":"2020-04-09 07:38:07"},"volume":{"Value":"19","Time":"2020-04-13 07:31:47"}},"Attributes":{"alias":"Küche","channel_01":"http://91.250.82.237:80/berlin.mp3","channel_02":"http://addrad.io/4WRJh9","oldreadings":"currentTrackURI,channel","room":"odroid","ttsLanguage":"de","userattr":"channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups oldreadings ttsLanguage webCmd","webCmd":"volume"}},{"Name":"DLNA_cd1784188004","PossibleSets":"stream stop:noArg pause:noArg next:noArg channel:1,2,3,4,5,6,7,8,9,10 on:noArg pauseToggle:noArg off:noArg volume:slider,0,1,100 speak seek previous:noArg play:noArg mute:on,off on-till-overnight intervals on-till off-till-overnight blink toggle on-for-timer off-for-timer off-till ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 ignoredIPs usedonlyIPs event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading alexaName alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle icon sortby webCmd webCmdLabel:textField-long widgetOverride channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups ttsLanguage webCmd 
userattr","Internals":{"DEF":"uuid:ff9fb204-1b09-4d02-ab92-cd1784188004","FUUID":"5e8221bb-f33f-0d83-a6a3-72436b6aa2c4ab4e","FVERSION":"98_DLNARenderer.pm:v2.0.7-s15836/2018-01-09","NAME":"DLNA_cd1784188004","NR":"32","STATE":"stopped","TYPE":"DLNARenderer","UDN":"uuid:ff9fb204-1b09-4d02-ab92-cd1784188004"},"Readings":{"channel":{"Value":"2","Time":"2020-04-14 06:25:05"},"currentAlbum":{"Value":"Natural History: The Very Best of Talk Talk","Time":"2020-04-13 11:17:52"},"currentAlbumArtURI":{"Value":"","Time":"2020-04-14 06:25:05"},"currentArtist":{"Value":"Talk Talk","Time":"2020-04-13 11:12:51"},"currentDuration":{"Value":"","Time":"2020-04-14 06:25:05"},"currentOriginalTrackNumber":{"Value":"12","Time":"2020-04-13 11:32:36"},"currentTitle":{"Value":"Nelly Furtado - Say it right","Time":"2020-04-14 06:24:24"},"currentTrackURI":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-13 11:32:40"},"friendlyName":{"Value":"Schlafzimmer","Time":"2020-04-12 12:36:32"},"manufacturer":{"Value":"Lautsprecher Teufel GmbH","Time":"2020-04-12 12:36:32"},"manufacturerURL":{"Value":"https://www.teufel.de/","Time":"2020-04-12 12:36:32"},"modelDescription":{"Value":"Virtual Media Player","Time":"2020-04-12 12:36:32"},"modelName":{"Value":"Teufel One M","Time":"2020-04-12 12:36:32"},"modelNumber":{"Value":"1","Time":"2020-04-12 12:36:32"},"multiRoomSupport":{"Value":"0","Time":"2020-04-12 12:36:32"},"multiRoomVolume":{"Value":"25","Time":"2020-04-14 06:25:00"},"mute":{"Value":"0","Time":"2020-04-10 15:17:06"},"presence":{"Value":"online","Time":"2020-04-12 12:36:32"},"state":{"Value":"stopped","Time":"2020-04-14 06:50:00"},"stream":{"Value":"http://addrad.io/4WRJh9","Time":"2020-04-14 06:25:05"},"transportState":{"Value":"STOPPED","Time":"2020-04-14 06:50:00"},"transportStatus":{"Value":"OK","Time":"2020-04-05 13:05:04"},"volume":{"Value":"25","Time":"2020-04-14 
06:25:00"}},"Attributes":{"alias":"Schlafzimmer","channel_01":"http://91.250.82.237:80/berlin.mp3","channel_02":"http://addrad.io/4WRJh9","room":"odroid","userattr":"channel_01 channel_02 channel_03 channel_04 channel_05 channel_06 channel_07 channel_08 channel_09 channel_10 multiRoomGroups ttsLanguage webCmd","webCmd":"volume"}},{"Name":"Wecker","PossibleSets":"disable:noArg enable:noArg ","PossibleAttrs":"alias comment:textField-long eventMap:textField-long group room suppressReading userReadings:textField-long verbose:0,1,2,3,4,5 disable:0,1 loglevel:0,1,2,3,4,5,6 notexist checkReadingEvent:0,1 addStateEvent:1,0 weekdays setList:textField-long readingList DOIF_Readings:textField-long event_Readings:textField-long uiTable:textField-long event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading alexaName alexaRoom cmdIcon devStateIcon devStateIcon:textField-long devStateStyle icon sortby webCmd webCmdLabel:textField-long widgetOverride 1 userattr","Internals":{"DEF":"{if ([06:25|8] or [08:00|7]) {fhem_set (\"DLNA_cd1784188004 play\"); \nfhem_set (\"DLNA_cd1784188004 volume 25\"); set_Exec(\"Sender\", 5,'fhem_set (\"DLNA_cd1784188004 channel 2\")');\nset_State(\"on\")}\nelsif ([06:50|8] or [22:00|7]) {fhem_set (\"DLNA_cd1784188004 off\"); set_State(\"off\")}}","FUUID":"5c4c1350-f33f-0d83-23f1-5c3bb732c9bda914","FVERSION":"98_DOIF.pm:0.212240/2020-02-18","MODEL":"Perl","NAME":"Wecker","NOTIFYDEV":"global","NR":"17","NTFY_ORDER":"50-Wecker","STATE":"off","TYPE":"DOIF","VERSION":"21224 2020-02-18 18:45:49"},"Readings":{"block_01":{"Value":"executed","Time":"2020-04-14 06:50:00"},"mode":{"Value":"enabled","Time":"2020-04-05 13:05:05"},"state":{"Value":"off","Time":"2020-04-14 06:50:00"},"timer_01_c01":{"Value":"15.04.2020 06:25:00|8","Time":"2020-04-14 06:25:00"},"timer_02_c01":{"Value":"15.04.2020 08:00:00|7","Time":"2020-04-14 08:00:00"},"timer_03_c01":{"Value":"15.04.2020 
06:50:00|8","Time":"2020-04-14 06:50:00"},"timer_04_c01":{"Value":"15.04.2020 22:00:00|7","Time":"2020-04-14 22:00:00"}},"Attributes":{"room":"Teufel,odroid","userattr":"1"}}],"totalResultsReturned":3}
[2020-4-14 23:03:04] [SLAVE1 ] got: 3 devices
[2020-4-14 23:03:04] [MAIN ] Monitoring remote device: DLNA_a307ba9db333
[2020-4-14 23:03:04] [MAIN ] starting FHEM_execute_await
[2020-4-14 23:03:04] [MASTER ] executing: http://x.x.x.x:8083/fhem?channel ....
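The log above shows how fhemsync talks to FHEMWEB: it issues a `jsonlist2` command over HTTP with `XHR=1` and the CSRF token appended as `fwcsrf` (e.g. `cmd=jsonlist2%20room%3Dodroid`). As a rough sketch, such a request URL can be assembled like this (the helper names are mine for illustration, not code from the module):

```javascript
// Build a FHEMWEB jsonlist2 URL like the one in the log above.
// The parameter names (XHR, cmd, fwcsrf) are FHEMWEB's own;
// the function itself is only an illustrative sketch.
function buildJsonlist2Url(server, port, webname, filter, csrfToken) {
  const cmd = encodeURIComponent(`jsonlist2 ${filter}`);
  let url = `http://${server}:${port}/${webname}?XHR=1&cmd=${cmd}`;
  if (csrfToken) url += `&fwcsrf=${csrfToken}`;
  return url;
}

// Count devices in a parsed jsonlist2 response body, matching the
// "got: 3 devices" lines in the log.
function deviceCount(body) {
  return body.totalResultsReturned ?? (body.Results || []).length;
}
```

With the values from the log, `buildJsonlist2Url("x.x.x.x", 8083, "fhem", "room=odroid", "csrf_111645292585337")` reproduces the URL shown in the SLAVE1 "executing" line.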
@dominik
Great. I won't be able to test it until Wednesday evening :o
Have to paint the carport first ;)
Regards, Gerd
@dominik,
Morning,
unfortunately I'm a day late with trying it out :-\
The house needed a bit of cleaning again.
Today I installed the modules and ran an update in the shell.
Everything worked!
As I have now seen, the attributes even change on the slave when I change them on the master ;)
Is that intended ;) ?
The userattr now contains the following: comment icon interval model resolution stateFormat tempHigh tempLow
Is that meant to tell the user which attributes exist on the slave device?
Let's see what time brings ;)
I installed the modules and then ran a FHEM update. Now there is a message in the log that concerns FHEMSync:
2020.04.16 19:34:36 1: HMCCURPCPROC: [d_rpc178042BidCos_RF : 22857] Scheduled CCU ping every 300 seconds
2020.04.16 19:34:40 1: PERL WARNING: Use of uninitialized value in string ne at ./FHEM/10_FHEMSYNC_DEVICE.pm line 171.
2020.04.16 19:40:53 1: PERL WARNING: Argument "" isn't numeric in numeric ge (>=) at FHEM/TimeSeries.pm line 265.
2020.04.16 19:40:53 1: PERL WARNING: Use of uninitialized value $value in concatenation (.) or string at fhem.pl line 4917.
2020.04.16 19:40:53 1: PERL WARNING: Use of uninitialized value in substitution iterator at fhem.pl line 4656.
2020.04.16 19:41:30 1: PERL WARNING: Use of uninitialized value in string ne at fhem.pl line 4850.
2020.04.16 19:41:30 1: PERL WARNING: Use of uninitialized value in string eq at fhem.pl line 4861.
The TimeSeries warnings have been appearing for a while already. The rest presumably has nothing to do with FHEMSync.
Thanks for now....
Regards,
Gerd
Hello Dominik,
is it "normal" that the following keeps appearing in the log:
...
[18.4.2020, 20:53:22] [MAIN ] Monitoring remote device: BMP180
[18.4.2020, 20:53:34] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:53:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:53:57] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[18.4.2020, 20:53:57] [MAIN ] Monitoring remote device: FIRMATA
[18.4.2020, 20:53:57] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[18.4.2020, 20:53:57] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[18.4.2020, 20:53:57] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[18.4.2020, 20:53:57] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[18.4.2020, 20:53:57] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[18.4.2020, 20:53:57] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[18.4.2020, 20:53:58] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[18.4.2020, 20:53:58] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[18.4.2020, 20:53:58] [MAIN ] Monitoring remote device: Luefter
[18.4.2020, 20:53:58] [MAIN ] Monitoring remote device: NAFT.002
[18.4.2020, 20:53:58] [MAIN ] Monitoring remote device: NAVOC.002
[18.4.2020, 20:53:58] [MAIN ] Monitoring remote device: TA_28_736020050000
[18.4.2020, 20:53:58] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[18.4.2020, 20:53:58] [MAIN ] Monitoring remote device: UKW_12V
[18.4.2020, 20:53:58] [MAIN ] Monitoring remote device: UKW_ON
[18.4.2020, 20:54:04] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:54:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:54:34] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:54:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:55:04] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:55:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:55:34] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:55:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:56:04] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:56:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:56:34] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:57:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:57:04] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:57:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:57:34] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:58:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:58:04] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:58:23] [MAIN ] Monitoring remote device: BMP180
[18.4.2020, 20:58:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:58:34] [SLAVE2 ] longpoll end: retry in: 30000msec
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: FIRMATA
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: Luefter
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: NAFT.002
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: NAVOC.002
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: TA_28_736020050000
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: UKW_12V
[18.4.2020, 20:58:59] [MAIN ] Monitoring remote device: UKW_ON
[18.4.2020, 20:59:04] [SLAVE1 ] longpoll end: retry in: 30000msec
....
Is the device deleted and re-created every time? I don't think that was the case before.
However, I have only kept the last three days of logs, so I don't know whether this happened earlier as well ::)
Between ~5:xx and ~15:xx there was some kind of problem again today.
Nothing was synchronized during that time.
The slave, however, ran without problems.
Today's log is attached as a ZIP. Verbose is 0 though.....
Regards, Gerd
Edit: ZIP deleted
Hi Gerd,
thanks for the feedback!
The "uninitialized" error should now be fixed in version 0.9.9 (first post). Unfortunately I have to handle the many userattr entries this way, since FHEM offers no other way to add attributes to a device after the fact. The list contains all attributes that are not part of the standard set.
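The userattr mechanism described here can be illustrated with plain FHEM commands (the device name and attribute values below are placeholders, not taken from the module):

```
attr mySyncedDevice userattr interval model resolution tempHigh tempLow
attr mySyncedDevice interval 300
```

The first command makes the listed attributes settable on that one device; without the userattr entry, FHEM would reject them as unknown attributes. This is why the synchronized devices carry such long userattr lists.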
The longpoll errors and the "Failed to fetch devices" errors in your log are very odd. I had the second Pi switched off for 6 hours today, and during that time I got the "Failed to fetch devices" message. The longpoll errors sometimes indicate that FHEM is blocked for a few seconds, which ends the longpoll. Could that be the case? The "Failed to fetch devices" message can only occur when FHEM is unreachable. I have just extended the logging so that the cause shows up in the log. Please update fhemsync to 2.7.3. If it happens again, we will know more next time :)
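The "longpoll end: retry in: 30000msec" lines correspond to a reconnect loop: when the longpoll connection to FHEMWEB ends (normally, or because FHEM is blocked), the client waits and reopens it. A minimal sketch of such a loop, assuming a generic `openLongpoll` function that resolves when the connection ends (names and structure are mine, not fhemsync's actual code):

```javascript
// Reconnect loop mirroring the "longpoll end: retry in: 30000msec"
// log lines. openLongpoll is an assumed placeholder returning a
// Promise that settles when the connection ends; wait/log are
// injectable so the loop can be exercised without real timers.
const RETRY_MS = 30000;

async function longpollLoop(openLongpoll, {
  retries = Infinity,
  retryMs = RETRY_MS,
  wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms)),
  log = console.log,
} = {}) {
  for (let i = 0; i < retries; i++) {
    try {
      await openLongpoll(); // streams events until the server closes
    } catch (err) {
      log(`longpoll error: ${err.message}`);
    }
    log(`longpoll end: retry in: ${retryMs}msec`);
    await wait(retryMs);
  }
}
```

With a fixed 30 s delay, a blocked FHEM simply produces one of these lines every half minute until the connection can be re-established, which matches the rhythm visible in the log above.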
Hey Dominik,
as mentioned before, Slave1 is the RPi that polls my 1-Wire sensors.
It does not yet run the bus polling asynchronously, so the system always stalls until everything is done.
I haven't felt like trying that again yet.
So for me this is indeed "normal". After polling the 1-Wire bus, the system continues running.
On the slave the data is written to the log.
I just see in the FHEM log that I did an update with shutdown around 15:00.
That is presumably why FHEMSync started up again afterwards.
Okay, I'll update FHEMSync then. I had to update my zigbee2mqtt today, since apparently the FHEMSync installation made something too new for zigbee2mqtt :o
It's running again though.
I'll report back ;)
Morning Dominik,
I'm back sooner than expected :(
[20.4.2020, 06:26:43] [MAIN ] Monitoring remote device: BMP180
[20.4.2020, 06:27:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:27:08] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:27:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:27:38] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:28:05] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:28:08] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:28:35] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:28:38] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:28:57] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[20.4.2020, 06:28:57] [MAIN ] Monitoring remote device: FIRMATA
[20.4.2020, 06:28:58] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[20.4.2020, 06:28:58] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[20.4.2020, 06:28:58] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[20.4.2020, 06:28:59] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[20.4.2020, 06:28:59] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[20.4.2020, 06:28:59] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[20.4.2020, 06:28:59] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[20.4.2020, 06:29:00] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[20.4.2020, 06:29:00] [MAIN ] Monitoring remote device: Luefter
[20.4.2020, 06:29:00] [MAIN ] Monitoring remote device: NAFT.002
[20.4.2020, 06:29:01] [MAIN ] Monitoring remote device: NAVOC.002
[20.4.2020, 06:29:01] [MAIN ] Monitoring remote device: TA_28_736020050000
[20.4.2020, 06:29:01] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[20.4.2020, 06:29:02] [MAIN ] Monitoring remote device: UKW_12V
[20.4.2020, 06:29:02] [MAIN ] Monitoring remote device: UKW_ON
[20.4.2020, 06:29:10] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:29:10] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:29:40] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:29:40] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:30:10] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:30:10] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:30:40] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:30:40] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:31:10] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:31:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:31:40] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:31:44] [MAIN ] Monitoring remote device: BMP180
[20.4.2020, 06:32:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:32:11] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:32:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:32:41] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:33:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:33:11] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:33:35] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:33:41] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:34:04] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[20.4.2020, 06:34:04] [MAIN ] Monitoring remote device: FIRMATA
[20.4.2020, 06:34:04] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[20.4.2020, 06:34:05] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[20.4.2020, 06:34:05] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[20.4.2020, 06:34:05] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[20.4.2020, 06:34:06] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[20.4.2020, 06:34:06] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[20.4.2020, 06:34:06] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[20.4.2020, 06:34:06] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[20.4.2020, 06:34:07] [MAIN ] Monitoring remote device: Luefter
[20.4.2020, 06:34:07] [MAIN ] Monitoring remote device: NAFT.002
[20.4.2020, 06:34:07] [MAIN ] Monitoring remote device: NAVOC.002
[20.4.2020, 06:34:08] [MAIN ] Monitoring remote device: TA_28_736020050000
[20.4.2020, 06:34:08] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[20.4.2020, 06:34:08] [MAIN ] Monitoring remote device: UKW_12V
[20.4.2020, 06:34:09] [MAIN ] Monitoring remote device: UKW_ON
[20.4.2020, 06:34:09] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:34:16] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:34:39] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:34:46] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:35:09] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:35:16] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:35:39] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:35:46] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:36:16] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:36:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:36:45] [MASTER ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:36:45] [SLAVE2 ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:36:46] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:37:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:37:16] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:37:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:37:46] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:38:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:38:16] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:38:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:38:47] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:39:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:39:09] [MASTER ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:39:10] [SLAVE1 ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:39:17] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:39:35] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:39:47] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:40:05] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:40:17] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:40:35] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:40:47] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:41:17] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:41:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:41:46] [MASTER ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:41:46] [SLAVE2 ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:41:47] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:42:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:42:17] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:42:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:42:47] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:43:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:43:17] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:43:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:43:47] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:44:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:44:10] [MASTER ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:44:11] [SLAVE1 ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:44:17] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:44:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:44:47] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:45:05] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:45:17] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:45:35] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:45:47] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:46:17] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:46:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:46:47] [MASTER ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:46:47] [SLAVE2 ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:46:47] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:47:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:47:18] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:47:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:47:48] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:48:04] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:48:18] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:48:34] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:48:48] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:49:05] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:49:11] [MASTER ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:49:12] [SLAVE1 ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[20.4.2020, 06:49:18] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:49:35] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:49:48] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 06:50:05] [SLAVE1 ] longpoll end: retry in: 30000msec
....
Is my memory filling up?
According to the memory-usage stats: "RAM usage Total: 923.23, Min: 289.98, Max: 579.86, Current: 579.86"
After a set fhemsync restart:
....
[20.4.2020, 20:14:36] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 20:14:36] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 20:15:06] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 20:15:06] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 20:15:37] [SLAVE1 ] longpoll end: retry in: 30000msec
[20.4.2020, 20:15:37] [SLAVE2 ] longpoll end: retry in: 30000msec
[20.4.2020, 20:16:12] [MAIN ] Starting FHEMSync version 2.7.3...
[20.4.2020, 20:16:12] [MAIN ] Options: {"version":"2.7.3","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[20.4.2020, 20:16:35] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:35] [MAIN ] Monitoring remote device: BMP180
[20.4.2020, 20:16:35] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:35] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:35] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:35] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:36] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:36] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:36] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:36] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:37] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:37] [SLAVE1 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:37] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:37] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:37] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:37] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:38] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:38] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:38] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:38] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:38] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:39] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:39] [SLAVE1 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:39] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: FIRMATA
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: Luefter
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: NAFT.002
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: NAVOC.002
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: TA_28_736020050000
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: UKW_12V
[20.4.2020, 20:16:39] [MAIN ] Monitoring remote device: UKW_ON
[20.4.2020, 20:16:39] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:39] [SLAVE2 ] longpoll end: retry in: 200msec
[20.4.2020, 20:16:40] [SLAVE2 ] longpoll end: retry in: 200msec
Now data is coming through again. Before that, I no longer saw anything in the event monitor.
Regards, Gerd
Thanks for the log, that helps me a lot! I must have a loop somewhere in the code that lets the memory fill up. I'll take a closer look.
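For illustration only (the actual fhemsync source is not shown here, so this is an assumption about the failure mode): one common way a Node.js loop produces the "RangeError: Maximum call stack size exceeded" from the log is when each polling cycle calls the next one synchronously, so the call stack never unwinds. Scheduling the next cycle through the event loop avoids that. The function names `pollRecursive` and `pollScheduled` are hypothetical.

```javascript
"use strict";

// Hypothetical sketch, NOT the fhemsync code: a retry loop that overflows
// the stack vs. one that does not.

// Buggy pattern: every cycle adds a stack frame; deep enough recursion
// throws "RangeError: Maximum call stack size exceeded".
function pollRecursive(depth) {
  if (depth <= 0) return;
  pollRecursive(depth - 1); // synchronous self-call, one frame per cycle
}

// Safe pattern: the next cycle is queued on the event loop, so the stack
// unwinds completely between cycles and the cycle count is unbounded.
function pollScheduled(state, done) {
  if (state.remaining-- <= 0) return done();
  setImmediate(() => pollScheduled(state, done));
}

let overflowed = false;
try {
  pollRecursive(1e6); // far beyond V8's default stack depth
} catch (e) {
  overflowed = e instanceof RangeError;
}

pollScheduled({ remaining: 100000 }, () =>
  console.log("scheduled loop finished, overflowed =", overflowed));
```

The same symptom fits the log: the process keeps running, but once the leak or recursion passes a threshold, each device fetch dies with the RangeError until a restart resets the state.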
Hi Dominik,
I only recently enabled logging of RAM usage,
because my Pi presumably suffers from that annoying problem of old Perl versions and memory filling up.
The attachment also shows that after a set fhemsync restart there is more free memory again.
Good luck with the hunt. From what I've read and understood, Perl does not necessarily release memory it has claimed once.
At least not in older Perl versions.
Thanks, and see you later.
Gerd
Hello Dominik,
how are things? Found the memory problem?
As soon as memory consumption reaches a certain level, FHEMSync stops working.
If I run a set fhemsync restart, it works until the next increase.
Regards, Gerd
Hi,
not yet. Yesterday I also noticed that fhemsync had been restarted on my system because it had consumed too much memory. I'll get to it later this week.
Hi,
just a quick note that I'm still hunting for the bug. I can reproduce the crash daily on my setup and am now drilling down to where exactly the memory leak is.
Morning Dominik,
that's good news ;=)
How can it be that FHEM keeps running while only FHEMSync stops working?
When I restart FHEMSync, it carries on again.
Is this also one of those tiresome memory problems of old Perl versions?
Regards, Gerd
So... finally found and fixed. Update to fhemsync 2.8.0; there should be no more crashes now.
Thanks,
just installed it.
Let's see what the day brings ;=)
Good night, Gerd
Unfortunately, that was no good ?!
It starts and stops again!
[9.5.2020, 00:29:39] [MAIN ] Starting FHEMSync version 2.8.0...
[9.5.2020, 00:29:39] [MAIN ] Options: {"version":"2.8.0","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[9.5.2020, 00:29:41] [SLAVE1 ] longpoll end: retry in: 200msec
/usr/lib/node_modules/fhemsync/fhemsync.js:257
setTimeout(this.startLongpoll().bind(this), timeout);
^
TypeError: this.startLongpoll is not a function
at Request.<anonymous> (/usr/lib/node_modules/fhemsync/fhemsync.js:257:21)
at Request.emit (events.js:327:22)
at IncomingMessage.<anonymous> (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:1076:12)
at Object.onceWrapper (events.js:421:28)
at IncomingMessage.emit (events.js:327:22)
at endReadableNT (_stream_readable.js:1201:12)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
[9.5.2020, 00:30:04] [MAIN ] Starting FHEMSync version 2.8.0...
[9.5.2020, 00:30:04] [MAIN ] Options: {"version":"2.8.0","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[9.5.2020, 00:30:12] [SLAVE1 ] longpoll end: retry in: 200msec
/usr/lib/node_modules/fhemsync/fhemsync.js:257
setTimeout(this.startLongpoll().bind(this), timeout);
^
TypeError: this.startLongpoll is not a function
at Request.<anonymous> (/usr/lib/node_modules/fhemsync/fhemsync.js:257:21)
at Request.emit (events.js:327:22)
at IncomingMessage.<anonymous> (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:1076:12)
at Object.onceWrapper (events.js:421:28)
at IncomingMessage.emit (events.js:327:22)
at endReadableNT (_stream_readable.js:1201:12)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
....
(the same start/crash cycle with the identical TypeError repeats every 20-30 seconds)
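The repeated TypeError in the trace points at the `setTimeout(this.startLongpoll().bind(this), timeout)` line: inside a plain `function` callback registered on the request object, `this` is no longer the fhemsync instance, so `this.startLongpoll` is undefined and calling it throws; passing a function reference with the right binding fixes both problems. A minimal reproduction of the failure mode and the usual fix (the `Poller` class and its fields are hypothetical, not the actual fhemsync code):

```javascript
// Hypothetical reproduction of "TypeError: this.startLongpoll is not a
// function" and the usual fix. Not the real fhemsync code.
class Poller {
  constructor() { this.polls = 0; }
  startLongpoll() { this.polls += 1; }

  // Buggy: the callback is a plain function, so when the emitter invokes
  // it, `this` is the emitter (simulated here with {}), on which
  // startLongpoll is undefined -> calling it throws a TypeError.
  // (.bind on the call's return value would never help either.)
  scheduleBuggy(onError) {
    const callback = function (timeout) {
      try {
        setTimeout(this.startLongpoll().bind(this), timeout);
      } catch (e) {
        onError(e);
      }
    };
    callback.call({}, 200); // simulate the Request emitter as receiver
  }

  // Fixed: an arrow function keeps the lexical `this`, and setTimeout
  // receives a function reference, not the result of a call.
  scheduleFixed(timeout) {
    setTimeout(() => this.startLongpoll(), timeout);
  }
}

const p = new Poller();
let caught = null;
p.scheduleBuggy((e) => { caught = e; });
console.log(caught instanceof TypeError); // same failure as in the log
p.scheduleFixed(0); // polls is incremented once the timer fires
```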
Quote:
2020-05-09 00:32:58 FHEMSYNC fhemsync fhemsync: running /usr/bin/fhemsync
2020-05-09 00:33:00 FHEMSYNC fhemsync fhemsync: stopped
2020-05-09 00:33:20 FHEMSYNC fhemsync fhemsync: running /usr/bin/fhemsync
2020-05-09 00:33:22 FHEMSYNC fhemsync fhemsync: stopped
The modules themselves haven't changed, though ?!
Update 2.8.1. The modules stayed the same.
I think you mentioned before that your remote FHEM sometimes hangs for a minute, right? Because when those "longpoll end" messages show up, that is more likely the error case.
Yes, exactly.
I'll try it after breakfast, if I find the time ;)
Morning Dominik,
so, with some delay, I found the time ;)
Should I do this:
╭────────────────────────────────────────────────────────────────╮
│ │
│ New patch version of npm available! 6.14.4 → 6.14.5 │
│ Changelog: https://github.com/npm/cli/releases/tag/v6.14.5 │
│ Run npm install -g npm to update! │
│ │
╰────────────────────────────────────────────────────────────────╯
Or will something go wrong again ;)?
FHEMSync seems to be working again now, thank you! ;D
[10.5.2020, 11:43:23] [MAIN ] Starting FHEMSync version 2.8.1...
[10.5.2020, 11:43:23] [MAIN ] Options: {"version":"2.8.1","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[10.5.2020, 11:43:24] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:24] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: BMP180
[10.5.2020, 11:43:24] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:24] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: FIRMATA
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: Luefter
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: NAFT.002
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: NAVOC.002
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: TA_28_736020050000
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: UKW_12V
[10.5.2020, 11:43:24] [MAIN ] Monitoring remote device: UKW_ON
[10.5.2020, 11:43:25] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:25] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:25] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:25] [SLAVE1 ] longpoll end: retry in: 200msec
....
[10.5.2020, 11:43:42] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:43] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:43] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:43] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:43] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:43] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:43] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:43] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:43] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:43] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:44] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:44] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:44] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:44] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:44] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:44] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:44] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:44] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:44] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:45] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:45] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:45] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:45] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:45] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:45] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:45] [SLAVE1 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:45] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:45] [SLAVE1 ] longpoll end: retry in: 30000msec
[10.5.2020, 11:43:45] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:46] [SLAVE2 ] longpoll end: retry in: 200msec
[10.5.2020, 11:43:46] [SLAVE2 ] longpoll end: retry in: 30000msec
[10.5.2020, 11:44:15] [SLAVE1 ] longpoll end: retry in: 30000msec
[10.5.2020, 11:44:16] [SLAVE2 ] longpoll end: retry in: 30000msec
[10.5.2020, 11:44:46] [SLAVE2 ] longpoll end: retry in: 30000msec
[10.5.2020, 11:45:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[10.5.2020, 11:45:16] [SLAVE2 ] longpoll end: retry in: 30000msec
It might help if the slave number also appeared in the [MAIN ] log entries. Then you could see right away which slave the data belongs to.
Now let's see what the memory usage says.
Have a nice Sunday.
Regards, Gerd
Memory usage looks better now too.
You can run the npm update whenever you like. That message comes directly from npm, which automatically checks for a newer version on every run.
I can adjust the log so that it shows more clearly which slave a device comes from.
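A per-slave log prefix could be sketched roughly like this. This is a minimal illustration based only on the "[SLAVE1 ] message" format visible in the logs above; `formatLine`, `makeLogger`, and the 7-character pad width are assumptions, not the actual fhemsync implementation:

```javascript
// Minimal sketch, assuming the "[TAG    ] message" column format from the logs.
// formatLine and the pad width are illustrative, not fhemsync's real code.
function formatLine(tag, msg) {
  return '[' + tag.padEnd(7) + '] ' + msg;   // pad so the columns line up
}

function makeLogger(tag) {
  // Each slave connection gets its own logger with a fixed tag,
  // so every line shows which slave it belongs to.
  return (msg) => console.log(formatLine(tag, msg));
}

const logSlave1 = makeLogger('SLAVE1');
logSlave1('Monitoring remote device: BMP180');
// prints: [SLAVE1 ] Monitoring remote device: BMP180
```

MAIN-level messages about a remote device could then reuse the originating slave's tag instead of a generic `[MAIN ]`, which is exactly the improvement suggested earlier in the thread.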
Hello Dominik,
unfortunately there were errors again today at 2 a.m.
I can't manage to paste them in here from my mobile phone >:(
I'll have to wait until tomorrow... >:(
Regards
So, this works better on the PC ::)
[11.5.2020, 01:53:18] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[11.5.2020, 01:53:18] [MAIN ] Monitoring remote device: FIRMATA
[11.5.2020, 01:53:19] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[11.5.2020, 01:53:19] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[11.5.2020, 01:53:19] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[11.5.2020, 01:53:19] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[11.5.2020, 01:53:19] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[11.5.2020, 01:53:19] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[11.5.2020, 01:53:19] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[11.5.2020, 01:53:19] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[11.5.2020, 01:53:19] [MAIN ] Monitoring remote device: Luefter
[11.5.2020, 01:53:20] [MAIN ] Monitoring remote device: NAFT.002
[11.5.2020, 01:53:20] [MAIN ] Monitoring remote device: NAVOC.002
[11.5.2020, 01:53:20] [MAIN ] Monitoring remote device: TA_28_736020050000
[11.5.2020, 01:53:20] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[11.5.2020, 01:53:20] [MAIN ] Monitoring remote device: UKW_12V
[11.5.2020, 01:53:20] [MAIN ] Monitoring remote device: UKW_ON
(node:2892) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Caseless.set (/usr/lib/node_modules/fhemsync/node_modules/caseless/index.js:11:20)
at Request.resp.setHeader (/usr/lib/node_modules/fhemsync/node_modules/caseless/index.js:54:14)
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:387:10)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
(node:2892) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 3)
(node:2892) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
(node:2892) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Caseless.set (/usr/lib/node_modules/fhemsync/node_modules/caseless/index.js:11:20)
at Request.resp.setHeader (/usr/lib/node_modules/fhemsync/node_modules/caseless/index.js:54:14)
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:290:10)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
(node:2892) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 4)
[11.5.2020, 01:53:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:53:43] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:54:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:54:13] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:54:27] [MAIN ] Monitoring remote device: BMP180
[11.5.2020, 01:54:43] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:55:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:55:13] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:55:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:55:43] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:56:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:56:13] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:56:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:56:43] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:57:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:57:14] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:57:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:57:44] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:58:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:58:14] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:58:21] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[11.5.2020, 01:58:21] [MAIN ] Monitoring remote device: FIRMATA
[11.5.2020, 01:58:21] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[11.5.2020, 01:58:21] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[11.5.2020, 01:58:21] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[11.5.2020, 01:58:21] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[11.5.2020, 01:58:22] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[11.5.2020, 01:58:22] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[11.5.2020, 01:58:22] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[11.5.2020, 01:58:22] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[11.5.2020, 01:58:22] [MAIN ] Monitoring remote device: Luefter
[11.5.2020, 01:58:22] [MAIN ] Monitoring remote device: NAFT.002
[11.5.2020, 01:58:22] [MAIN ] Monitoring remote device: NAVOC.002
[11.5.2020, 01:58:22] [MAIN ] Monitoring remote device: TA_28_736020050000
[11.5.2020, 01:58:22] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[11.5.2020, 01:58:23] [MAIN ] Monitoring remote device: UKW_12V
[11.5.2020, 01:58:23] [MAIN ] Monitoring remote device: UKW_ON
[11.5.2020, 01:58:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:58:44] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:59:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:59:14] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 01:59:27] [MAIN ] Monitoring remote device: BMP180
[11.5.2020, 01:59:44] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:00:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:00:14] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:00:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:00:44] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:01:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:01:14] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:01:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:01:44] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:02:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:02:15] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:02:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:02:45] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:03:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:03:15] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:03:24] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[11.5.2020, 02:03:24] [MAIN ] Monitoring remote device: FIRMATA
[11.5.2020, 02:03:24] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[11.5.2020, 02:03:24] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[11.5.2020, 02:03:24] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[11.5.2020, 02:03:24] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[11.5.2020, 02:03:24] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[11.5.2020, 02:03:24] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[11.5.2020, 02:03:25] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[11.5.2020, 02:03:25] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[11.5.2020, 02:03:25] [MAIN ] Monitoring remote device: Luefter
[11.5.2020, 02:03:25] [MAIN ] Monitoring remote device: NAFT.002
[11.5.2020, 02:03:25] [MAIN ] Monitoring remote device: NAVOC.002
[11.5.2020, 02:03:25] [MAIN ] Monitoring remote device: TA_28_736020050000
[11.5.2020, 02:03:25] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[11.5.2020, 02:03:25] [MAIN ] Monitoring remote device: UKW_12V
[11.5.2020, 02:03:25] [MAIN ] Monitoring remote device: UKW_ON
[11.5.2020, 02:03:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:03:45] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:04:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:04:15] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:04:28] [MAIN ] Monitoring remote device: BMP180
[11.5.2020, 02:04:45] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:05:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:05:15] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:05:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:05:45] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:06:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:06:15] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:06:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:06:45] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:07:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:07:16] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:07:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:07:46] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:08:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:08:16] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:08:26] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[11.5.2020, 02:08:26] [MAIN ] Monitoring remote device: FIRMATA
[11.5.2020, 02:08:26] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[11.5.2020, 02:08:26] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[11.5.2020, 02:08:27] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[11.5.2020, 02:08:27] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[11.5.2020, 02:08:27] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[11.5.2020, 02:08:27] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[11.5.2020, 02:08:27] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[11.5.2020, 02:08:27] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[11.5.2020, 02:08:27] [MAIN ] Monitoring remote device: Luefter
[11.5.2020, 02:08:27] [MAIN ] Monitoring remote device: NAFT.002
[11.5.2020, 02:08:27] [MAIN ] Monitoring remote device: NAVOC.002
[11.5.2020, 02:08:28] [MAIN ] Monitoring remote device: TA_28_736020050000
[11.5.2020, 02:08:28] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[11.5.2020, 02:08:28] [MAIN ] Monitoring remote device: UKW_12V
[11.5.2020, 02:08:28] [MAIN ] Monitoring remote device: UKW_ON
(node:2892) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:168:12)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
(node:2892) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 14)
(node:2892) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:168:12)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
(node:2892) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 15)
(node:2892) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:168:12)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
(node:2892) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 16)
(node:2892) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:168:12)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
(node:2892) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 17)
(node:2892) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:168:12)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
(node:2892) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 18)
(node:2892) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:168:12)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
(node:2892) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 19)
(node:2892) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Request.init (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:168:12)
at Request.RP$initInterceptor [as init] (/usr/lib/node_modules/fhemsync/node_modules/request-promise-core/configure/request2.js:45:29)
at new Request (/usr/lib/node_modules/fhemsync/node_modules/request/request.js:127:8)
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:53:10)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
    [frame repeated 18 times]
(node:2892) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 20)
[the same RangeError trace and UnhandledPromiseRejectionWarning repeat for rejection ids 21, 22, 28-32 and 36-38]
[11.5.2020, 02:08:42] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:08:46] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:09:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:09:16] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 02:09:28] [MASTER ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[11.5.2020, 02:09:28] [SLAVE2 ] Failed to fetch devices: RangeError: Maximum call stack size exceeded
[11.5.2020, 02:09:46] [SLAVE2 ] longpoll end: retry in: 30000msec
.
.
After a restart via FHEMWEB:
[11.5.2020, 23:11:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[11.5.2020, 23:11:22] [SLAVE2 ] longpoll end: retry in: 30000msec
[11.5.2020, 23:11:35] [MAIN ] Starting FHEMSync version 2.8.1...
[11.5.2020, 23:11:35] [MAIN ] Options: {"version":"2.8.1","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[11.5.2020, 23:11:36] [SLAVE1 ] longpoll end: retry in: 200msec
[11.5.2020, 23:11:36] [SLAVE2 ] longpoll end: retry in: 200msec
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: BMP180
[11.5.2020, 23:11:36] [SLAVE2 ] longpoll end: retry in: 200msec
[11.5.2020, 23:11:36] [SLAVE1 ] longpoll end: retry in: 200msec
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: FIRMATA
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: Luefter
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: NAFT.002
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: NAVOC.002
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: TA_28_736020050000
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: UKW_12V
[11.5.2020, 23:11:36] [MAIN ] Monitoring remote device: UKW_ON
[11.5.2020, 23:11:36] [SLAVE2 ] longpoll end: retry in: 200msec
.
.
[12.5.2020, 13:21:12] [SLAVE1 ] longpoll end: retry in: 30000msec
[12.5.2020, 13:21:17] [SLAVE2 ] longpoll end: retry in: 30000msec
[12.5.2020, 13:21:23] [MAIN ] Monitoring remote device: FA_26_A2D984000007
[12.5.2020, 13:21:23] [MAIN ] Monitoring remote device: FIRMATA
[12.5.2020, 13:21:23] [MAIN ] Monitoring remote device: FI_26_A3D984001605
[12.5.2020, 13:21:24] [MAIN ] Monitoring remote device: HI_28_A2D984001677
[12.5.2020, 13:21:24] [MAIN ] Monitoring remote device: KH_28_FF5A50811605
[12.5.2020, 13:21:24] [MAIN ] Monitoring remote device: KH_28_FF715C811603
[12.5.2020, 13:21:24] [MAIN ] Monitoring remote device: KH_28_FF8E8C811603
[12.5.2020, 13:21:24] [MAIN ] Monitoring remote device: KH_28_FF976C811605
[12.5.2020, 13:21:24] [MAIN ] Monitoring remote device: KH_28_FFA45D811604
[12.5.2020, 13:21:24] [MAIN ] Monitoring remote device: LI_28_A3D984001605
[12.5.2020, 13:21:24] [MAIN ] Monitoring remote device: Luefter
[12.5.2020, 13:21:24] [MAIN ] Monitoring remote device: NAFT.002
[12.5.2020, 13:21:25] [MAIN ] Monitoring remote device: NAVOC.002
[12.5.2020, 13:21:25] [MAIN ] Monitoring remote device: TA_28_736020050000
[12.5.2020, 13:21:25] [MAIN ] Monitoring remote device: TA_28_FF313C4E0400
[12.5.2020, 13:21:25] [MAIN ] Monitoring remote device: UKW_12V
[12.5.2020, 13:21:25] [MAIN ] Monitoring remote device: UKW_ON
(node:30527) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at request (/usr/lib/node_modules/fhemsync/node_modules/request/index.js:47:16)
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/index.js:100:12
at /usr/lib/node_modules/fhemsync/node_modules/request/
OK, this works better from the PC ::)
But the forum apparently has trouble displaying all of it correctly.
I'm attaching the logs as files.
I tried several times yesterday to select and paste it, but the copied text always ended up containing something else.
FHEMSync then hung itself again today.
Memory doesn't look bad either, see the picture.
Regards, Gerd
Edit: removed the logs and attached them as a file.
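A side note on the UnhandledPromiseRejectionWarning flood in the attached logs: Node only prints that warning for a rejected promise that has no `.catch()`. A generic Node.js sketch (not the fhemsync source; `fetchDevices` is a made-up stand-in for the failing HTTP call) of how an explicit handler turns the failure into a logged, recoverable condition:

```javascript
// Generic Node.js sketch (not the fhemsync source): an async step that
// throws without a .catch() only produces UnhandledPromiseRejectionWarning;
// an explicit handler turns it into a logged, recoverable condition.
async function fetchDevices() {
  // stand-in for the failing HTTP call seen in the logs
  throw new RangeError('Maximum call stack size exceeded');
}

let lastError = null;

// fetchDevices();  // like this, Node would only print the warning

const outcome = fetchDevices().catch((e) => {
  lastError = e.message; // becomes "Failed to fetch devices: ..." in the log
  return 'retry';        // let the caller schedule the 30000msec retry
});
```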
That is odd. Are you using auth? At least judging by the source code of request, the error could lie there.
I have just made a change -> version 2.8.2
Can you check soon whether it runs for you? Thanks for your extensive testing!
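For context on the trace shape: when every stack frame points at the same source line (request/index.js:100:12), the usual cause is unbounded synchronous recursion, i.e. a function that wraps or re-invokes itself on every call. A standalone sketch of that pattern (not the actual request source):

```javascript
// Standalone sketch (not the actual request source): re-wrapping a
// function in itself adds one identical stack frame per level, so the
// trace shows the same source line over and over until V8 gives up.
function wrap(inner) {
  return function request(opts) {
    return inner(opts); // every frame points at this same line
  };
}

let request = (opts) => opts;
for (let i = 0; i < 100000; i++) {
  request = wrap(request); // hypothetical runaway re-initialization
}

let errName = null;
try {
  request({});
} catch (e) {
  errName = e.constructor.name; // RangeError: Maximum call stack size exceeded
}
```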
auth? Hm...
Do you mean basicauth?
Not that I know of.
I only have a password at login.
But that has worked so far.
Quote from: dominik on 12 May 2020, 20:36:36
I have just made a change -> version 2.8.2
Can you check soon whether it runs for you? Thanks for your extensive testing!
Yes, the laptop is on next to the couch ;D
Thanks likewise
[12.5.2020, 20:41:27] [MAIN ] Options: {"version":"2.8.2","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[12.5.2020, 20:41:27] [SLAVE1 ] Failed to get CSRF-Token: StatusCodeError: 401 - undefined
[12.5.2020, 20:41:27] [SLAVE2 ] Failed to get CSRF-Token: StatusCodeError: 401 - undefined
[12.5.2020, 20:41:28] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 20:41:28] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 20:41:28] [SLAVE1 ] Failed to fetch devices: StatusCodeError: 401 - undefined
[12.5.2020, 20:41:28] [SLAVE2 ] Failed to fetch devices: StatusCodeError: 401 - undefined
[12.5.2020, 20:41:28] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 20:41:28] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 20:41:28] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 20:41:28] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 20:41:28] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 20:41:28] [SLAVE2 ] longpoll end: retry in: 200msec
Thanks, I'll take a look right away...
Please run one more test, with 2.8.3.
[12.5.2020, 20:43:49] [SLAVE2 ] longpoll end: retry in: 30000msec
[12.5.2020, 20:57:47] [MAIN ] Starting FHEMSync version 2.8.3...
[12.5.2020, 20:57:47] [MAIN ] Options: {"version":"2.8.3","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
(node:15176) UnhandledPromiseRejectionWarning: TypeError: this.request is not a function
at FHEM.execute (/usr/lib/node_modules/fhemsync/fhemsync.js:337:24)
at FHEM.getFHEMSyncDevice (/usr/lib/node_modules/fhemsync/fhemsync.js:348:24)
at main (/usr/lib/node_modules/fhemsync/fhemsync.js:489:41)
(node:15176) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 2)
(node:15176) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
[12.5.2020, 20:58:08] [MAIN ] Starting FHEMSync version 2.8.3...
[12.5.2020, 20:58:08] [MAIN ] Options: {"version":"2.8.3","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
(node:15199) UnhandledPromiseRejectionWarning: TypeError: this.request is not a function
at FHEM.execute (/usr/lib/node_modules/fhemsync/fhemsync.js:337:24)
at FHEM.getFHEMSyncDevice (/usr/lib/node_modules/fhemsync/fhemsync.js:348:24)
at main (/usr/lib/node_modules/fhemsync/fhemsync.js:489:41)
(node:15199) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 2)
(node:15199) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
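For what it's worth, the "TypeError: this.request is not a function" above is the classic symptom of a method being invoked with the wrong `this`. A minimal sketch (a hypothetical FHEM class for illustration, not the actual fhemsync.js code) that reproduces the message:

```javascript
// Hypothetical stand-in for the FHEM class in fhemsync.js.
class FHEM {
  request(cmd) {
    return `requested: ${cmd}`;
  }
  execute(cmd) {
    // Works only while `this` still points at a FHEM instance.
    return this.request(cmd);
  }
}

const fhem = new FHEM();

// Copying the method onto an object that lacks request() loses the binding:
const broken = { execute: fhem.execute };
let message = "";
try {
  broken.execute("jsonlist2");
} catch (e) {
  message = e.message; // "this.request is not a function"
}

// bind() (or always calling via the instance) restores it:
const bound = fhem.execute.bind(fhem);
console.log(message, bound("jsonlist2"));
```

That would explain why a pure version bump (2.8.4) fixed it: the call site presumably just needed to keep the method attached to its object.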
Please try 2.8.4 as well, then it should work :)
[12.5.2020, 21:25:45] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:46] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:47] [MAIN ] Starting FHEMSync version 2.8.4...
[12.5.2020, 21:25:47] [MAIN ] Options: {"version":"2.8.4","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[12.5.2020, 21:25:47] [SLAVE1 ] Failed to get CSRF-Token: StatusCodeError: 401 - undefined
[12.5.2020, 21:25:47] [SLAVE2 ] Failed to get CSRF-Token: StatusCodeError: 401 - undefined
[12.5.2020, 21:25:48] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:48] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:48] [SLAVE1 ] Failed to fetch devices: StatusCodeError: 401 - undefined
[12.5.2020, 21:25:48] [SLAVE2 ] Failed to fetch devices: StatusCodeError: 401 - undefined
[12.5.2020, 21:25:48] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:48] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:48] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:48] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:49] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:49] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:49] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:49] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:49] [SLAVE1 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:49] [SLAVE2 ] longpoll end: retry in: 200msec
[12.5.2020, 21:25:49] [SLAVE1 ] longpoll end: retry in: 200msec
In case it's important and I've misconfigured something:
defmod fhemsync FHEMSYNC
attr fhemsync FHEMSync-auth crypt:550058174d500976666666309490206
attr fhemsync FHEMSync-log ./log/fhemsync-%Y-%m-%d.log
attr fhemsync FHEMSync-port 8083
attr fhemsync FHEMSync-selfsignedcert true
attr fhemsync FHEMSync-server 127.0.0.1
attr fhemsync FHEMSync-ssl true
attr fhemsync FHEMSync-webname fhem
attr fhemsync devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
attr fhemsync nrarchive 10
attr fhemsync remote-server 192.168.xxx.30
attr fhemsync remote2-server 192.168.xxx.32
attr fhemsync room FHEMSync
attr fhemsync stateFormat fhemsync
attr fhemsync verbose 0
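The 401 responses suggest the remote FHEMWEB instances require HTTP Basic auth, while the config above sets no remote-auth / remote2-auth. As a rough sketch (a hypothetical helper, not the actual fhemsync.js internals) of how a `username:password` value becomes the Authorization header such requests need:

```javascript
// Hypothetical helper: turn a "username:password" auth value (as in
// `attr fhemsync remote-auth username:password`) into an HTTP Basic auth header.
function basicAuthHeader(auth) {
  return "Basic " + Buffer.from(auth).toString("base64");
}

// Without this header, FHEMWEB answers 401 before it can return the
// X-FHEM-csrfToken header, so every subsequent call fails as well.
const headers = {
  Authorization: basicAuthHeader("username:password"),
  Accept: "application/json",
};
console.log(headers.Authorization); // → Basic dXNlcm5hbWU6cGFzc3dvcmQ=
```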
That's interesting; based on the log I thought the remote servers require auth. Are your remote servers really reachable without auth, and only the main server requires it?
Well, I do have a user and password for logging in on both remotes.
Since it has worked (so far), I assumed everything was configured correctly :)
Could you please add a user:auth entry on both of them and then test again? Do you use the same user and password there as for the master? That would explain why it worked so far.
//Edit: And don't forget ssl and selfsignedcert, in case the remotes use those as well.
Quote from: dominik on 12 May 2020, 21:49:41
Could you please add a user:auth entry on both of them and then test again? Do you use the same user and password there as for the master? That would explain why it worked so far.
//Edit: And don't forget ssl and selfsignedcert, in case the remotes use those as well.
Hm... I haven't done much about security so far. I have entered allowed IPs.
User and password are the same on all three machines.
Where does the "user:auth" go? Into the WEB module?
"ssl and selfsignedcert" doesn't ring a bell for me either?
Time is getting short now, too. I'm off to bed. I can't continue until tomorrow evening.
Regards, Gerd
attr fhemsync remote-auth username:password
attr fhemsync remote-ssl true
attr fhemsync remote-selfsignedcert true
attr fhemsync remote2-auth username:password
attr fhemsync remote2-ssl true
attr fhemsync remote2-selfsignedcert true
That should do it. No problem, tomorrow is fine. Thanks!
OK, thanks.
Tomorrow then, if nothing comes up.
Good night
Hello Dominik,
I entered that yesterday evening. But without success.
I don't have the energy for more today.
Until tomorrow evening.
Regards, Gerd
Hello Dominik,
today I have some time again. Unfortunately, nothing works anymore at all :(
[14.5.2020, 20:14:01] [MAIN ] Starting FHEMSync version 2.8.4...
[14.5.2020, 20:14:01] [MAIN ] Options: {"version":"2.8.4","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[14.5.2020, 20:17:05] [MAIN ] Starting FHEMSync version 2.8.4...
[14.5.2020, 20:17:06] [MAIN ] Options: {"version":"2.8.4","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[14.5.2020, 20:20:28] [MAIN ] Starting FHEMSync version 2.8.4...
[14.5.2020, 20:20:29] [MAIN ] Options: {"version":"2.8.4","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[14.5.2020, 20:20:47] [SLAVE1 ] Failed to get CSRF-Token: RequestError: Error: Client network socket disconnected before secure TLS connection was established
The last line with SLAVE1 only appeared after I did an update and restart of FHEM today.
Other than that, no log entries appear at all?!
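The TLS error ("Client network socket disconnected before secure TLS connection was established") usually indicates a scheme mismatch: the client attempts HTTPS against a FHEMWEB that only serves plain HTTP, or vice versa. A minimal sketch (hypothetical helper; the attribute-to-URL mapping is an assumption, not the verified fhemsync.js code) of how the ssl attribute would select the scheme:

```javascript
// Hypothetical mapping of the FHEMSync-*/remote-* attributes to a base URL.
function fhemUrl({ server, port = 8083, webname = "fhem", ssl = false }) {
  const scheme = ssl ? "https" : "http";
  return `${scheme}://${server}:${port}/${webname}`;
}

// With ssl=true against a plain-HTTP FHEMWEB, the TLS handshake fails;
// the ssl/selfsignedcert attributes must match what the server actually offers.
console.log(fhemUrl({ server: "127.0.0.1" }));                 // → http://127.0.0.1:8083/fhem
console.log(fhemUrl({ server: "192.168.0.61", ssl: true }));   // → https://192.168.0.61:8083/fhem
```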
Here is my RAW (master):
defmod fhemsync FHEMSYNC
attr fhemsync FHEMSync-auth crypt:55005817455555555555555509490206
attr fhemsync FHEMSync-log ./log/fhemsync-%Y-%m-%d.log
attr fhemsync FHEMSync-port 8083
attr fhemsync FHEMSync-selfsignedcert true
attr fhemsync FHEMSync-server 127.0.0.1
attr fhemsync FHEMSync-ssl true
attr fhemsync FHEMSync-webname fhem
attr fhemsync devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
attr fhemsync nrarchive 10
attr fhemsync remote-auth crypt:550058174555555555555555555490206
attr fhemsync remote-selfsignedcert true
attr fhemsync remote-server 192.168.xxx.30
attr fhemsync remote-ssl true
attr fhemsync remote2-auth crypt:55005817455555555555555555490206
attr fhemsync remote2-selfsignedcert true
attr fhemsync remote2-server 192.168.xxx.32
attr fhemsync remote2-ssl true
attr fhemsync room FHEMSync
attr fhemsync stateFormat fhemsync
attr fhemsync verbose 0
setstate fhemsync running /usr/bin/fhemsync
setstate fhemsync 2020-05-14 20:20:27 fhemsync running /usr/bin/fhemsync
Any idea?
Regards, Gerd
Hi,
please test with verbose 3; you currently have 0, hence no log. I actually believe it is working, because otherwise there should be errors in the log.
Hm... so now nothing happens at all?!
Verbose 3 is on.
I also see nothing when I restart FHEMSync?!
In the FHEM log:
2020.05.14 20:34:33 3: fhemsync: read: end of file reached while sysread
2020.05.14 20:34:33 3: fhemsync: stopped
2020.05.14 20:34:33 3: fhemsync: starting
2020.05.14 20:34:33 3: fhemsync: using logfile: ./log/fhemsync-2020-05-14.log
2020.05.14 20:34:53 3: fhemsync: read: end of file reached while sysread
2020.05.14 20:34:53 3: fhemsync: stopped
2020.05.14 20:34:53 3: fhemsync: starting
2020.05.14 20:34:53 3: fhemsync: using logfile: ./log/fhemsync-2020-05-14.log
In the FHEMSync log:
{"version":"2.8.4","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true}
[14.5.2020, 20:34:34] [MAIN ] Starting FHEMSync version 2.8.4...
[14.5.2020, 20:34:35] [MAIN ] Options: {"version":"2.8.4","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true,"verbose":true}
[14.5.2020, 20:34:55] [MAIN ] Starting FHEMSync version 2.8.4...
[14.5.2020, 20:34:55] [MAIN ] Options: {"version":"2.8.4","fhem":true,"port":true,"webname":true,"auth":true,"device":true,"selfSignedCert":true,"verbose":true}
Hmm... please try verbose=5
It's running now.
I deleted SSL and the related attributes since no HTTPS is active.
Thanks, Dominik
Module version 0.9.9 and fhemsync.js version 2.8.9 (updated 2020-05-20) have been running without problems for me so far.
Thanks, Dominik :D
Regards, Gerd
Quote from: Maista on 21 May 2020, 14:59:03
Module version 0.9.9 and fhemsync.js version 2.8.9 (updated 2020-05-20) have been running without problems for me so far.
Thanks, Dominik :D
Regards, Gerd
Version 2.8.10 is available.
Thanks, Dominik.
Quote from: mnennstiel on 05 April 2020, 01:25:43
Hello Dominik,
could you extend the module to handle several remote Raspberry Pis? I would like my main Raspberry Pi to collect data from multiple Pis around the house.
Regards, Maik
Has there been any progress on this?
Regards, Maik
Quote from: mnennstiel on 11 June 2020, 18:16:25
Has there been any progress on this?
Regards, Maik
Yes, it has been working for a while now.
I have two slaves.
Regards, Gerd
Hello Gerd,
how does that work? How do I need to configure it?
Quote from: mnennstiel on 11 June 2020, 22:38:43
Hello Gerd,
how does that work? How do I need to configure it?
OK - I downloaded the current FHEMSYNC module again and reloaded it.
The module can now handle 5 remote Raspberry Pis - great!!!
@Dominik: Would it be possible to make the room in the main instance, where the FHEMSYNC devices end up, configurable via an attribute?
I have my FHEMSYNC room nested inside another room, so I always have to adjust the room manually.
Regards, Maik
Hallo Maik
Falls du das noch benötigst
defmod fhemsync FHEMSYNC
attr fhemsync FHEMSync-auth crypt:<password must be set>
attr fhemsync FHEMSync-log ./log/fhemsync-%Y-%m-%d.log
attr fhemsync FHEMSync-port 8083
attr fhemsync FHEMSync-selfsignedcert false
attr fhemsync FHEMSync-server 127.0.0.1
attr fhemsync FHEMSync-ssl false
attr fhemsync FHEMSync-webname fhem
attr fhemsync devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
attr fhemsync nrarchive 10
attr fhemsync remote-auth crypt:<password must be set>
attr fhemsync remote-server 192.168.178.xx
attr fhemsync remote2-auth crypt:<password must be set>
attr fhemsync remote2-server 192.168.178.xx
attr fhemsync room FHEMSync
attr fhemsync stateFormat fhemsync
attr fhemsync verbose 0
I don't have HTTPS active. In the "allowed" settings for the slaves' WEB devices I also entered the master's IP address.
Regards, Gerd
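For reference, a hedged sketch of what securing a remote FHEMWEB could look like (device names, credentials, and the IP are placeholders; attribute names are taken from FHEM's allowed module and FHEMWEB, so check your commandref before applying):

```
# Require HTTP Basic auth on the FHEMWEB instance "WEB":
define allowed_WEB allowed
attr allowed_WEB validFor WEB
attr allowed_WEB basicAuth username:password

# Optionally restrict clients to the master's IP:
attr WEB allowfrom 192.168.178.20
```

If basicAuth is set here, the matching remote-auth attribute on the master's fhemsync device has to carry the same username:password.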
I hope someone can help me.
My goal is to synchronize certain things between two FHEM installations in both directions.
Here is the definition on the master:
Internals:
CFGFN
FUUID 5f2dbb0b-f33f-0a06-cd92-77bb2550e8e77bb5
LAST_START 2020-08-08 10:56:47
LAST_STOP 2020-08-08 10:57:13
NAME s1nachs2
NR 420
NTFY_ORDER 50-s1nachs2
STARTS 42
STATE stopped
TYPE FHEMSYNC
VERSION 0.9.9
logfile ./log/fhemsync-%Y-%m-%d.log
CoProcess:
cmdFn FHEMSYNC_getCMD
name fhemsync
state stopped
READINGS:
2020-08-08 10:57:13 fhemsync stopped
2020-08-08 10:56:50 version 2.8.10
Attributes:
FHEMSync-auth crypt:xxxxxxxx
FHEMSync-log ./log/fhemsync-%Y-%m-%d.log
FHEMSync-port 8083
FHEMSync-selfsignedcert true
FHEMSync-server 127.0.0.1
FHEMSync-ssl false
FHEMSync-webname fhem
devStateIcon stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
nrarchive 10
remote-auth crypt:xxxxxxx
remote-filter room=FHEMSyncS1
remote-port 8083
remote-selfsignedcert true
remote-server 192.168.0.61
remote-ssl true
remote-webname fhem
room FHEMSync
stateFormat fhemsync
verbose 5
I started with a MAX! window contact. It then became uncontrollable (open-closed-open-closed...).
After that I created a dummy on the slave.
Internals:
CFGFN
FUUID 5f2dcb5f-f33f-b917-8ec3-5aa5b0c866c196ed
NAME fsync
NR 2566
STATE off
TYPE dummy
READINGS:
2020-08-08 10:57:13 state off
Attributes:
room FHEMSyncS1,Anfang
setList on off
When I now toggle the switch on the slave, the following appears in the logfile with verbose 5 (excerpt covering just one second):
[code][8.8.2020, 10:57:12] [MASTER ] ["fsync","on","<div id=\"fsync\" title=\"on\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync off&room=FHEMSync&fwcsrf=csrf_460676519129777\"><svg class=\" on\" data-txt=\"on\" version=\"1.0\" xmlns=\"http://www.w3.org/2000/svg\" width=\"468pt\" height=\"537pt\" viewBox=\"0 0 468 537\" preserveAspectRatio=\"xMidYMid meet\"> <metadata> Created by potrace 1.8, written by Peter Selinger 2001-2007 </metadata> <g transform=\"translate(0,537) scale(0.181395,-0.181395)\" stroke=\"none\"> <path d=\"M957 2932 c-14 -16 -17 -43 -17 -174 0 -135 2 -157 18 -171 28 -25 72 -26 96 -1 13 13 16 43 16 173 0 140 -2 160 -18 174 -25 22 -75 21 -95 -1z\"/> <path d=\"M1506 2928 c-13 -18 -16 -53 -16 -174 0 -138 2 -152 20 -169 24 -22 77 -22 99 0 13 12 17 44 19 151 4 147 1 174 -24 198 -24 24 -80 20 -98 -6z\"/> <path d=\"M278 2834 c-29 -15 -44 -50 -34 -81 3 -11 73 -85 154 -166 127 -126 153 -147 180 -147 34 0 72 38 72 73 0 34 -312 341 -342 337 -2 -1 -15 -7 -30 -16z\"/> <path d=\"M2235 2840 c-34 -14 -315 -305 -315 -327 0 -35 38 -73 72 -73 27 0 53 21 180 148 82 81 151 157 155 170 12 51 -44 100 -92 82z\"/> <path d=\"M1039 2257 c-70 -20 -136 -63 -174 -115 -65 -87 -65 -91 -65 -643 0 -479 1 -500 21 -553 25 -67 87 -134 160 -173 l54 -28 250 0 c235 0 253 1 296 21 63 29 125 94 158 163 l26 56 0 520 0 520 -28 56 c-32 66 -99 132 -165 162 -43 20 -65 22 -267 24 -153 2 -234 -2 -266 -10z m486 -146 c48 -22 69 -44 90 -94 13 -31 15 -107 15 -517 0 -526 0 -523 -59 -573 -48 -40 -90 -47 -299 -47 -205 0 -226 4 -280 54 -51 46 -50 40 -53 567 l-3 495 23 40 c24 43 64 72 115 85 17 4 117 7 221 8 164 0 195 -2 230 -18z\"/> <path d=\"M2110 2123 c-49 -19 -64 -68 -34 -111 15 -22 19 -22 238 -22 211 0 224 1 241 20 23 25 24 76 1 98 -14 15 -44 17 -224 19 -114 1 -214 -1 -222 -4z\"/> <path d=\"M16 2098 c-22 -31 -20 -71 5 -94 19 -17 39 -19 240 -19 236 0 241 1 254 55 4 18 0 34 -15 53 l-21 27 -224 0 c-220 0 -224 0 -239 -22z\"/> <path d=\"M26 1559 c-32 -25 -35 -70 -6 -99 19 -19 33 -20 238 -20 207 0 
220 1 237 20 26 29 24 79 -4 102 -21 16 -44 18 -231 18 -195 0 -209 -1 -234 -21z\"/> <path d=\"M2080 1560 c-23 -23 -26 -68 -6 -96 13 -18 30 -19 233 -22 203 -3 221 -2 243 16 32 26 32 78 1 104 -21 16 -44 18 -237 18 -201 0 -215 -1 -234 -20z\"/> <path d=\"M20 1010 c-29 -29 -26 -74 7 -100 26 -20 36 -21 240 -18 184 3 215 5 229 20 23 22 22 73 -1 98 -17 19 -30 20 -237 20 -205 0 -219 -1 -238 -20z\"/> <path d=\"M2077 1012 c-22 -25 -21 -75 1 -95 16 -15 48 -17 238 -17 120 0 224 4 231 8 32 20 32 94 0 114 -7 4 -111 8 -233 8 -201 0 -222 -2 -237 -18z\"/> <path d=\"M998 693 c-36 -41 -30 -95 11 -108 60 -19 520 -91 539 -84 53 20 66 99 19 119 -28 12 -482 90 -524 90 -16 0 -37 -8 -45 -17z\"/> <path d=\"M1023 530 c-41 -17 -58 -85 -28 -110 15 -12 492 -100 543 -100 33 0 62 34 62 74 0 18 -6 38 -13 44 -6 6 -120 29 -252 51 -132 23 -251 43 -265 46 -14 2 -35 0 -47 -5z\"/> <path d=\"M1023 360 c-46 -18 -59 -90 -20 -114 21 -13 478 -96 531 -96 14 0 35 9 46 20 26 26 27 85 2 99 -10 5 -126 28 -258 51 -131 22 -248 42 -259 44 -11 2 -30 0 -42 -4z\"/> <path d=\"M1101 154 c-26 -33 -27 -55 -3 -82 13 -16 51 -26 176 -47 158 -27 159 -27 185 -7 31 22 41 81 18 101 -16 13 -271 61 -323 61 -23 0 -39 -8 -53 -26z\"/> </g> </svg></a></div>"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync-state","on","on"]
[8.8.2020, 10:57:12] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_460676519129777","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"https:","slashes":true,"auth":null,"host":"127.0.0.1:8083","port":"8083","hostname":"127.0.0.1","hash":null,"search":"?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","query":"XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","href":"https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777"},"method":"GET","headers":{"authorization":"Basic YWRtaW46ZXU2OW1lbEY=","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[8.8.2020, 10:57:12] [MASTER ] ["fsync","off","<div id=\"fsync\" title=\"off\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync on&room=FHEMSync&fwcsrf=csrf_460676519129777\"><svg class=\" off\" data-txt=\"off\" version=\"1.0\" xmlns=\"http://www.w3.org/2000/svg\" width=\"468pt\" height=\"617pt\" viewBox=\"0 0 468 617\" preserveAspectRatio=\"xMidYMid meet\"> <metadata> Created by potrace 1.8, written by Peter Selinger 2001-2007 </metadata> <g transform=\"translate(0,617) scale(0.221801,-0.221801)\" stroke=\"none\"> <path d=\"M756 2765 c-9 -25 -7 -128 3 -144 5 -8 16 -11 25 -8 24 10 19 160 -5 165 -9 2 -19 -4 -23 -13z\"/> <path d=\"M1310 2695 c0 -77 2 -86 18 -83 14 3 17 15 17 83 0 68 -3 80 -17 83 -16 3 -18 -6 -18 -83z\"/> <path d=\"M1806 2581 c-47 -47 -67 -74 -63 -85 4 -9 10 -16 14 -16 17 0 155 148 149 159 -14 21 -32 10 -100 -58z\"/> <path d=\"M220 2622 c0 -23 125 -147 139 -138 21 13 11 32 -52 94 -64 64 -87 75 -87 44z\"/> <path d=\"M809 2257 c-70 -20 -136 -63 -174 -115 -65 -87 -65 -91 -65 -643 0 -479 1 -500 21 -553 25 -67 87 -134 160 -173 l54 -28 250 0 c235 0 253 1 296 21 63 29 125 94 158 163 l26 56 0 520 0 520 -28 56 c-32 66 -99 132 -165 162 -43 20 -65 22 -267 24 -153 2 -234 -2 -266 -10z m486 -146 c48 -22 69 -44 90 -94 13 -31 15 -107 15 -517 0 -526 0 -523 -59 -573 -48 -40 -90 -47 -299 -47 -205 0 -226 4 -280 54 -51 46 -50 40 -53 567 l-3 495 23 40 c24 43 64 72 115 85 17 4 117 7 221 8 164 0 195 -2 230 -18z\"/> <path d=\"M13 2065 c-11 -29 12 -36 118 -33 96 3 104 4 104 23 0 19 -8 20 -108 23 -91 2 -108 0 -114 -13z\"/> <path d=\"M1877 2066 c-11 -27 17 -36 113 -36 100 0 127 8 117 34 -5 13 -25 16 -116 16 -83 0 -110 -3 -114 -14z\"/> <path d=\"M10 1510 c0 -19 6 -20 116 -20 105 0 115 2 112 18 -3 15 -18 17 -116 20 -107 3 -112 2 -112 -18z\"/> <path d=\"M1876 1522 c-2 -4 -1 -14 3 -20 5 -9 38 -12 117 -10 89 2 109 6 109 18 0 12 -20 16 -112 18 -61 1 -114 -1 -117 -6z\"/> <path d=\"M21 981 c-8 -5 -11 -16 -8 -25 5 -13 24 -16 112 -16 88 0 107 3 112 16 10 26 -16 34 -112 34 -50 0 -96 -4 
-104 -9z\"/> <path d=\"M1882 978 c-8 -8 -9 -15 -1 -25 8 -9 40 -13 108 -13 101 0 128 8 118 34 -5 13 -24 16 -110 16 -66 0 -107 -4 -115 -12z\"/> <path d=\"M768 693 c-36 -41 -30 -95 11 -108 60 -19 520 -91 539 -84 53 20 66 99 19 119 -28 12 -482 90 -524 90 -16 0 -37 -8 -45 -17z\"/> <path d=\"M793 530 c-41 -17 -58 -85 -28 -110 15 -12 492 -100 543 -100 33 0 62 34 62 74 0 18 -6 38 -13 44 -6 6 -120 29 -252 51 -132 23 -251 43 -265 46 -14 2 -35 0 -47 -5z\"/> <path d=\"M793 360 c-46 -18 -59 -90 -20 -114 21 -13 478 -96 531 -96 14 0 35 9 46 20 26 26 27 85 2 99 -10 5 -126 28 -258 51 -131 22 -248 42 -259 44 -11 2 -30 0 -42 -4z\"/> <path d=\"M871 154 c-26 -33 -27 -55 -3 -82 13 -16 51 -26 176 -47 158 -27 159 -27 185 -7 31 22 41 81 18 101 -16 13 -271 61 -323 61 -23 0 -39 -8 -53 -26z\"/> </g> </svg></a></div>"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync-state","off","off"]
[8.8.2020, 10:57:12] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_460676519129777","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"https:","slashes":true,"auth":null,"host":"127.0.0.1:8083","port":"8083","hostname":"127.0.0.1","hash":null,"search":"?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777","query":"XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777","href":"https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777"},"method":"GET","headers":{"authorization":"Basic YWRtaW46ZXU2OW1lbEY=","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync","on","<div id=\"fsync\" title=\"on\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync off&room=FHEMSyncS1&fwcsrf=csrf_864756115167164\"><svg class=\" on\" data-txt=\"on\" version=\"1.0\" xmlns=\"http://www.w3.org/2000/svg\" width=\"468pt\" height=\"537pt\" viewBox=\"0 0 468 537\" preserveAspectRatio=\"xMidYMid meet\"> <metadata> Created by potrace 1.8, written by Peter Selinger 2001-2007 </metadata> <g transform=\"translate(0,537) scale(0.181395,-0.181395)\" stroke=\"none\"> <path d=\"M957 2932 c-14 -16 -17 -43 -17 -174 0 -135 2 -157 18 -171 28 -25 72 -26 96 -1 13 13 16 43 16 173 0 140 -2 160 -18 174 -25 22 -75 21 -95 -1z\"/> <path d=\"M1506 2928 c-13 -18 -16 -53 -16 -174 0 -138 2 -152 20 -169 24 -22 77 -22 99 0 13 12 17 44 19 151 4 147 1 174 -24 198 -24 24 -80 20 -98 -6z\"/> <path d=\"M278 2834 c-29 -15 -44 -50 -34 -81 3 -11 73 -85 154 -166 127 -126 153 -147 180 -147 34 0 72 38 72 73 0 34 -312 341 -342 337 -2 -1 -15 -7 -30 -16z\"/> <path d=\"M2235 2840 c-34 -14 -315 -305 -315 -327 0 -35 38 -73 72 -73 27 0 53 21 180 148 82 81 151 157 155 170 12 51 -44 100 -92 82z\"/> <path d=\"M1039 2257 c-70 -20 -136 -63 -174 -115 -65 -87 -65 -91 -65 -643 0 -479 1 -500 21 -553 25 -67 87 -134 160 -173 l54 -28 250 0 c235 0 253 1 296 21 63 29 125 94 158 163 l26 56 0 520 0 520 -28 56 c-32 66 -99 132 -165 162 -43 20 -65 22 -267 24 -153 2 -234 -2 -266 -10z m486 -146 c48 -22 69 -44 90 -94 13 -31 15 -107 15 -517 0 -526 0 -523 -59 -573 -48 -40 -90 -47 -299 -47 -205 0 -226 4 -280 54 -51 46 -50 40 -53 567 l-3 495 23 40 c24 43 64 72 115 85 17 4 117 7 221 8 164 0 195 -2 230 -18z\"/> <path d=\"M2110 2123 c-49 -19 -64 -68 -34 -111 15 -22 19 -22 238 -22 211 0 224 1 241 20 23 25 24 76 1 98 -14 15 -44 17 -224 19 -114 1 -214 -1 -222 -4z\"/> <path d=\"M16 2098 c-22 -31 -20 -71 5 -94 19 -17 39 -19 240 -19 236 0 241 1 254 55 4 18 0 34 -15 53 l-21 27 -224 0 c-220 0 -224 0 -239 -22z\"/> <path d=\"M26 1559 c-32 -25 -35 -70 -6 -99 19 -19 33 -20 238 -20 207 0 220 1 
237 20 26 29 24 79 -4 102 -21 16 -44 18 -231 18 -195 0 -209 -1 -234 -21z\"/> <path d=\"M2080 1560 c-23 -23 -26 -68 -6 -96 13 -18 30 -19 233 -22 203 -3 221 -2 243 16 32 26 32 78 1 104 -21 16 -44 18 -237 18 -201 0 -215 -1 -234 -20z\"/> <path d=\"M20 1010 c-29 -29 -26 -74 7 -100 26 -20 36 -21 240 -18 184 3 215 5 229 20 23 22 22 73 -1 98 -17 19 -30 20 -237 20 -205 0 -219 -1 -238 -20z\"/> <path d=\"M2077 1012 c-22 -25 -21 -75 1 -95 16 -15 48 -17 238 -17 120 0 224 4 231 8 32 20 32 94 0 114 -7 4 -111 8 -233 8 -201 0 -222 -2 -237 -18z\"/> <path d=\"M998 693 c-36 -41 -30 -95 11 -108 60 -19 520 -91 539 -84 53 20 66 99 19 119 -28 12 -482 90 -524 90 -16 0 -37 -8 -45 -17z\"/> <path d=\"M1023 530 c-41 -17 -58 -85 -28 -110 15 -12 492 -100 543 -100 33 0 62 34 62 74 0 18 -6 38 -13 44 -6 6 -120 29 -252 51 -132 23 -251 43 -265 46 -14 2 -35 0 -47 -5z\"/> <path d=\"M1023 360 c-46 -18 -59 -90 -20 -114 21 -13 478 -96 531 -96 14 0 35 9 46 20 26 26 27 85 2 99 -10 5 -126 28 -258 51 -131 22 -248 42 -259 44 -11 2 -30 0 -42 -4z\"/> <path d=\"M1101 154 c-26 -33 -27 -55 -3 -82 13 -16 51 -26 176 -47 158 -27 159 -27 185 -7 31 22 41 81 18 101 -16 13 -271 61 -323 61 -23 0 -39 -8 -53 -26z\"/> </g> </svg></a></div>"]
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync-state","on","on"]
[8.8.2020, 10:57:12] [SLAVE1 ] update reading: fsync-state => on
[8.8.2020, 10:57:12] [MAIN ] starting FHEM_execute_await
[8.8.2020, 10:57:12] [MASTER ] executing: https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync","off","<div id=\"fsync\" title=\"off\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync on&room=FHEMSyncS1&fwcsrf=csrf_864756115167164\"><svg class=\" off\" data-txt=\"off\" version=\"1.0\" xmlns=\"http://www.w3.org/2000/svg\" width=\"468pt\" height=\"617pt\" viewBox=\"0 0 468 617\" preserveAspectRatio=\"xMidYMid meet\"> <metadata> Created by potrace 1.8, written by Peter Selinger 2001-2007 </metadata> <g transform=\"translate(0,617) scale(0.221801,-0.221801)\" stroke=\"none\"> <path d=\"M756 2765 c-9 -25 -7 -128 3 -144 5 -8 16 -11 25 -8 24 10 19 160 -5 165 -9 2 -19 -4 -23 -13z\"/> <path d=\"M1310 2695 c0 -77 2 -86 18 -83 14 3 17 15 17 83 0 68 -3 80 -17 83 -16 3 -18 -6 -18 -83z\"/> <path d=\"M1806 2581 c-47 -47 -67 -74 -63 -85 4 -9 10 -16 14 -16 17 0 155 148 149 159 -14 21 -32 10 -100 -58z\"/> <path d=\"M220 2622 c0 -23 125 -147 139 -138 21 13 11 32 -52 94 -64 64 -87 75 -87 44z\"/> <path d=\"M809 2257 c-70 -20 -136 -63 -174 -115 -65 -87 -65 -91 -65 -643 0 -479 1 -500 21 -553 25 -67 87 -134 160 -173 l54 -28 250 0 c235 0 253 1 296 21 63 29 125 94 158 163 l26 56 0 520 0 520 -28 56 c-32 66 -99 132 -165 162 -43 20 -65 22 -267 24 -153 2 -234 -2 -266 -10z m486 -146 c48 -22 69 -44 90 -94 13 -31 15 -107 15 -517 0 -526 0 -523 -59 -573 -48 -40 -90 -47 -299 -47 -205 0 -226 4 -280 54 -51 46 -50 40 -53 567 l-3 495 23 40 c24 43 64 72 115 85 17 4 117 7 221 8 164 0 195 -2 230 -18z\"/> <path d=\"M13 2065 c-11 -29 12 -36 118 -33 96 3 104 4 104 23 0 19 -8 20 -108 23 -91 2 -108 0 -114 -13z\"/> <path d=\"M1877 2066 c-11 -27 17 -36 113 -36 100 0 127 8 117 34 -5 13 -25 16 -116 16 -83 0 -110 -3 -114 -14z\"/> <path d=\"M10 1510 c0 -19 6 -20 116 -20 105 0 115 2 112 18 -3 15 -18 17 -116 20 -107 3 -112 2 -112 -18z\"/> <path d=\"M1876 1522 c-2 -4 -1 -14 3 -20 5 -9 38 -12 117 -10 89 2 109 6 109 18 0 12 -20 16 -112 18 -61 1 -114 -1 -117 -6z\"/> <path d=\"M21 981 c-8 -5 -11 -16 -8 -25 5 -13 24 -16 112 -16 88 0 107 3 112 16 10 26 -16 34 -112 34 -50 0 -96 -4 
-104 -9z\"/> <path d=\"M1882 978 c-8 -8 -9 -15 -1 -25 8 -9 40 -13 108 -13 101 0 128 8 118 34 -5 13 -24 16 -110 16 -66 0 -107 -4 -115 -12z\"/> <path d=\"M768 693 c-36 -41 -30 -95 11 -108 60 -19 520 -91 539 -84 53 20 66 99 19 119 -28 12 -482 90 -524 90 -16 0 -37 -8 -45 -17z\"/> <path d=\"M793 530 c-41 -17 -58 -85 -28 -110 15 -12 492 -100 543 -100 33 0 62 34 62 74 0 18 -6 38 -13 44 -6 6 -120 29 -252 51 -132 23 -251 43 -265 46 -14 2 -35 0 -47 -5z\"/> <path d=\"M793 360 c-46 -18 -59 -90 -20 -114 21 -13 478 -96 531 -96 14 0 35 9 46 20 26 26 27 85 2 99 -10 5 -126 28 -258 51 -131 22 -248 42 -259 44 -11 2 -30 0 -42 -4z\"/> <path d=\"M871 154 c-26 -33 -27 -55 -3 -82 13 -16 51 -26 176 -47 158 -27 159 -27 185 -7 31 22 41 81 18 101 -16 13 -271 61 -323 61 -23 0 -39 -8 -53 -26z\"/> </g> </svg></a></div>"]
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync-state","off","off"]
[8.8.2020, 10:57:12] [SLAVE1 ] update reading: fsync-state => off
[8.8.2020, 10:57:12] [MAIN ] starting FHEM_execute_await
[8.8.2020, 10:57:12] [MASTER ] executing: https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777
[8.8.2020, 10:57:12] [MASTER ] ["fsync","on","<div id=\"fsync\" title=\"on\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync off&room=FHEMSync&fwcsrf=csrf_460676519129777\"><svg class=\" on\" data-txt=\"on\" version=\"1.0\" xmlns=\"http://www.w3.org/2000/svg\" width=\"468pt\" height=\"537pt\" viewBox=\"0 0 468 537\" preserveAspectRatio=\"xMidYMid meet\"> <metadata> Created by potrace 1.8, written by Peter Selinger 2001-2007 </metadata> <g transform=\"translate(0,537) scale(0.181395,-0.181395)\" stroke=\"none\"> <path d=\"M957 2932 c-14 -16 -17 -43 -17 -174 0 -135 2 -157 18 -171 28 -25 72 -26 96 -1 13 13 16 43 16 173 0 140 -2 160 -18 174 -25 22 -75 21 -95 -1z\"/> <path d=\"M1506 2928 c-13 -18 -16 -53 -16 -174 0 -138 2 -152 20 -169 24 -22 77 -22 99 0 13 12 17 44 19 151 4 147 1 174 -24 198 -24 24 -80 20 -98 -6z\"/> <path d=\"M278 2834 c-29 -15 -44 -50 -34 -81 3 -11 73 -85 154 -166 127 -126 153 -147 180 -147 34 0 72 38 72 73 0 34 -312 341 -342 337 -2 -1 -15 -7 -30 -16z\"/> <path d=\"M2235 2840 c-34 -14 -315 -305 -315 -327 0 -35 38 -73 72 -73 27 0 53 21 180 148 82 81 151 157 155 170 12 51 -44 100 -92 82z\"/> <path d=\"M1039 2257 c-70 -20 -136 -63 -174 -115 -65 -87 -65 -91 -65 -643 0 -479 1 -500 21 -553 25 -67 87 -134 160 -173 l54 -28 250 0 c235 0 253 1 296 21 63 29 125 94 158 163 l26 56 0 520 0 520 -28 56 c-32 66 -99 132 -165 162 -43 20 -65 22 -267 24 -153 2 -234 -2 -266 -10z m486 -146 c48 -22 69 -44 90 -94 13 -31 15 -107 15 -517 0 -526 0 -523 -59 -573 -48 -40 -90 -47 -299 -47 -205 0 -226 4 -280 54 -51 46 -50 40 -53 567 l-3 495 23 40 c24 43 64 72 115 85 17 4 117 7 221 8 164 0 195 -2 230 -18z\"/> <path d=\"M2110 2123 c-49 -19 -64 -68 -34 -111 15 -22 19 -22 238 -22 211 0 224 1 241 20 23 25 24 76 1 98 -14 15 -44 17 -224 19 -114 1 -214 -1 -222 -4z\"/> <path d=\"M16 2098 c-22 -31 -20 -71 5 -94 19 -17 39 -19 240 -19 236 0 241 1 254 55 4 18 0 34 -15 53 l-21 27 -224 0 c-220 0 -224 0 -239 -22z\"/> <path d=\"M26 1559 c-32 -25 -35 -70 -6 -99 19 -19 33 -20 238 -20 207 0 220 1 
237 20 26 29 24 79 -4 102 -21 16 -44 18 -231 18 -195 0 -209 -1 -234 -21z\"/> <path d=\"M2080 1560 c-23 -23 -26 -68 -6 -96 13 -18 30 -19 233 -22 203 -3 221 -2 243 16 32 26 32 78 1 104 -21 16 -44 18 -237 18 -201 0 -215 -1 -234 -20z\"/> <path d=\"M20 1010 c-29 -29 -26 -74 7 -100 26 -20 36 -21 240 -18 184 3 215 5 229 20 23 22 22 73 -1 98 -17 19 -30 20 -237 20 -205 0 -219 -1 -238 -20z\"/> <path d=\"M2077 1012 c-22 -25 -21 -75 1 -95 16 -15 48 -17 238 -17 120 0 224 4 231 8 32 20 32 94 0 114 -7 4 -111 8 -233 8 -201 0 -222 -2 -237 -18z\"/> <path d=\"M998 693 c-36 -41 -30 -95 11 -108 60 -19 520 -91 539 -84 53 20 66 99 19 119 -28 12 -482 90 -524 90 -16 0 -37 -8 -45 -17z\"/> <path d=\"M1023 530 c-41 -17 -58 -85 -28 -110 15 -12 492 -100 543 -100 33 0 62 34 62 74 0 18 -6 38 -13 44 -6 6 -120 29 -252 51 -132 23 -251 43 -265 46 -14 2 -35 0 -47 -5z\"/> <path d=\"M1023 360 c-46 -18 -59 -90 -20 -114 21 -13 478 -96 531 -96 14 0 35 9 46 20 26 26 27 85 2 99 -10 5 -126 28 -258 51 -131 22 -248 42 -259 44 -11 2 -30 0 -42 -4z\"/> <path d=\"M1101 154 c-26 -33 -27 -55 -3 -82 13 -16 51 -26 176 -47 158 -27 159 -27 185 -7 31 22 41 81 18 101 -16 13 -271 61 -323 61 -23 0 -39 -8 -53 -26z\"/> </g> </svg></a></div>"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync-state","on","on"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync","off","<div id=\"fsync\" title=\"off\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync on&room=FHEMSync&fwcsrf=csrf_460676519129777\">…[inline SVG toggle-icon markup truncated]…</a></div>"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync-state","off","off"]
[8.8.2020, 10:57:12] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_460676519129777","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"https:","slashes":true,"auth":null,"host":"127.0.0.1:8083","port":"8083","hostname":"127.0.0.1","hash":null,"search":"?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","query":"XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","href":"https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777"},"method":"GET","headers":{"authorization":"Basic YWRtaW46ZXU2OW1lbEY=","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync","on","<div id=\"fsync\" title=\"on\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync off&room=FHEMSyncS1&fwcsrf=csrf_864756115167164\">…[inline SVG toggle-icon markup truncated]…</a></div>"]
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync-state","on","on"]
[8.8.2020, 10:57:12] [SLAVE1 ] update reading: fsync-state => on
[8.8.2020, 10:57:12] [MAIN ] starting FHEM_execute_await
[8.8.2020, 10:57:12] [MASTER ] executing: https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777
[8.8.2020, 10:57:12] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_460676519129777","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"https:","slashes":true,"auth":null,"host":"127.0.0.1:8083","port":"8083","hostname":"127.0.0.1","hash":null,"search":"?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777","query":"XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777","href":"https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777"},"method":"GET","headers":{"authorization":"Basic YWRtaW46ZXU2OW1lbEY=","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync","off","<div id=\"fsync\" title=\"off\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync on&room=FHEMSyncS1&fwcsrf=csrf_864756115167164\">…[inline SVG toggle-icon markup truncated]…</a></div>"]
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync-state","off","off"]
[8.8.2020, 10:57:12] [SLAVE1 ] update reading: fsync-state => off
[8.8.2020, 10:57:12] [MAIN ] starting FHEM_execute_await
[8.8.2020, 10:57:12] [MASTER ] executing: https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777
[8.8.2020, 10:57:12] [MASTER ] ["fsync","on","<div id=\"fsync\" title=\"on\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync off&room=FHEMSync&fwcsrf=csrf_460676519129777\">…[inline SVG toggle-icon markup truncated]…</a></div>"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync-state","on","on"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync","off","<div id=\"fsync\" title=\"off\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync on&room=FHEMSync&fwcsrf=csrf_460676519129777\">…[inline SVG toggle-icon markup truncated]…</a></div>"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync-state","off","off"]
[8.8.2020, 10:57:12] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_460676519129777","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"https:","slashes":true,"auth":null,"host":"127.0.0.1:8083","port":"8083","hostname":"127.0.0.1","hash":null,"search":"?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","query":"XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","href":"https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777"},"method":"GET","headers":{"authorization":"Basic YWRtaW46ZXU2OW1lbEY=","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[8.8.2020, 10:57:12] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_460676519129777","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"https:","slashes":true,"auth":null,"host":"127.0.0.1:8083","port":"8083","hostname":"127.0.0.1","hash":null,"search":"?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777","query":"XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777","href":"https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777"},"method":"GET","headers":{"authorization":"Basic YWRtaW46ZXU2OW1lbEY=","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync","on","<div id=\"fsync\" title=\"on\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync off&room=FHEMSyncS1&fwcsrf=csrf_864756115167164\">…[inline SVG toggle-icon markup truncated]…</a></div>"]
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync-state","on","on"]
[8.8.2020, 10:57:12] [SLAVE1 ] update reading: fsync-state => on
[8.8.2020, 10:57:12] [MAIN ] starting FHEM_execute_await
[8.8.2020, 10:57:12] [MASTER ] executing: https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync","off","<div id=\"fsync\" title=\"off\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync on&room=FHEMSyncS1&fwcsrf=csrf_864756115167164\">…[inline SVG toggle-icon markup truncated]…</a></div>"]
[8.8.2020, 10:57:12] [SLAVE1 ] ["fsync-state","off","off"]
[8.8.2020, 10:57:12] [SLAVE1 ] update reading: fsync-state => off
[8.8.2020, 10:57:12] [MAIN ] starting FHEM_execute_await
[8.8.2020, 10:57:12] [MASTER ] executing: https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=csrf_460676519129777
[8.8.2020, 10:57:12] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_460676519129777","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"https:","slashes":true,"auth":null,"host":"127.0.0.1:8083","port":"8083","hostname":"127.0.0.1","hash":null,"search":"?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","query":"XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","pathname":"/fhem","path":"/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777","href":"https://127.0.0.1:8083/fhem?XHR=1&cmd=setreading%20fsync%20state%20on&fwcsrf=csrf_460676519129777"},"method":"GET","headers":{"authorization":"Basic YWRtaW46ZXU2OW1lbEY=","accept-encoding":"gzip, deflate","accept":"application/json"}}}
[8.8.2020, 10:57:12] [MASTER ] ["fsync","on","<div id=\"fsync\" title=\"on\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync off&room=FHEMSync&fwcsrf=csrf_460676519129777\">…[inline SVG toggle-icon markup truncated]…</a></div>"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync-state","on","on"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync","off","<div id=\"fsync\" title=\"off\" class=\"col2\"><a href=\"/fhem?cmd.fsync=set fsync on&room=FHEMSync&fwcsrf=csrf_460676519129777\">…[inline SVG toggle-icon markup truncated]…</a></div>"]
[8.8.2020, 10:57:12] [MASTER ] ["fsync-state","off","off"]
[8.8.2020, 10:57:12] [MASTER ] response: {"statusCode":200,"headers":{"content-length":"20","cache-control":"no-cache, no-store, must-revalidate","content-encoding":"gzip","x-fhem-csrftoken":"csrf_460676519129777","content-type":"text/plain; charset=UTF-8"},"request":{"uri":{"protocol":"https:","slashes":true,"auth":null,"host":"127.0.0.1:8083","port":"8083","hostname":"127.0.0.1","hash":null,"search":"?XHR=1&cmd=setreading%20fsync%20state%20off&fwcsrf=cs
Hi,
did you define FHEMSync ONLY on the master and NOT on the slave? FHEMSync must run on the master only.
The device to be synchronized must be created ONLY on the slave, in the room FHEMSync; on the master it is then created automatically.
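As a sketch, the intended split would look like this (attribute names as described in the head post; the device name FS20_1e1b04 is taken from the log below and stands in for any device you want to sync):

```
# Master only: define the bridge and point it at the slave
define fhemsync FHEMSYNC
attr fhemsync remote-server IP-OF-REMOTE-FHEM

# Slave: no FHEMSYNC define at all; just assign the device to the sync room
attr FS20_1e1b04 room FHEMSync
```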
Quote from: dominik on 08 August 2020, 12:17:43
Hi,
did you define FHEMSync ONLY on the master and NOT on the slave? FHEMSync must run on the master only.
The device to be synchronized must be created ONLY on the slave, in the room FHEMSync; on the master it is then created automatically.
Hello Dominik,
yes, it is defined only on the master, although at first I had defined a master of its own on each server, with separate rooms "FHEMSync1" and "FHEMSync2".
I deleted the second one, though, since it turned out not to be as simple as I had thought. The fhemsync npm package is installed on both FHEM servers.
Thanks and regards
Uli
Update:
npm fhemsync jetzt auf dem zweiten Server deinstalliert.
Leider keine Besserung.
Für mich sieht es so aus als ob der Slave zum Master den Befehl sendet und dann der Master zurück an den Slave. Und dadurch ein Ping-Pong entsteht.
With verbose 4 and an FS20 device, the following appears in the logfile:
[8.8.2020, 22:47:41] [MASTER ] received set command: FS20_1e1b04, on
[8.8.2020, 22:47:41] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:41] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:41] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:41] [SLAVE1 ] update reading: FS20_1e1b04-state => off
[8.8.2020, 22:47:41] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => COMMANDSET,on
[8.8.2020, 22:47:42] [MASTER ] received set command: FS20_1e1b04, on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:42] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => off
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => COMMANDSET,on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
[8.8.2020, 22:47:43] [SLAVE1 ] update reading: FS20_1e1b04-state => on
It would be great if someone could help me. FHEM2FHEM causes extreme system load: several hundred MB of RAM usage and one CPU core at 100%.
Regards
Uli
Morning Uli,
why would your slave send data to the master?
For me, the master connects to the slave and fetches the data of the devices in the room "FHEMSync" from there.
So far this works without problems, with two slaves.
Regards, Gerd
Quote from: Maista on 08 August 2020, 23:18:57
Morning Uli,
why would your slave send data to the master?
For me, the master connects to the slave and fetches the data of the devices in the room "FHEMSync" from there.
So far this works without problems, with two slaves.
Regards, Gerd
Hi Gerd,
I found the "error": FHEMSync and FHEM2FHEM get in each other's way.
Thanks for your thoughts. Synchronizing twice makes no sense.
Regards
Uli
Morning Uli,
yes, you are probably right ;D
Sometimes it really is that simple ;)
Regards, Gerd
Unfortunately I keep finding that FHEMSync crashes.
My suspicion is that these error messages in the log are the cause:
(node:11827) UnhandledPromiseRejectionWarning: RequestError: Error: read ECONNRESET
at new RequestError (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/errors.js:14:15)
at Request.plumbing.callback (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:87:29)
at Request.RP$callback [as _callback] (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:46:31)
at self.callback (/usr/local/lib/node_modules/fhemsync/node_modules/request/request.js:185:22)
at emitOne (events.js:116:13)
at Request.emit (events.js:211:7)
at Request.onRequestError (/usr/local/lib/node_modules/fhemsync/node_modules/request/request.js:877:8)
at emitOne (events.js:116:13)
at ClientRequest.emit (events.js:211:7)
at TLSSocket.socketErrorListener (_http_client.js:387:9)
at emitOne (events.js:116:13)
at TLSSocket.emit (events.js:211:7)
at emitErrorNT (internal/streams/destroy.js:64:8)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickCallback (internal/process/next_tick.js:180:9)
(node:11827) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 554)
(node:11827) UnhandledPromiseRejectionWarning: RequestError: Error: read ECONNRESET
at new RequestError (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/errors.js:14:15)
at Request.plumbing.callback (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:87:29)
at Request.RP$callback [as _callback] (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:46:31)
at self.callback (/usr/local/lib/node_modules/fhemsync/node_modules/request/request.js:185:22)
at emitOne (events.js:116:13)
at Request.emit (events.js:211:7)
at Request.onRequestError (/usr/local/lib/node_modules/fhemsync/node_modules/request/request.js:877:8)
at emitOne (events.js:116:13)
at ClientRequest.emit (events.js:211:7)
at TLSSocket.socketErrorListener (_http_client.js:387:9)
at emitOne (events.js:116:13)
at TLSSocket.emit (events.js:211:7)
at emitErrorNT (internal/streams/destroy.js:64:8)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickCallback (internal/process/next_tick.js:180:9)
(node:11827) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 555)
(node:11827) UnhandledPromiseRejectionWarning: RequestError: Error: read ECONNRESET
at new RequestError (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/errors.js:14:15)
at Request.plumbing.callback (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:87:29)
at Request.RP$callback [as _callback] (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:46:31)
at self.callback (/usr/local/lib/node_modules/fhemsync/node_modules/request/request.js:185:22)
at emitOne (events.js:116:13)
at Request.emit (events.js:211:7)
at Request.onRequestError (/usr/local/lib/node_modules/fhemsync/node_modules/request/request.js:877:8)
at emitOne (events.js:116:13)
at ClientRequest.emit (events.js:211:7)
at TLSSocket.socketErrorListener (_http_client.js:387:9)
at emitOne (events.js:116:13)
at TLSSocket.emit (events.js:211:7)
at emitErrorNT (internal/streams/destroy.js:64:8)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickCallback (internal/process/next_tick.js:180:9)
(node:11827) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 556)
Before and after this, everything is fine and it syncs without problems. The messages appear sporadically, sometimes with a quarter of an hour in between without any of them in the log. An actual reject does not really seem to happen, since the individual devices are synced without errors.
I hope someone has a tip.
Thanks
Regards
Uli
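These warnings typically mean a promise chain has no .catch() attached, so a sporadic ECONNRESET from one HTTP poll bubbles up as an UnhandledPromiseRejectionWarning. Purely as an illustration (safePoll is an invented name, not a function from fhemsync), absorbing such transient errors could look like this:

```javascript
// Hypothetical helper (NOT part of fhemsync): wrap a request-returning
// function so a transient network error such as ECONNRESET is logged
// instead of surfacing as an UnhandledPromiseRejectionWarning.
function safePoll(requestFn) {
  return requestFn().catch((err) => {
    // Swallow the error so the sync loop keeps running; the next
    // polling cycle will simply retry.
    console.error('poll failed, retrying on next cycle:', err.message);
    return null;
  });
}
```

Whether this matches fhemsync's internal structure is a guess from the stack traces; the real fix belongs in fhemsync.js around the request-promise calls.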
Unfortunately my FHEMSync is not working again either; I could use some help.
FHEMSync-auth            XXXXX
FHEMSync-log             ./log/fhemsync-%Y-%m-%w.log
FHEMSync-port            8083
FHEMSync-selfsignedcert  true
FHEMSync-server          127.0.0.1
FHEMSync-ssl             true
FHEMSync-webname         fhem
devStateIcon             stopped:control_home@red:start stopping:control_on_off@orange running.*:control_on_off@green:stop
icon                     it_raspberry
nrarchive                10
remote-auth              XXXX
remote-selfsignedcert    true
remote-server            192.168.178.113
remote-ssl               true
remote-webname           fhem
remote2-auth             XXXXX
remote2-selfsignedcert   true
remote2-server           192.168.178.93
remote2-ssl              true
remote2-webname          fhem
room                     98_System->FHEMSync
stateFormat              fhemsync
verbose                  5
The main FHEM log shows:
2020.09.06 19:18:25 3: fhemsync: read: end of file reached while sysread
2020.09.06 19:18:25 3: fhemsync: stopped
2020.09.06 19:18:25 3: fhemsync: starting
2020.09.06 19:18:25 3: fhemsync: using logfile: ./log/fhemsync-2020-09-0.log
fhemsync shows running and is green, but no new devices are received from the remote Raspberry Pis.
I have a few devices from earlier in the master's FHEMSync room that were once created via a remote; those are still being updated. Also, the FHEMSync log file is empty, nothing in it at all!
On the remotes I have set attr allowfrom 192.168 on the WEB device.
Regards, Maik
Hi, please post the fhemsync log (making sure to remove the passwords from it), since quite a lot is logged in debug mode.
OH MAN, I found it myself in the meantime!
I still had the device name entered somewhere in a DOIF on the master. That prevented the device from being synchronized!
Thanks, solved for now!
But maybe another request: when I try to control the synchronized device via the device overview, i.e. switch it ON or OFF, that is not possible!
Is a device probably not controllable in general?
In my case it is a Zigbee device.
Regards, Maik
Try it on the slave. If you cannot control it there, you will not be able to control it on the master either. If it works there, it should also work on the master.
It works on the slave, although, for your better understanding, there is no SET <device> ON or OFF in the device overview.
Only via webCmd is ON:OFF declared, which, as mentioned, also works!
The error message that appears on the master is "Unknown argument off, choose one of remove:noArg" ... for information, "remove" does exist in the device overview of the slave.
Ah, that explains it. Then add a setList attribute with on off on the slave, or write to the maintainer asking for on/off to be included.
Honestly, I do not know how to add the setList attribute on the slave. The attribute is not available there.
On a dummy I can set the setList attribute, but I do not know how to declare it on my device.
Or does that only work via the dummy workaround?
Hello,
one question: can I also change the state ON/OFF from the master instance?
If yes, it somehow does not work for me. I created a dummy on the remote that is supposed to be set to ON from the master!
Dummy:
genericDeviceType  switch
room               FHEMSync
setList            on off
Please help.
Regards,
Maik
Do you mean that when you set the remote device to off, it should also be off on the master? That definitely works, since the state on the master is only ever set by the remote.
Can I not also set it on the master?
I need to send a signal to the remote via the master.
My Alexa skill is linked to the master (it can only be linked to one Raspberry Pi), but it is supposed to trigger something on the remote.
Yes, of course, you can set it both on the master and on the remote. It should then be synchronized in the respective other direction.
A dummy is really a device like any other; you just have to put it into the FHEMSync room on the remote and then use the created device on the master. Did you perhaps already have a device with the same name on the master beforehand?
It does not work. When I try to set it on the master, I just get an "f".
No logs are being written any more either!
Then the sync as such is probably not working for you right now, is it? Because it makes no difference whether you set the new dummy on the master or an existing device.
It does: when I set it on the remote, the change also shows up on the master.
Can I somehow reset fhemsync completely?
Just restart it; fhemsync does not store any data, so there is nothing to delete.
Here is an excerpt from the log file after all:
(node:28501) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 6)
(node:28501) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'execute' of undefined
at FHEM.executeSlaveSetCmd [as cmdSetFct] (/usr/lib/node_modules/fhemsync/fhemsync.js:584:23)
(node:28501) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 7)
(node:1566) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'execute' of undefined
at FHEM.executeSlaveSetCmd [as cmdSetFct] (/usr/lib/node_modules/fhemsync/fhemsync.js:584:23)
(Use `node --trace-warnings ...` to show where the warning was created)
(node:1566) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:1566) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
Quote from: mnennstiel on 26 September 2020, 23:25:22
Here is an excerpt from the log file after all:
(node:28501) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 6)
(node:28501) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'execute' of undefined
at FHEM.executeSlaveSetCmd [as cmdSetFct] (/usr/lib/node_modules/fhemsync/fhemsync.js:584:23)
(node:28501) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 7)
(node:1566) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'execute' of undefined
at FHEM.executeSlaveSetCmd [as cmdSetFct] (/usr/lib/node_modules/fhemsync/fhemsync.js:584:23)
(Use `node --trace-warnings ...` to show where the warning was created)
(node:1566) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:1566) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
Can anyone tell me what this means?
Is something going wrong with Node?
Hello everyone,
my first test with FHEMSync is clearly more successful than with FHEM2FHEM and RFHEM. I have to move my server within the house and therefore offload the USB devices (JeeLink and ZWave) to a Raspberry Pi. Thanks for the module!
Quote from: GreenFHEMfan on 27 September 2020, 19:08:15
Can anyone tell me what this means?
Is something going wrong with Node?
For the record, and for other newcomers, even though it has been a while: this error can be fixed by (correctly) setting the FHEMSync-auth attribute on the fhemsync device on the master.
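Concretely, that means setting the attribute on the master's fhemsync device, for example (the value is a placeholder; the thread only shows masked values, so check the exact format your FHEMWEB authentication expects):

```
attr fhemsync FHEMSync-auth user:password
```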
Three deprecation warnings appear when installing the npm package:
npm WARN deprecated request@2.88.2: request has been deprecated, see https://github.com/request/request/issues/3142
npm WARN deprecated request-promise@4.2.6: request-promise has been deprecated because it extends the now deprecated request package, see https://github.com/request/request/issues/3142
npm WARN deprecated har-validator@5.1.5: this library is no longer supported
Is this known, does it need fixing and will it be fixed, or how will FHEMSync continue?
Best regards,
Jan
Hi,
I have not developed FHEMSync any further for quite a while now. Currently I am investing a lot of time in the development of fhempy (https://github.com/dominikkarall/fhempy). fhempy has the advantage that you do not need two or more FHEM installations, since fhempy modules can simply be offloaded to peers (i.e. additional hardware). I have written fhempy modules for all my Bluetooth devices this way. Everything is manageable centrally in one FHEM instance, even though the modules run on the peers.
So FHEMSync will no longer be very relevant for me. I can still get rid of the deprecated warnings, though, since I have to replace those libraries for gassistant anyway.
Morning Dominik,
a pity. Then at least you can still incorporate my corrections to the help :-\
To what extent would the fhempy module serve me as a replacement for FHEMSync?
Regards, Gerd
Could you send me a link to the help corrections you mean? Then I will update the first post.
fhempy is only a replacement for FHEMSync if you can get by with the fhempy modules, which will only be the case for a few people.
I sent it to you via Telegram back in 2020.
You wanted to incorporate it when you got around to it ;)
I have sent you the reference via Telegram again.
Hello Dominik,
sad to hear that you are discontinuing support. I use your FHEMSYNC with my FHEM instance in the garage.
Controlling a relay board and push buttons with it works absolutely perfectly...
Since today's FHEM update, however, FHEM no longer starts because of fhemsync. The log shows: fhemsync: CoProcess: no such function: cmdFn
Maybe you could find the time to look into this again.
Thanks in advance
spacy
I can still provide support if it ever stops working. I just would not add any new features.
The CoProcess error is odd, since cmdFn does exist. Could you please reinstall the scripts from the first post?
Hello Dominik,
sorry, I was too quick that day and misread my log. fhemsync is not my problem; it was the Heating Control I still had in use.
Thanks
Hi. I had finally found something that solves my problems.
After a lot of tinkering I got the connection working, moved my first test device into the FHEMSync room, and wanted to post my problem here.
And then I read that this is no longer supported by you.
Maybe you know of an alternative to your module.
Regards,
Manley
Describe the problem or post the log; maybe I can still help. I still use it myself.
FYI: FHEM2FHEM recently received a few additional features, so we have at least decided to retire RFHEM as well.
As far as I understand, FHEMSync can still do a bit more, but maybe the latest FHEM2FHEM changes (set cmd, the keepalive and loopThreshold attributes) will cover your use cases after all.
Thanks for the quick reply.
Unfortunately there is nothing in the logs.
I simply created a dummy lamp with ON and OFF.
On the remote FHEM I can switch the lamp, and it is also switched on the master.
When I switch on the master, I get "Unknown argument ON, choose one of".
The created device somehow does not accept any set.
Remote:
Internals:
FUUID 60aa08c3-f33f-ac20-faaa-9c9d1146c4971177
NAME lampe
NR 15
STATE ON
TYPE dummy
READINGS:
2021-10-02 20:48:53 state ON
Attributes:
room FHEMSync,MinecraftTest
webCmd ON:OFF
Master:
Internals:
DEF dummy lampe
FUUID 61589670-f33f-9a7c-d885-3ce9529e01218c8d
FVERSION 10_FHEMSYNC_DEVICE.pm:?/2021-10-02
NAME lampe
NR 238
REMOTENAME lampe
REMOTETYPE dummy
STATE ON
TYPE FHEMSYNC_DEVICE
READINGS:
2021-10-02 20:48:53 state ON
helper:
setlist
json:
Name lampe
PossibleAttrs alias comment:textField-long eventMap:textField-long group room suppressReading userattr userReadings:textField-long verbose:0,1,2,3,4,5 disable disabledForIntervals readingList setExtensionsEvent:1,0 setList useSetExtensions event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading cmdIcon devStateIcon:textField-long devStateStyle icon sortby webCmd webCmdLabel:textField-long widgetOverride
PossibleSets
Attributes:
room FHEMSync,MinecraftTest
webCmd ON:OFF
Internals:
FUUID 60aa08c3-f33f-ac20-faaa-9c9d1146c4971177
NAME lampe
NR 15
STATE 1
TYPE dummy
Readings:
state:
Time 2021-10-02 20:44:29
Value 1
Attributes:
room FHEMSync
webCmd ON:OFF
I just noticed that, for example, the setList attribute does not exist there either.
I am surprised this works in FHEM at all. A dummy without a setList actually has no commands. webCmd should only be used for commands that are already defined.
Just set "setList on off" and it will work :)
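Applied to the dummy from the listing above, on the remote:

```
attr lampe setList on off
```

Note that the webCmd in the listing uses ON:OFF in upper case; the entries in webCmd should use the same spelling as the commands declared in setList.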
Yesterday I paired the first real device for testing, a wall thermostat from MAX. It has no setlist either, so I cannot control it via the master. And since it has a predefined set list, I would rather not tinker with that. I will test the setList on the dummy again when I get home.
P.S.: damn work.
It also occurs to me that I must be able to set a dummy to an arbitrary value. It would be unfortunate if I needed a setList for everything, e.g. a dummy that stores the time every minute.
Every device that accepts commands already has a set list internally. The setList attribute just allows additional commands to be added. A dummy has no commands by default.
Storing information in a dummy works, of course. But it is not a command, so it is not displayed. You should be able to control the MAX device, though; have you tested it yet?
The same problem with the MAX.
I also cannot give the dummy a value with "set lampe 1" (etc.).
Remote:
Internals:
DEF WallMountedThermostat 1c0792
FUUID 61589be7-f33f-ac20-9803-1962811ec9cecb04
IODev cm
NAME MAX_1c0792
NR 24
NTFY_ORDER 50-MAX_1c0792
STATE 26.6
SVN 23517
TYPE MAX
TimeSlot 0
addr 1c0792
devtype 3
type WallMountedThermostat
webCmd desiredTemperature
READINGS:
2021-10-02 22:36:35 IODev cm
2021-10-02 19:50:54 PairedTo 000000
2021-10-02 20:00:49 RSSI -49.5
2021-10-02 19:50:54 SerialNr QEQ1648061
2021-10-02 19:57:28 battery ok
2021-10-02 19:57:28 batteryState ok
2021-10-02 19:50:54 boostDuration 25
2021-10-02 19:50:54 boostValveposition 80
2021-10-02 19:50:54 comfortTemperature 21.0
2021-10-02 20:00:49 desiredTemperature 21.0
2021-10-02 19:59:06 deviation 8.6
2021-10-02 19:57:28 displayActualTemperature 1
2021-10-02 19:50:54 ecoTemperature 17.0
2021-10-02 19:50:31 error invalid or missing value for READING .weekProfile
2021-10-02 19:50:54 firmware 1.0
2021-10-02 19:57:28 gateway 1
2021-10-02 19:50:31 groupid 0
2021-10-02 19:50:56 lastTimeSync 2021-10-02 19:50:56
2021-10-02 19:50:54 lastcmd WallThermostatConfig
2021-10-02 19:50:54 maximumTemperature on
2021-10-02 19:50:54 measurementOffset 0.0
2021-10-02 19:50:54 minimumTemperature off
2021-10-02 20:00:49 mode auto
2021-10-02 19:50:56 msgcnt 2
2021-10-02 19:57:28 panel unlocked
2021-10-02 19:59:06 peerIDs 000000
2021-10-02 19:59:06 peerList Broadcast
2021-10-02 19:57:28 rferror 0
2021-10-02 20:00:49 state 21.0
2021-10-02 19:59:06 temperature 26.6
2021-10-02 19:50:54 testresult 255
2021-10-02 19:50:54 weekprofile-0-Sat-temp 17.0 °C / 21.0 °C / 17.0 °C
2021-10-02 19:50:54 weekprofile-0-Sat-time 00:00-06:00 / 06:00-22:00 / 22:00-24:00
2021-10-02 19:50:54 weekprofile-1-Sun-temp 17.0 °C / 21.0 °C / 17.0 °C
2021-10-02 19:50:54 weekprofile-1-Sun-time 00:00-06:00 / 06:00-22:00 / 22:00-24:00
2021-10-02 19:50:54 weekprofile-2-Mon-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-02 19:50:54 weekprofile-2-Mon-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-02 19:50:54 weekprofile-3-Tue-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-02 19:50:54 weekprofile-3-Tue-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-02 19:50:54 weekprofile-4-Wed-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-02 19:50:54 weekprofile-4-Wed-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-02 19:50:54 weekprofile-5-Thu-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-02 19:50:54 weekprofile-5-Thu-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-02 19:50:54 weekprofile-6-Fri-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-02 19:50:54 weekprofile-6-Fri-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-02 19:50:54 windowOpenTemperature 12.0
Attributes:
IODev cm
model WallMountedThermostat
room FHEMSync,MAX
stateFormat temperature
Master:
Internals:
CFGFN
DEF MAX MAX_1c0792
FUUID 6159f4f4-f33f-9a7c-1e9b-bf6cade900647cdd
NAME MAX_1c0792
NR 6225
NTFY_ORDER 50-MAX_1c0792
REMOTENAME MAX_1c0792
REMOTETYPE MAX
STATE 26.6
SVN 23517
TYPE FHEMSYNC_DEVICE
TimeSlot 0
addr 1c0792
devtype 3
type WallMountedThermostat
webCmd desiredTemperature
READINGS:
2021-10-03 20:22:45 IODev cm
2021-10-03 20:22:45 PairedTo 000000
2021-10-03 20:22:45 RSSI -49.5
2021-10-03 20:22:45 SerialNr QEQ1648061
2021-10-03 20:22:45 battery ok
2021-10-03 20:22:45 batteryState ok
2021-10-03 20:22:45 boostDuration 25
2021-10-03 20:22:45 boostValveposition 80
2021-10-03 20:22:45 comfortTemperature 21.0
2021-10-03 20:22:45 desiredTemperature 21.0
2021-10-03 20:22:45 deviation 8.6
2021-10-03 20:22:45 displayActualTemperature 1
2021-10-03 20:22:45 ecoTemperature 17.0
2021-10-03 20:22:45 error invalid or missing value for READING .weekProfile
2021-10-03 20:22:45 firmware 1.0
2021-10-03 20:22:45 gateway 1
2021-10-03 20:22:45 groupid 0
2021-10-03 20:22:45 lastTimeSync 2021-10-02 19:50:56
2021-10-03 20:22:45 lastcmd WallThermostatConfig
2021-10-03 20:22:45 maximumTemperature on
2021-10-03 20:22:45 measurementOffset 0.0
2021-10-03 20:22:45 minimumTemperature off
2021-10-03 20:22:45 mode auto
2021-10-03 20:22:45 msgcnt 2
2021-10-03 20:22:45 panel unlocked
2021-10-03 20:22:45 peerIDs 000000
2021-10-03 20:22:45 peerList Broadcast
2021-10-03 20:22:45 rferror 0
2021-10-03 20:22:45 state 21.0
2021-10-03 20:22:45 temperature 26.6
2021-10-03 20:22:45 testresult 255
2021-10-03 20:22:45 weekprofile-0-Sat-temp 17.0 °C / 21.0 °C / 17.0 °C
2021-10-03 20:22:45 weekprofile-0-Sat-time 00:00-06:00 / 06:00-22:00 / 22:00-24:00
2021-10-03 20:22:45 weekprofile-1-Sun-temp 17.0 °C / 21.0 °C / 17.0 °C
2021-10-03 20:22:45 weekprofile-1-Sun-time 00:00-06:00 / 06:00-22:00 / 22:00-24:00
2021-10-03 20:22:45 weekprofile-2-Mon-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-03 20:22:45 weekprofile-2-Mon-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-03 20:22:45 weekprofile-3-Tue-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-03 20:22:45 weekprofile-3-Tue-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-03 20:22:45 weekprofile-4-Wed-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-03 20:22:45 weekprofile-4-Wed-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-03 20:22:45 weekprofile-5-Thu-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-03 20:22:45 weekprofile-5-Thu-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-03 20:22:45 weekprofile-6-Fri-temp 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
2021-10-03 20:22:45 weekprofile-6-Fri-time 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
2021-10-03 20:22:45 windowOpenTemperature 12.0
helper:
getlist show_savedConfig: 
setlist deviceRename wakeUp:noArg factoryReset:noArg groupid associate:fakeShutterContact deassociate:fakeShutterContact desiredTemperature:eco,comfort,boost,auto,off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on comfortTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on ecoTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on measurementOffset:-3.5,-3.0,-2.5,-2.0,-1.5,-1.0,-0.5,0.0,0.5,1.0,1.5,2.0,2.5,3.0,3.5 boostDuration:0,10,15,20,25,30,5,60 boostValveposition maximumTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on minimumTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on windowOpenTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on saveConfig weekProfile restoreReadings: restoreDevice: displayActualTemperature:0,1 
attrTemplate:?,MAX_WallMountedThermostat_dark
json:
Name MAX_1c0792
PossibleAttrs alias comment:textField-long eventMap:textField-long group room suppressReading userattr userReadings:textField-long verbose:0,1,2,3,4,5 IODev CULdev actCycle do_not_notify:1,0 ignore:0,1 dummy:0,1 keepAuto:0,1 debug:0,1 scanTemp:0,1 skipDouble:0,1 externalSensor model:HeatingThermostat,HeatingThermostatPlus,WallMountedThermostat,ShutterContact,PushButton,Cube,PlugAdapter autosaveConfig:1,0 peers sendMode:peers,group,Broadcast dTempCheck:0,1 windowOpenCheck:0,1 DbLog_log_onoff:0,1 event-aggregator event-min-interval event-on-change-reading event-on-update-reading oldreadings stateFormat:textField-long timestamp-on-change-reading cmdIcon devStateIcon:textField-long devStateStyle icon sortby webCmd webCmdLabel:textField-long widgetOverride
PossibleSets deviceRename wakeUp:noArg factoryReset:noArg groupid associate:fakeShutterContact deassociate:fakeShutterContact desiredTemperature:eco,comfort,boost,auto,off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on comfortTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on ecoTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on measurementOffset:-3.5,-3.0,-2.5,-2.0,-1.5,-1.0,-0.5,0.0,0.5,1.0,1.5,2.0,2.5,3.0,3.5 boostDuration:0,10,15,20,25,30,5,60 boostValveposition maximumTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on minimumTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on windowOpenTemperature:off,5.0,5.5,6.0,6.5,7.0,7.5,8.0,8.5,9.0,9.5,10.0,10.5,11.0,11.5,12.0,12.5,13.0,13.5,14.0,14.5,15.0,15.5,16.0,16.5,17.0,17.5,18.0,18.5,19.0,19.5,20.0,20.5,21.0,21.5,22.0,22.5,23.0,23.5,24.0,24.5,25.0,25.5,26.0,26.5,27.0,27.5,28.0,28.5,29.0,29.5,30.0,on saveConfig weekProfile restoreReadings: restoreDevice: displayActualTemperature:0,1 
attrTemplate:?,MAX_WallMountedThermostat_dark
Attributes:
IODev cm
model WallMountedThermostat
room FHEMSync,MAX
stateFormat temperature
Internals:
DEF WallMountedThermostat 1c0792
FUUID 61589be7-f33f-ac20-9803-1962811ec9cecb04
IODev cm
NAME MAX_1c0792
NR 24
NTFY_ORDER 50-MAX_1c0792
STATE 26.6
SVN 23517
TYPE MAX
TimeSlot 0
addr 1c0792
devtype 3
type WallMountedThermostat
webCmd desiredTemperature
Readings:
IODev
PairedTo:
Time 2021-10-02 19:50:54
Value 000000
RSSI:
Time 2021-10-02 20:00:49
Value -49.5
SerialNr:
Time 2021-10-02 19:50:54
Value QEQ1648061
battery:
Time 2021-10-02 19:57:28
Value ok
batteryState:
Time 2021-10-02 19:57:28
Value ok
boostDuration:
Time 2021-10-02 19:50:54
Value 25
boostValveposition:
Time 2021-10-02 19:50:54
Value 80
comfortTemperature:
Time 2021-10-02 19:50:54
Value 21.0
desiredTemperature:
Time 2021-10-02 20:00:49
Value 21.0
deviation:
Time 2021-10-02 19:59:06
Value 8.6
displayActualTemperature:
Time 2021-10-02 19:57:28
Value 1
ecoTemperature:
Time 2021-10-02 19:50:54
Value 17.0
error:
Time 2021-10-02 19:50:31
Value invalid or missing value for READING .weekProfile
firmware:
Time 2021-10-02 19:50:54
Value 1.0
gateway:
Time 2021-10-02 19:57:28
Value 1
groupid:
Time 2021-10-02 19:50:31
Value 0
lastTimeSync:
Time 2021-10-02 19:50:56
Value 2021-10-02 19:50:56
lastcmd:
Time 2021-10-02 19:50:54
Value WallThermostatConfig
maximumTemperature:
Time 2021-10-02 19:50:54
Value on
measurementOffset:
Time 2021-10-02 19:50:54
Value 0.0
minimumTemperature:
Time 2021-10-02 19:50:54
Value off
mode:
Time 2021-10-02 20:00:49
Value auto
msgcnt:
Time 2021-10-02 19:50:56
Value 2
panel:
Time 2021-10-02 19:57:28
Value unlocked
peerIDs:
Time 2021-10-02 19:59:06
Value 000000
peerList:
Time 2021-10-02 19:59:06
Value Broadcast
rferror:
Time 2021-10-02 19:57:28
Value 0
state:
Time 2021-10-02 20:00:49
Value 21.0
temperature:
Time 2021-10-02 19:59:06
Value 26.6
testresult:
Time 2021-10-02 19:50:54
Value 255
weekprofile-0-Sat-temp:
Time 2021-10-02 19:50:54
Value 17.0 °C / 21.0 °C / 17.0 °C
weekprofile-0-Sat-time:
Time 2021-10-02 19:50:54
Value 00:00-06:00 / 06:00-22:00 / 22:00-24:00
weekprofile-1-Sun-temp:
Time 2021-10-02 19:50:54
Value 17.0 °C / 21.0 °C / 17.0 °C
weekprofile-1-Sun-time:
Time 2021-10-02 19:50:54
Value 00:00-06:00 / 06:00-22:00 / 22:00-24:00
weekprofile-2-Mon-temp:
Time 2021-10-02 19:50:54
Value 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
weekprofile-2-Mon-time:
Time 2021-10-02 19:50:54
Value 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
weekprofile-3-Tue-temp:
Time 2021-10-02 19:50:54
Value 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
weekprofile-3-Tue-time:
Time 2021-10-02 19:50:54
Value 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
weekprofile-4-Wed-temp:
Time 2021-10-02 19:50:54
Value 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
weekprofile-4-Wed-time:
Time 2021-10-02 19:50:54
Value 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
weekprofile-5-Thu-temp:
Time 2021-10-02 19:50:54
Value 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
weekprofile-5-Thu-time:
Time 2021-10-02 19:50:54
Value 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
weekprofile-6-Fri-temp:
Time 2021-10-02 19:50:54
Value 17.0 °C / 21.0 °C / 17.0 °C / 21.0 °C / 17.0 °C
weekprofile-6-Fri-time:
Time 2021-10-02 19:50:54
Value 00:00-06:00 / 06:00-09:00 / 09:00-17:00 / 17:00-23:00 / 23:00-24:00
windowOpenTemperature:
Time 2021-10-02 19:50:54
Value 12.0
Attributes:
model WallMountedThermostat
room FHEMSync
stateFormat temperature
userattr model stateFormat
I also found this in the log:
2021-10-05 12:49:06 FHEMSYNC_DEVICE MAX_1c0792 x_getlist show_savedConfig:
2021-10-05 12:49:06 FHEMSYNC_DEVICE MAX_1c0792 COMMANDSET,?
Quote from: GreenFHEMfan on 27 September 2020, 19:08:15
Can anyone tell me what this means?
Is something going wrong on the Node side?
Hi,
I am also seeing these errors in the FHEMsync log.
(node:25856) UnhandledPromiseRejectionWarning: RequestError: Error: read ECONNRESET
at new RequestError (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/errors.js:14:15)
at Request.plumbing.callback (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:87:29)
at Request.RP$callback [as _callback] (/usr/local/lib/node_modules/fhemsync/node_modules/request-promise-core/lib/plumbing.js:46:31)
at self.callback (/usr/local/lib/node_modules/fhemsync/node_modules/request/request.js:185:22)
at Request.emit (events.js:314:20)
at Request.onRequestError (/usr/local/lib/node_modules/fhemsync/node_modules/request/request.js:877:8)
at ClientRequest.emit (events.js:314:20)
at Socket.socketErrorListener (_http_client.js:427:9)
at Socket.emit (events.js:314:20)
at emitErrorNT (internal/streams/destroy.js:92:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:60:3)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
(node:25856) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:25856) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
Does anyone know yet what these mean?
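The `read ECONNRESET` means the HTTP connection to the FHEMWEB instance was reset by the other side; because the request promise inside fhemsync is apparently never given a `.catch()`, Node then reports it as an unhandled rejection. A minimal sketch (hypothetical helper, not taken from the fhemsync sources) of how such a request could be wrapped so a reset connection is logged and retried instead of producing the warning:

```javascript
// Hypothetical wrapper, not part of fhemsync: run a promise-returning
// request and swallow transient network errors such as ECONNRESET, so the
// sync loop can simply retry on its next cycle instead of leaving an
// unhandled rejection behind.
async function safeRequest(doRequest) {
  try {
    return await doRequest();
  } catch (err) {
    // Log the failure instead of letting the rejection escape unhandled.
    console.error('fhemsync request failed, will retry:', err.message);
    return null;
  }
}

// Demo with a request that fails the same way the log shows:
safeRequest(() => Promise.reject(new Error('read ECONNRESET')))
  .then((result) => console.log('result:', result)); // result: null
```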
Quote from: punker on 26 Oktober 2021, 10:56:29
[same ECONNRESET / UnhandledPromiseRejectionWarning log excerpt as quoted above]
Has anyone been able to locate or fix the cause of these log entries in the meantime?
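Until the module itself handles the rejection, a process-level guard can at least turn the UnhandledPromiseRejectionWarning into an explicit log line and keep the fhemsync process from being terminated on newer Node versions. A sketch, assuming the fhemsync daemon is a plain Node process this snippet could be added to:

```javascript
// Hypothetical guard, not part of fhemsync: a global handler receives every
// promise rejection that nobody attached a .catch() to, so the process logs
// the error once instead of printing the deprecation warning.
process.on('unhandledRejection', (reason) => {
  const msg = reason instanceof Error ? reason.message : String(reason);
  console.error('fhemsync: unhandled rejection:', msg);
});
```

Note this only masks the symptom; the real fix is catching the rejection at the request site.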