Advanced Analytics: TP Analytics Explore or data export?

I want to dig deeper into the measurements and data collected from devices. Similar to what is described in the thread “Temperatur med releutslag i samme graf”, I want to do custom analysis on topics like temperature and energy consumption.

Ideally I want to export raw data from FH to external cloud storage and analytics, preferably BigQuery. However, I have not yet found the bandwidth to craft an event collector/exporter similar to what ecollector does for InfluxDB.

The best solution would be something that runs fully on the Hub, rather than setting up an MQTT client on a separate device.

Thingsplex > Analytics > Explore (or any way of querying the InfluxDB on the Hub) could have done the trick, but I have not been able to make that work.

Has anyone solved something similar and has a tip or two to share?

You can set up your own instance of InfluxDB (on your local network or in the cloud) and configure ecollector to stream data to both the internal and the external InfluxDB instance. This can be configured in Thingsplex. Would it be a solution if you could stream all MQTT events using the following pipeline: Hub -> Google IoT Core (over MQTT) -> Pub/Sub -> BigQuery?

Analytics -> Explore can work, but it might fall short for advanced analysis.

Thanks A,

Configuring an additional InfluxDB in the cloud is a solution I’ve considered. The route via IoT Core does, however, sound more interesting given that the (ideal) end destination is BQ.

I’m not very experienced with MQTT or IoT Core. Have you tested that solution in any shape or form?

The only downside I can see would be loss of data if the connection is down. Since there isn’t really a need for very fresh data (i.e. streaming), you could argue that a solution where data is filtered and stored to a file on the hub for batch inserts into any external DB or tool would be preferable.

Perhaps configuring a Pi to listen on MQTT locally and push to the cloud in batches would be the easiest solution.
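Something like the sketch below is what I have in mind. It is only a rough outline, assuming a Pi with Python and paho-mqtt installed; the broker address, port, credentials and the flush destination are placeholders that would have to match your own setup.

# Sketch: buffer FH events on a Pi and flush them in batches.
# Assumes paho-mqtt; broker/port/credentials and the flush target are placeholders.
import json
import time
import paho.mqtt.client as mqtt

HUB_BROKER = "192.168.1.10"        # placeholder: hub / local broker address
BUFFER_FILE = "/var/tmp/fh_events.ndjson"
FLUSH_INTERVAL_S = 15 * 60         # push a batch every 15 minutes

def on_message(client, userdata, msg):
    # Append one JSON line per event; filtering on msg.topic can be added here.
    record = {"ts": time.time(), "topic": msg.topic, "payload": msg.payload.decode()}
    with open(BUFFER_FILE, "a") as f:
        f.write(json.dumps(record) + "\n")

def flush_batch():
    # Hypothetical step: bulk-insert the buffered lines into the external DB/tool
    # of choice (BigQuery, InfluxDB, ...) and truncate the file afterwards.
    pass

client = mqtt.Client()
# client.username_pw_set("user", "pass")   # local-API credentials if required
client.on_message = on_message
client.connect(HUB_BROKER, 1884)           # port per the hub's local MQTT settings
client.subscribe("pt:j1/mt:evt/#")         # all FIMP event messages
client.loop_start()

while True:
    time.sleep(FLUSH_INTERVAL_S)
    flush_batch()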

That said, I’m more than happy to test whatever you have if you’ve done something similar.

Thanks again!

There is an increasing number of users and companies that need to export data for more comprehensive analysis. The high-level requirements are very similar, but the details differ. In some cases users are ready to invest in integration work, but in many cases they just want the data in their analytics DB / data lake / spreadsheet.

I’m not very experienced with MQTT or IoT Core. Have you tested that solution in any shape or form?

Only theoretical knowledge. The biggest problem is their specific topic scheme, which means that FH topics must be rewritten into a form that Google accepts. But that should be doable.
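Roughly, the rewriting could look like the sketch below. It is only an illustration, assuming the simplest approach of publishing everything to the device’s default IoT Core telemetry topic and carrying the original FH topic inside the payload; the device id is made up.

# Sketch: wrap a FIMP message so it fits IoT Core's fixed telemetry topic.
# IoT Core expects devices to publish to /devices/{device_id}/events, so the
# original FH topic is moved into the payload envelope instead.
import json

DEVICE_ID = "futurehome-hub-01"   # made-up IoT Core device id

def to_iot_core(fh_topic: str, fh_payload: bytes):
    iot_topic = f"/devices/{DEVICE_ID}/events"
    envelope = {
        "fh_topic": fh_topic,                     # e.g. pt:j1/mt:evt/rt:dev/...
        "fimp": json.loads(fh_payload.decode()),  # original FIMP message
    }
    return iot_topic, json.dumps(envelope).encode()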

I will think about it. Most likely there will be an app in Playgrounds that can be used for that type of data export (batched and streaming).

I’ve tried to configure ecollector to stream data to an InfluxDB at http://aiven.io/. There are no errors displayed in TP and it seems to be running. However, neither data nor configuration is sent to the desired DB. Any tips or tricks on how to debug what’s wrong? I.e., where can I retrieve ecollector logs?

Finally, after a few attempts, I created a new config and got it to work. For now, using an external InfluxDB and tools like Grafana at least provides some more options for analysis than the built-in explorer.

However, after running for some hours the number of events drastically declines, followed by a complete stop, and finally the hub crashes. I have no idea what is causing it as there are no logs available. Could there be a memory leak somewhere?

E.g.

SELECT count("value") FROM "gen_raw"."sensor_temp.evt.sensor.report" WHERE $timeFilter GROUP BY time(1h) fill(null)

Gives:

Time                sensor_temp.evt.sensor.report.count
2021-01-27 02:00:00    85
2021-01-27 03:00:00   151
2021-01-27 04:00:00   179
2021-01-27 05:00:00   166
2021-01-27 06:00:00   162
2021-01-27 07:00:00    92
2021-01-27 08:00:00    90
2021-01-27 09:00:00     9
2021-01-27 10:00:00     4
2021-01-27 11:00:00     5
2021-01-27 12:00:00     0
2021-01-27 13:00:00     0
2021-01-27 14:00:00    70    <- hub reboot after crash

Similar issue on the local InfluxDB:

{
  "serv": "ecollector",
  "type": "cmd.tsdb.get_data_points",
  "val_t": "object",
  "val": {
    "proc_id": 1,
    "field_name": "value",
    "measurement_name": "sensor_temp.evt.sensor.report",
    "relative_time": "6h",
    "from_time": "",
    "to_time": "",
    "group_by_time": "1h",
    "fill_type": "null",
    "data_function": "count"
  },
...
}

Returns:

{
  "type": "evt.tsdb.data_points_report",
  "serv": "ecollector",
  "val_t": "object",
  "val": {
    "Results": [
      {
        "Series": [
          {
            "name": "sensor_temp.evt.sensor.report",
            "columns": [
              "time",
              "value"
            ],
            "values": [
              [
                1611730800,
                10
              ],
              [
                1611734400,
                15
              ],
              [
                1611738000,
                2
              ],
              [
                1611741600,
                1
              ],
              [
                1611745200,
                0
              ],
              [
                1611748800,
                0
              ],
              [
                1611752400,
                93
              ]
            ]
          }
        ],
        "Messages": null
      }
    ]
  },
...
}
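
For reference, a rough sketch of how such a request can be published over local MQTT with Python/paho-mqtt. The ecollector topic and the extra envelope fields are assumptions based on the usual FIMP conventions, so verify them against what Thingsplex actually sends.

# Sketch: publish the cmd.tsdb.get_data_points request over local MQTT.
# The topic and the extra envelope fields are assumptions; compare with Thingsplex.
import json
import uuid
import paho.mqtt.publish as publish

ECOLLECTOR_CMD_TOPIC = "pt:j1/mt:cmd/rt:app/rn:ecollector/ad:1"   # assumed topic

request = {
    "serv": "ecollector",
    "type": "cmd.tsdb.get_data_points",
    "val_t": "object",
    "val": {
        "proc_id": 1,
        "field_name": "value",
        "measurement_name": "sensor_temp.evt.sensor.report",
        "relative_time": "6h",
        "from_time": "",
        "to_time": "",
        "group_by_time": "1h",
        "fill_type": "null",
        "data_function": "count",
    },
    "uid": str(uuid.uuid4()),   # assumed envelope field
    "ver": "1",                 # assumed envelope field
}

publish.single(ECOLLECTOR_CMD_TOPIC, json.dumps(request),
               hostname="192.168.1.10", port=1884)   # placeholder hub address/port

The evt.tsdb.data_points_report reply should then arrive on the corresponding event topic.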

@alivinco Any ideas on how to monitor or debug this behaviour?

Interesting. The new version of ecollector is going to be released next week, and it contains many bug fixes and new features. @jonathse if you want to test it earlier, please PM me your email and I’ll explain how to test the new version before the official release.


What is the status here @jonathse? Can I ask if you were able to set up a good UI that gets its data from ecollector?

Any dashboard or screenshots to share?


Hi,
I’m a bit new to this, so I’m hoping I can draw on some experience from the good people in this thread.

I’m trying to get sensor data to an external InfluxDB in the cloud. Does anyone have any examples of how this has been achieved? Do I need anything running locally on a server in my network, or should I be able to use a host URL that takes the data all the way to my DB in the cloud?

All you need is an InfluxDB (v1.x or 2.x) instance, either in the cloud or on your local network. Once you have it, you can configure the ecollector service in the Thingsplex UI to forward all your sensor data to your DB. I will share screenshots shortly.


Hi,

I’m trying to do the same: I have InfluxDB v2.6.1 running on a Raspberry Pi 4 and am trying to get Futurehome to send events into it via ecollector. However, the ecollector settings in Thingsplex are overwritten after each reboot of the Futurehome hub. No data is arriving in the Influx instance. Would anyone here have examples to share? Are you using Telegraf on the Influx side, or are you pointing directly to the main Influx listener API?

Not answering your question on ecollector, but I gave up on it. Instead I have a tiny application on my Pi that subscribes to the main MQTT topic and forwards all messages to BigQuery via Pub/Sub. It has worked well for me.
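
Not the exact code, but the gist of it is roughly the sketch below, assuming paho-mqtt and google-cloud-pubsub are installed and the Pub/Sub topic already exists; the project, topic and broker details are placeholders. From Pub/Sub the messages are then loaded into BigQuery.

# Sketch: forward every FH MQTT event to Google Cloud Pub/Sub.
# Project, topic and broker details are placeholders for illustration.
import paho.mqtt.client as mqtt
from google.cloud import pubsub_v1

PROJECT_ID = "my-gcp-project"          # placeholder
PUBSUB_TOPIC = "fh-events"             # placeholder
HUB_BROKER = "192.168.1.10"            # placeholder: hub / local broker address

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, PUBSUB_TOPIC)

def on_message(client, userdata, msg):
    # Publish the raw FIMP payload; keep the original topic as a message attribute.
    publisher.publish(topic_path, data=msg.payload, fh_topic=msg.topic)

client = mqtt.Client()
# client.username_pw_set("user", "pass")   # local-API credentials if required
client.on_message = on_message
client.connect(HUB_BROKER, 1884)           # port per the hub's local MQTT settings
client.subscribe("pt:j1/mt:evt/#")         # all event messages
client.loop_forever()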