And now for the Google Nest getting telemetry from an Azure IoT Hub in response to a vocal request, and enunciating the result.

In previous posts it was demonstrated that Google Home could bridge to a Raspberry Pi and have sensor data spoken on a Google Nest: “Hey Google, run sensor on Pi”. The RPi reads the sensor, creates a string to enunciate, and forwards that to the desktop running TRIGGERcmd, which then casts the string to the Nest. How about getting the data from an Azure IoT Hub instead, with the RPi sending its telemetry to the hub? That would be quite scalable.

The envisaged architecture is:

  • One or more RPis periodically forwarding telemetry data to an Azure IoT Hub.
  • A Console app running on a desktop that reads the next Device-to-Cloud (D2C) message received by the IoT Hub, forms a speakable string from it, and forwards that to the Google Nest via Curl.
  • A TRIGGERcmd command on the desktop that activates the Console app.
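The second step above, turning a D2C telemetry message into a string the Nest can speak, can be sketched roughly as below. This is a minimal Python sketch for illustration only: the real apps here are .NET Console apps, and the payload field names ("temperature", "humidity") are assumptions, not taken from the actual BME280/DHT22 message schema.

```python
import json


def telemetry_to_speech(payload: str) -> str:
    """Turn a D2C telemetry JSON payload into a string a Nest can enunciate.

    The field names ("temperature", "humidity") are assumed for
    illustration; the real sensor apps may use a different schema.
    """
    data = json.loads(payload)
    parts = []
    if "temperature" in data:
        parts.append(f"The temperature is {data['temperature']:.1f} degrees")
    if "humidity" in data:
        parts.append(f"the humidity is {data['humidity']:.0f} percent")
    return ", and ".join(parts) + "." if parts else "No telemetry received."
```

For example, `telemetry_to_speech('{"temperature": 21.5, "humidity": 48}')` returns `"The temperature is 21.5 degrees, and the humidity is 48 percent."` The D2C Console app writes a string like this to a temp file, which the desktop side then casts to the Nest.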

This is a work in progress. 2Do

  • D2C App
    • [Done] Console app to get next uploaded telemetry message, extract and write to temp file as readable text
    • Runs as a single pass and exits: skips all previous messages from the last 24 hours and gets the latest (or waits up to 10 s for one).
  • Device Simulator
    • [Done] Console app to send simulated telemetry to IoT Hub. Set to 10s period.
  • djaus2/DNETCoreGPIO option for device using sensors
    • As per the simulator, but reading real BME280 and DHT22 sensors on the RPi.
  • Integration with TRIGGERcmd
    • Call D2C app
    • Read text and send to Nest
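The TRIGGERcmd integration above would be an entry in the desktop's commands.json looking roughly like the fragment below. The trigger phrase, batch-file path, and field values are placeholders and assumptions (check the TRIGGERcmd documentation for the exact field set); the batch file would run the D2C Console app and then cast the resulting text to the Nest.

```json
[
  {
    "trigger": "IoT Telemetry",
    "command": "C:\\apps\\ReadD2CAndSpeak.bat",
    "ground": "true",
    "voice": "telemetry"
  }
]
```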
