Re: [hono-dev] Agenda for face-to-face meeting

On 01/04/16 09:51, Hudalla Kai (INST/ESY) wrote:
-----Original Message-----
From: hono-dev-bounces@xxxxxxxxxxx [mailto:hono-dev-bounces@xxxxxxxxxxx] On
Behalf Of Gordon Sim
Sent: Thursday, 31 March 2016 19:29
To: hono-dev@xxxxxxxxxxx
Subject: Re: [hono-dev] Agenda for face-to-face meeting
[...]
* The content-type is described as 'MUST be set to application/octet-stream if
content is to be considered opaque binary data'. How would the receiver know
how to decode that? Would they infer it from the device-id? Or would that be
'application specific'? Is the intent of that description to discourage more explicit
content-types?

[KH] I guess you are right. My proposal would be to only have binary payload in a Data section and to use the content-type and content-encoding properties to describe the structure and meaning of the data.

Makes sense to me.
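To illustrate the idea (a sketch only, not Hono or proton code; the dict-based message shape is an assumption standing in for AMQP message properties): the body stays opaque bytes, and the content-type/content-encoding properties tell the consumer how to decode it.

```python
import gzip
import json

def make_telemetry_message(payload: dict) -> dict:
    """Binary payload in the body (as in an AMQP Data section), described
    by the content-type and content-encoding properties."""
    raw = json.dumps(payload).encode("utf-8")
    return {
        "content-type": "application/json",
        "content-encoding": "gzip",
        "body": gzip.compress(raw),  # opaque bytes on the wire
    }

def read_telemetry_message(msg: dict) -> dict:
    """A consumer decodes using the properties, not guesswork."""
    body = msg["body"]
    if msg.get("content-encoding") == "gzip":
        body = gzip.decompress(body)
    if msg.get("content-type") == "application/json":
        return json.loads(body.decode("utf-8"))
    raise ValueError("unknown content-type: %s" % msg.get("content-type"))
```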

* On a related point, I would imagine that some solutions would benefit from not
having to map/convert between different data formats themselves, but from seeing
data from different devices in some uniform format/model. Do you see some form of
canonicalisation and/or conversion of data as being in or out of scope?

[KH] I totally see the benefit of that but would not consider Hono to be in charge of doing the normalization. We are currently discussing this internally at Bosch, and we think we should try to use Vorto for defining data semantics and normalize data received from devices in the Protocol Adapters. The particular type of data would then be indicated by the content-type. Any back-end consumer can then use this information to read (and understand) the payload data.
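A sketch of what such normalization in a protocol adapter could look like (purely illustrative; the binary wire format and function name are invented here, and this is not Vorto or Hono code): the adapter parses the device-specific frame and emits a normalized payload, advertising the result via content-type.

```python
import json
import struct

# Hypothetical wire format, for illustration only: a big-endian
# uint16 device id followed by an int16 temperature in centi-degrees.
def normalize_reading(frame: bytes) -> dict:
    """Parse a device-specific binary frame and emit a normalized
    JSON payload, indicating the type via the content-type property."""
    device_id, centi_deg = struct.unpack(">Hh", frame)
    body = json.dumps({"device": device_id, "temperature": centi_deg / 100.0})
    return {"content-type": "application/json", "body": body.encode("utf-8")}
```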

I'm not pushing for a change in scope, but I'd be interested to understand why you feel this would not fit in Hono.

To be clear, I wouldn't advocate that solutions only see normalized data, but allowing a solution to, e.g., subscribe to telemetry/json and get all the relevant events in JSON format, regardless of how they were published, seems useful.

In the same way that Hono abstracts the details of the protocol used to submit events (which, hypothetically at least, might imply a particular data encoding), it could abstract the details of the encoding of the events themselves.
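One way to picture that abstraction (again just a sketch under invented names, not a proposal for Hono's API): a registry of converters keyed by the content-type an event was published with, so a telemetry/json subscriber never has to know how the data arrived.

```python
import csv
import io
import json

# Hypothetical converters from a published content-type to a Python
# value; the set of supported types here is an assumption.
CONVERTERS = {
    "application/json": lambda body: json.loads(body.decode("utf-8")),
    "text/csv": lambda body: list(csv.DictReader(io.StringIO(body.decode("utf-8")))),
}

def deliver_as_json(content_type: str, body: bytes) -> str:
    """Convert an event to canonical JSON before delivering it to a
    subscriber that asked for JSON, whatever the publish format was."""
    if content_type not in CONVERTERS:
        raise ValueError("no converter for " + content_type)
    return json.dumps(CONVERTERS[content_type](body))
```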

Again, though, I'm not trying to change the scope, just to understand and rationalise it in my own mind a bit better.

