Powerful enough CPUs? (Score:3)

If the CPU in an IoT device is powerful enough to make offloading actually worthwhile, isn't that CPU overkill for the device's primary function? I can't imagine many companies putting more powerful (that is, more expensive) chips into a device than the device itself needs.

Re:Powerful enough CPUs? (Score:3)

Exactly; if you have spare cycles and you're IoT, you did it wrong.

Plus, IoT is usually of a more real-time nature. Who wants to risk timing skew or dropped events because some joker wanted to "use" my underpowered IoT device for his alien space searches?

Hosts are way overpowered today, but tiny devices? No, they are not usually overpowered at all, and they are NOT your hosting platform. They are meant to do one thing, not work a night shift just because you college boys don't really understand what IoT is actually about.

If a modern, more familiar CPU has a low enough cost (Score:2)

if you have spare cycles and are iot, you did it wrong

I'm not so sure. What if an ultra-low-power ARM costs about the same as an 8051? One might save on the software development and maintenance side by using a more modern and familiar CPU.

Look at desktop/laptop CPUs: they are grossly overpowered for many users. Why wouldn't IoT devices follow a similar pattern if the costs are right?

"We gave you an atomic bomb, what do you want, mermaids?"
-- I. I. Rabi to the Atomic Energy Commission
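The timing-skew worry raised above can be sketched with a toy simulation. This is purely illustrative, not anything from the thread: the function name and all the numbers (a 1 ms sampling period, a 500 µs sensor job, a 600 µs "donated" background job) are hypothetical, and real RTOS scheduling is far more involved.

```python
# Toy model (hypothetical numbers): a device samples a sensor once per
# period; a background job "borrows" spare cycles between samples.
# Because the background job runs back-to-back with the real work, the
# loop drifts and later samples finish after their deadlines.

def missed_deadlines(period_us, sample_cost_us, background_cost_us, ticks):
    """Count samples that finish after their per-period deadline."""
    now = 0
    missed = 0
    for i in range(ticks):
        deadline = (i + 1) * period_us
        start = max(now, i * period_us)   # wait for the period to begin
        now = start + sample_cost_us      # the device's actual job
        if now > deadline:
            missed += 1
        now += background_cost_us         # "donated" spare cycles run here
    return missed

print(missed_deadlines(1000, 500, 0, 100))    # no background job -> 0
print(missed_deadlines(1000, 500, 600, 100))  # drift accumulates -> 94
```

With no background work the loop meets every deadline; the borrowed cycles make each iteration take longer than the period, so the start times drift and, after a few iterations, every sample lands late.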