Aura: Harnessing the Power of IoT Devices For Distributed Computing
An anonymous reader points out that a computer science research team from the University of Alabama has put together a new architecture called "Aura," which lets people make use of excess computing power from the various smart devices scattered throughout their homes. Ragib Hasan, the team's leader, says this scheme could be integrated with smartphones, letting you offload CPU-intensive tasks to your home devices. He also anticipates the ability to sell off excess capacity, much as people with solar panels can sometimes sell the excess energy they harvest. Alternatively, the spare cycles could be donated to a distributed computing project of the homeowner's choice, like SETI@home. Of course, several obstacles need to be overcome before a system like Aura can be used: smart devices run on a variety of operating systems and often communicate only through a narrow set of protocols, and any unifying effort would also need careful thought about security and privacy.
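To make the idea concrete, here is a minimal sketch of what "offloading a task to a home device" could look like from the phone's side. The device list, the /submit_task endpoint, and the JSON payload are all invented for illustration; nothing here reflects Aura's actual design.

```python
# Hypothetical sketch of phone-to-home-device task offloading.
# Addresses, endpoint name, and payload format are invented for illustration.
import json
import urllib.request

HOME_DEVICES = [
    "http://192.168.1.21:8080",  # e.g. a smart thermostat hub
    "http://192.168.1.35:8080",  # e.g. a set-top box
]

def offload(task_name, payload):
    """Try each registered home device; fall back to running locally."""
    body = json.dumps({"task": task_name, "args": payload}).encode()
    for device in HOME_DEVICES:
        try:
            req = urllib.request.Request(
                device + "/submit_task", data=body,
                headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req, timeout=2) as resp:
                return json.load(resp)
        except OSError:
            continue  # device offline or too busy; try the next one
    return run_locally(task_name, payload)

def run_locally(task_name, payload):
    # Placeholder for the phone doing the work itself.
    return {"task": task_name, "result": None, "ran": "locally"}
```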
IoT != compute (Score:5, Insightful)
this is stupid. no. just no. ok?
iot is all about low-power, dedicated devices, and it is NOT YOUR HOSTING PLATFORM for running your bullshit on.
iot has enough trouble with weak or non-existent security and the devices are just not meant to accept 'workloads' from you.
someone has been smoking from the beowulf bowl...
Re: (Score:2)
I was thinking just this. Plus, the whole system sounds like some sort of Rube Goldberg-designed service, and for what? The combined computing power of a couple of ATtiny chips and one ATmega?
Re: (Score:3)
Agree 100%.
Someone has obviously not heard of Amdahl's law https://en.wikipedia.org/wiki/Amdahl's_law
Or thought about the issues with power consumption, data distribution, security, reliability, fault tolerance, and just about anything else.
That, and the fact that IoT is NOT about active processing in devices (that's only an enabler for it), it is about the centralisation of control
of those devices 'in the cloud', for whatever benefit that is supposed to bring (mostly to the bottom line of the suppliers by sel
Re: (Score:2)
Low power and low bandwidth. Very very low bandwidth in some cases.
If someone wants spare compute cycles, they should use those smartphones that are constantly being used for stupid things.
Bitcoin mining doesn't need much bandwidth (Score:2)
Low power and low bandwidth. Very very low bandwidth in some cases.
Well bitcoin mining doesn't need much bandwidth. :-)
Why 8-bit 8051 over 32-bit ARM ? (Score:2)
Is there a big advantage to going 8-bit 8051 over ultra-low-power 32-bit ARM these days?
Re: (Score:1)
Re: (Score:2)
From my work in the consumer electronics industry designing embedded chips, fractions of a penny add up to big numbers when multiplied by millions or billions of parts. See the PIC [wikipedia.org]. I don't know what the industry is doing today, but 15 years ago people were talking about building hundreds of devices for a penny. An internet-connected lightbulb doesn't need to be that capable. ON/OFF/BURNOUT are its three required states.
Yeah, for a commodity product that competes on nothing other than price. However, IoT devices may compete on functionality as well, and/or start out as niche products rather than massively deployed ones. I think your point may be more true of later-generation devices than of initial generations.
Cycles are too cheap (Score:5, Informative)
The "problem" is that even cheap phone processors have far more processing power than needed. Anything that requires real processing power already is offloaded to the net. There is no need to scavenge cycles from other processors.
I have a bunch of Arduinos and Raspberry Pi processors doing a bunch of stuff (mostly collecting data) and they all are overkill for the task at hand. They mostly send data to servers and/or retrieve massaged data for presentation. I can't imagine any of these processors ever becoming overloaded and needing assistance.
Re: (Score:1)
I sell my excess solar back to the grid at a rate which is a really bad deal for me: only 6c per kWh, which is all any of the utilities will pay for it.
I expect selling my 'spare' computing cycles will be a similarly crap deal.
One day I hope there will be an energy storage solution which will allow me to better utilise this excess solar capacity.
Meanwhile, I switch off whatever CPUs I don't actually need running, so there aren't really any spare cycles to be had, and if there were, I wouldn't want to burn t
Re:Cycles are too cheap (Score:4, Insightful)
So they are paying you more than the wholesale cost (aka what they buy it at) for electricity and you are upset?
Re: (Score:1)
Re: (Score:2)
The "problem" is that even cheap phone processors have far more processing power than needed. Anything that requires real processing power already is offloaded to the net. There is no need to scavenge cycles from other processors.
The "problem" is that for your phone to work it needs a bare minimum of processing power. 99% of the time it doesn't need that processing power, which gives lots of spare CPU cycles.
Re: (Score:2)
This is not typical for IoT. A smartphone is not an internet-of-things-style device. IoT devices are tiny processors, extremely low power with none to waste, with such low bandwidth that it takes longer to send the parameters and receive the answers than to just do the work locally.
Re: (Score:2)
The problem is, Arduino cycles are expensive compared to something like an i7, when you compare actual performance per watt.
Low-power connected devices save power by not doing stuff. If you make them run their CPUs, they use more power, and they are FAR less efficient at actually computing than their big brothers like a desktop- or server-class Intel chip.
It is ridiculously inefficient to use the spare CPU on your phone, Raspberry Pi or Arduino; just buy the proper CPU for the task. And let's be rea
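As a rough illustration of the performance-per-watt point: energy per unit of work is just power divided by throughput. The figures below are ballpark assumptions chosen for illustration, not measurements of any particular chip.

```python
# Back-of-envelope energy-per-operation comparison.
# All numbers are rough, illustrative assumptions, not measurements.

def joules_per_megaop(power_watts, megaops_per_second):
    """Energy (joules) spent per million operations."""
    return power_watts / megaops_per_second

# Hypothetical figures: an 8-bit MCU running flat out vs. a desktop CPU.
mcu_j = joules_per_megaop(power_watts=0.05, megaops_per_second=16)          # ~16 MIPS at 50 mW
desktop_j = joules_per_megaop(power_watts=90, megaops_per_second=200_000)   # ~200,000 MIPS at 90 W

print(f"MCU:     {mcu_j:.2e} J per million ops")
print(f"Desktop: {desktop_j:.2e} J per million ops")
# With these assumptions the desktop finishes each operation for far less
# energy, which is the commenter's point about performance per watt.
```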
Great! (Score:3)
I can run a Chinese and Russian bitcoin job on my lightbulbs!
Powerful enough CPUs? (Score:3)
I can't imagine a lot of companies putting in more powerful (that is, more expensive) chips than are necessary to run the device itself.
Re: (Score:3)
exactly; if you have spare cycles and are iot, you did it wrong.
plus, iot is usually of a more realtime nature. who wants to risk timing skews or dropped events because some joker wanted to 'use' my super weak iot device for his alien space searches?
hosts are way overpowered, today. but tiny devices? no. they are not usually overpowered at all. and they are NOT your hosting platform! they are meant to do something and not work a night shift just because you college boys don't really understand what th
If modern more familiar CPU has a low enough cost (Score:2)
if you have spare cycles and are iot, you did it wrong
I'm not so sure. What if an ultra-low-power ARM's cost is in the ballpark of an 8051? One might save on the software development and maintenance side by using a more modern and familiar CPU.
Look at desktop/laptop CPUs. They are grossly overpowered for many users. Why wouldn't iot devices follow a similar pattern if the costs are right?
Fast is not a problem, nor are "wasted computrons" (Score:4, Interesting)
If the CPU in the IoT Device is powerful enough to make offloading actually worthwhile, isn't that CPU way overkill for the IoT Device's primary function?
Not at all. The CPU is fast to reduce latency. This not only meets response targets, but it also means the CPU can shut down after a very short time, saving power.
This is especially important on battery powered devices. If the CPU is off except for a couple of milliseconds every few seconds, a battery can last for years.
The CPU is also fast because it's made of small components close together. It's built using current large-chip fabrication technology. Making the chip physically small means many chips per wafer, which means low cost per chip. If that makes it fast, so much the better.
As long as you're not using extra power to increase the speed further, there's no problem with a processor being "too fast". That just means it can go to sleep sooner. In fact, slowing it down can be expensive: slower means not only that the power is on longer, it usually also means bigger components, which require more electrons to change their voltage. The more electrons delivered by the battery, the more of it is used up. Oops!
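A quick duty-cycle calculation makes the "run fast, then sleep" argument concrete. The current draws and battery capacity below are order-of-magnitude assumptions, not datasheet values.

```python
# Race-to-sleep back-of-envelope: wake briefly, then sleep for seconds.
# Current draws and battery capacity are assumed, not taken from a datasheet.

active_ma = 5.0        # current while the CPU is awake (mA)
sleep_ua = 2.0         # current while asleep (microamps)
awake_ms_per_cycle = 2.0
cycle_period_s = 5.0   # wake up every 5 seconds

awake_fraction = (awake_ms_per_cycle / 1000.0) / cycle_period_s
avg_ma = active_ma * awake_fraction + (sleep_ua / 1000.0) * (1 - awake_fraction)

battery_mah = 220.0    # roughly a CR2032 coin cell
hours = battery_mah / avg_ma
print(f"Average draw: {avg_ma * 1000:.1f} microamps")
print(f"Estimated battery life: {hours / 24 / 365:.1f} years")
# With these assumptions the average draw is a few microamps and the
# battery lasts for years -- exactly the "off except for a couple of
# milliseconds every few seconds" case described above.
```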
Granted that the processors are powerful and cheap, and have a lot of computation potential. But there are other downsides to trying to use IoT devices for a computing resource.
One is that the volatile memory, which uses scarce power just holding its state, is very small, and the permanent memory, though it may be moderately large, is flash: VERY slow, VERY power-consuming to write (and the processor stops while you're writing flash, screwing things up for its primary purpose).
Many of the current-generation IoT devices run on either the Texas Instruments CC2541 (8051 processor, 8kB RAM, 256kB flash) and its relatives, or the Nordic nRF51822 (32-bit ARM Cortex-M0 CPU, 32kB/16kB RAM, 256kB/128kB flash) and its family, and the next generation is an incremental improvement rather than a breakthrough. You can do a lot in a quarter megabyte of code space (if you're willing to work at it a bit, like we did in the early days of computing). But there's not a lot of elbow room there.
The tiny memories mean you don't have a lot of resources to throw at operating systems and extra work. In fact, though the communication stacks are pretty substantial (and use up a LOT of the flash!), the OSes are pretty rudimentary: mostly custom event-loop abstraction layers, talking to applications that are mostly event and callback handlers. Development environments encourage custom loads that don't include any pieces of libraries or system services that aren't actually used by the applications.
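The "custom event loop plus callbacks" structure described above looks roughly like the schematic below. It's written in Python for readability; real firmware of this class is almost always C, and the event names and handlers here are invented for illustration.

```python
# Schematic of the rudimentary event-loop-plus-callbacks structure typical
# of small BLE firmware. Event names and handlers are invented; real stacks
# of this class are written in C and sleep in hardware between interrupts.
import queue

events = queue.Queue()

def on_radio_packet(evt):
    pass  # parse the packet, maybe queue a reply

def on_sensor_timer(evt):
    pass  # read the sensor, update a characteristic

HANDLERS = {
    "RADIO_PACKET": on_radio_packet,
    "SENSOR_TIMER": on_sensor_timer,
}

def main_loop():
    while True:
        evt = events.get()          # on real hardware: sleep until an interrupt fires
        handler = HANDLERS.get(evt["type"])
        if handler:
            handler(evt)
        # then go back to sleep; there is no scheduler, no processes,
        # and no room for someone else's "workload"
```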
Another downside is the lack of bandwidth for communicating between them. (Bluetooth Low Energy, for example, runs at one megaBIT per second, has a lot of overhead and tiny packets, and divides three "advertising" (connection establishment) channels, tucked in the cracks between 2.4GHz WiFi channels, among ALL the machines in radio "earshot".) Maybe they can do a lot of deep thought, but getting the work to, and the results from, all those little guys will be a bottleneck.
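A rough throughput calculation shows why shipping the work in and out dominates. The effective-throughput figure below is an assumed ballpark for a BLE 4.x-class link after protocol overhead, not a measured number.

```python
# How long it takes just to move a modest task in and out over a BLE-class link.
# The effective throughput is an assumed ballpark, not a measurement.

effective_kbps = 50     # usable throughput after overhead (assumed)
payload_kb_in = 100     # task input shipped to the device
payload_kb_out = 100    # results shipped back

transfer_s = (payload_kb_in + payload_kb_out) * 8 / effective_kbps
print(f"Transfer time alone: {transfer_s:.0f} seconds")
# ~32 seconds of radio time before any "useful" offloaded work happens,
# and the radio is the most power-hungry part of these chips.
```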
Moore's Law and the economic advantage of saving programmer time may make this change in the future. But I'm not holding my breath waiting for "smart" lightbulbs to have large, standardized OSes making that "wasted" CPU power available to parasitic worms.
Low leakage: Power saving is king! (Score:2)
The CPU is also fast because it's made of small components close together. It's built using current large-chip fabrication technology. Re-optimized for low leakage, of course.
When a substantial fraction of the target applications are intended to run for years on a fractional amp-hour lithium button or harvested ambient energy, power saving is critical.
Re: (Score:2)
These are university researchers. Just like most slashdot readers, they may not understand how the real world works.
April Fools? (Score:4, Insightful)
Is this a mis-placed April Fools post?
CPUs don't 'have' power. They consume power. A powerful CPU is one that has the potential to consume a lot of power doing some form of calculation. The point of IoT embedded controllers is to consume as little power as possible. If they are loaded up with tasks that have nothing to do with their embedded purpose, they will consume more power (watts), and since they're not optimized for the task, they will do so inefficiently.
Sony promised us this with the PS3 (Score:2)
They said that they would build networks of Cell processors in our homes that would cooperate on tasks. But the truth is that you need a really great network to make it worthwhile. IoT devices are likely to be on high-latency networks, and won't want to participate with one another. Most of them will have piddly little amounts of horsepower, not really useful for anything compared even to a low-end cellphone of today. Someday this will make sense, but this is not that day.
Re: (Score:2)
PS3s are still better for this sort of thing. Or PS4s, or Xbones. Or your PC. Any of those devices has roughly a thousand times the computational power it typically needs when idling or doing whatever lightweight tasks take up most of its time.
The only way this would make any remote sort of sense is if you had far, far more IoT devices in your house, enough that you could outperform your PC or videogame console. I can't even imagine what you'd need all that sort of computational horsepower for.
No. Just no (Score:3)
Re: (Score:2)
My god ... (Score:1)
My god, we've come full circle.
So 20 years ago or so, the concept of ubiquitous computing was floating around. You know, where everything follows you, and CPU loads could be pushed onto other idle machines because it all had excess capacity 90% of the time.
And then the network was the computer. And then the computer was the cloud.
And now we're back to offloading CPU into a bunch of ubiquitous devices.
What next, client server computing, mainframes, and dumb terminals?
It's like some strange time loop.
Re: (Score:1)
I don't get the rant.
What I got from the article is that they want to offload computing to people's hand-held devices.
Which is stupid. Why should anyone burn up their data limit for shared computing?
No, No, and NO (Score:1)
short lived hack (Score:5, Insightful)
IoT meme already past sell-by date (Score:2)
A person at a meeting with only a smartphone could offload to Aura the process of recalculating a spreadsheet for a presentation, eliminating the need for a laptop
This is what I love about all the buzzword-enriched nonsense. The use cases presented are not only completely worthless but so half-baked and nonsensical that they are actually funny.
Hasan's plan, of course, anticipates a world with a vast number of Internet of Things devices, where lightbulbs, refrigerators, thermostats and other products will come with small processors and network connectivity.
Oh the dreams of marketeers...
By 2020, the world will have 26 billion such devices in operation, according to technology analyst firm Gartner.
More likely they'll spend $26 billion on advertising to get people to care about their worthless and annoying gimmicks, and still fail.
Terribly inefficient (Score:1)
"Low powered" CPUs tend to burn more energy per cycle than high performance CPUs when forced to run at full load. Combined with the overhead of distributing tasks over the internet you'd be spending much more money on power compared to doing it in a datacenter.
MidasNet (Score:1)
This is a terrible idea (Score:2)
Already Been Done (Score:2)
https://en.wikipedia.org/wiki/... [wikipedia.org]
This doesn't exist in the 'Real World', today (Score:2)
Millions of computers that would have been considered supercomputers a couple of decades ago are connected by high-speed links, yet there is no real market for unused CPU cycles even though there have been several attempts to create one. Most of these computers are probably under 5% utilization unless they have a virus.
I am beginning to think IOT stands for 'idiot on tranquilizers'.
A good mental exercise with no practical value. (Score:2)