
Aura: Harnessing the Power of IoT Devices For Distributed Computing

An anonymous reader points out that a computer science research team from the University of Alabama has put together a new architecture called "Aura," which lets people make use of excess computing power from various smart devices scattered throughout their homes. Ragib Hasan, the team's leader, says this scheme could be integrated with smartphones, letting you offload CPU-intensive tasks to your home devices. He also anticipates the ability to sell off excess capacity, much as people with solar panels can sometimes sell the excess energy they harvest. Alternatively, the cycles could be donated to a distributed computing project of the homeowner's choice, like SETI@home. Of course, several obstacles would need to be overcome before a system like Aura could be used: smart devices run on a variety of operating systems and often communicate only through a narrow set of protocols, and any unifying effort would also need careful thought about security and privacy.
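The article describes Aura only at a high level. As a minimal sketch of the offloading idea (hypothetical names and scheduling policy, not the researchers' actual design), a home-network broker might hand a phone's task to whichever registered device advertises the most idle capacity:

```python
# Hypothetical sketch of the offloading idea described above, NOT the actual
# Aura implementation: a phone submits a task to a home "broker", which runs
# it on whichever registered smart device claims the most spare capacity.

class Device:
    def __init__(self, name, idle_mips):
        self.name = name
        self.idle_mips = idle_mips  # spare capacity the device advertises

    def run(self, task, payload):
        # A real system would ship the task over the network; here we just call it.
        return task(payload)

class AuraBroker:
    def __init__(self):
        self.devices = []

    def register(self, device):
        self.devices.append(device)

    def offload(self, task, payload):
        if not self.devices:
            raise RuntimeError("no smart devices registered; run the task locally")
        # Naive policy: pick the device with the most advertised idle capacity.
        best = max(self.devices, key=lambda d: d.idle_mips)
        return best.run(task, payload)

# Usage: the "phone" offloads a spreadsheet-style recalculation.
broker = AuraBroker()
broker.register(Device("thermostat", idle_mips=20))
broker.register(Device("smart-tv", idle_mips=800))
print(broker.offload(lambda rows: sum(x * 1.07 for x in rows), range(10_000)))
```

Even this toy version makes the thread's objections visible: the broker has to trust whatever capacity the devices self-report, and every payload has to cross the home network both ways.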
  • IoT != compute (Score:5, Insightful)

    by TheGratefulNet ( 143330 ) on Tuesday June 16, 2015 @09:25PM (#49926427)

    this is stupid. no. just no. ok?

    iot is all about low-power, dedicated devices, and it is NOT YOUR HOSTING PLATFORM for running your bullshit on.

    iot has enough trouble with weak or non-existent security and the devices are just not meant to accept 'workloads' from you.

    someone has been smoking from the beowulf bowl...

    • by anagama ( 611277 )

      I was thinking just this. Plus, the whole system sounds like some sort of Rube Goldberg-designed service, and for what? The combined computing power of a couple of ATtiny chips and one ATmega?

    • Agree 100%.

      Someone has obviously not heard of Amdahl's law: https://en.wikipedia.org/wiki/Amdahl's_law (the bound is written out below).
      Or thought about the issues with power consumption, data distribution, security, reliability, fault tolerance, and just about anything else.

      That and the fact that IoT is NOT about active processing in devices (that's only an enabler for it); it is about the centralisation of control of those devices 'in the cloud', for whatever benefit that is supposed to bring (mostly to the bottom line of the suppliers, by selling...)
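      For reference, Amdahl's law (linked above) caps the achievable speedup no matter how many devices you add. A minimal statement, assuming a fraction p of the job can be parallelized across N devices:

      ```latex
      % Amdahl's law: best-case speedup when a fraction p of the work parallelizes
      % across N devices and the remaining (1 - p) stays serial.
      \[
        S(N) = \frac{1}{(1 - p) + \dfrac{p}{N}},
        \qquad
        \lim_{N \to \infty} S(N) = \frac{1}{1 - p}.
      \]
      % Example: even with p = 0.9, no number of lightbulbs gets past a 10x speedup,
      % and that is before counting any communication overhead.
      ```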

    • Low power and low bandwidth. Very very low bandwidth in some cases.

      If someone wants spare compute cycles, they should use those smartphones that are constantly being used for stupid things.

    • To be fair, low power embedded CPUs can be quite capable these days and are likely to only get more capable.

      Is there a big advantage to going 8-bit 8051 over ultra-low-power 32-bit ARM these days?
      • From my work in the consumer electronics industry designing embedded chips, fractions of a penny add up to big numbers when multiplied by millions or billions of parts. See the PIC [wikipedia.org]. I don't know what the industry is doing today, but 15 years ago people were talking about building hundreds of devices for a penny. An internet-connected lightbulb doesn't need to be that capable. ON/OFF/BURNOUT are its three required states.
        • From my work in the consumer electronics industry designing embedded chips, fractions of a penny add up to big numbers when multiplied by millions or billions of parts. See the PIC [wikipedia.org]. I don't know what the industry is doing today, but 15 years ago people were talking about building hundreds of devices for a penny. An internet-connected lightbulb doesn't need to be that capable. ON/OFF/BURNOUT are its three required states.

          Yeah, for a commodity product that competes on nothing other than price. However, IoT devices may compete on functionality as well, and/or start out as niche products rather than massively deployed ones. I think your point may be more true of later-generation devices than it is of initial generations.

  • Cycles are too cheap (Score:5, Informative)

    by mspohr ( 589790 ) on Tuesday June 16, 2015 @09:25PM (#49926429)

    The "problem" is that even cheap phone processors have far more processing power than needed. Anything that requires real processing power already is offloaded to the net. There is no need to scavenge cycles from other processors.
    I have a bunch of Arduinos and Raspberry Pi processors doing a bunch of stuff (mostly collecting data) and they all are overkill for the task at hand. They mostly send data to servers and/or retrieve massaged data for presentation. I can't imagine any of these processors ever becoming overloaded and needing assistance.

    • by vivian ( 156520 )

      I sell my excess solar back to the grid at a rate which is a really bad deal for me - only 6c per kWh, which is all any of the utilities will pay for it.
      I expect selling my 'spare' computing cycles will be a similarly crap deal.
      One day I hope there will be an energy storage solution which will allow me to better utilise this excess solar capacity.
      Meanwhile, I switch off whatever CPUs I don't actually need running, so there aren't really any spare cycles to be had, and if there were, I wouldn't want to burn the...

    • The "problem" is that even cheap phone processors have far more processing power than needed. Anything that requires real processing power already is offloaded to the net. There is no need to scavenge cycles from other processors.

      The "problem" is that for your phone to work it needs a bare minimum of processing power. 99% of the time it doesn't need that processing power, which gives lots of spare CPU cycles.

    • This is not typical for IoT. A smartphone is not an internet-of-things style device. IoT processors are tiny, extremely low-power with none to waste, and so low-bandwidth that it takes longer to send the parameters and receive the answers than to just do the work locally.

    • The problem is, Arduino cycles are expensive compared to something like an i7 when you compare actual performance per watt.

      Low-power chips for connected devices save power by not doing stuff. If you make them run their CPUs, they use more power, and they are FAR less efficient at actually computing than their big brothers like a desktop or server-class Intel chip.

      It is ridiculously inefficient to use the spare CPU on your phone, Raspberry Pi or Arduino; just buy the proper CPU for the task. And let's be real...
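      As a very rough sanity check of the performance-per-watt point above (ballpark figures assumed for illustration, not measurements):

      ```python
      # Ballpark performance-per-watt comparison (assumed order-of-magnitude figures,
      # not benchmarks): integer throughput in MIPS versus active power draw in watts.
      chips = {
          "ATmega328P (Arduino)": {"mips": 16, "watts": 0.08},      # ~16 MHz AVR, ~15 mA at 5 V
          "Desktop Core i7":      {"mips": 200_000, "watts": 90.0}, # order of magnitude only
      }

      for name, c in chips.items():
          print(f"{name}: ~{c['mips'] / c['watts']:.0f} MIPS per watt")

      # Even ignoring floating point, SIMD and memory bandwidth (which widen the gap),
      # the desktop part does roughly 10x more work per joule in this crude estimate.
      ```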

  • by jimmydevice ( 699057 ) on Tuesday June 16, 2015 @09:26PM (#49926431)

    I can run a Chinese and Russian bitcoin job on my lightbulbs!

  • by Weirsbaski ( 585954 ) on Tuesday June 16, 2015 @09:30PM (#49926447)
    If the CPU in the IoT Device is powerful enough to make offloading actually worthwhile, isn't that CPU way overkill for the IoT Device's primary function?

    I can't imagine a lot of companies putting in more powerful (that is, more expensive) chips than are necessary to run the device itself.
    • exactly; if you have spare cycles and are iot, you did it wrong.

      plus, iot is usually of a more realtime nature. who wants to risk timing skews or dropped events because some joker wanted to 'use' my super weak iot device for his alien space searches?

      hosts are way overpowered, today. but tiny devices? no. they are not usually overpowered at all. and they are NOT your hosting platform! they are meant to do something and not work a night shift just because you college boys don't really understand what the...

      • if you have spare cycles and are iot, you did it wrong

        I'm not so sure. What if an ultra-low-power ARM's cost is in the ballpark of an 8051? One might save on the software development and maintenance side by using a more modern and familiar CPU.

        Look at desktop/laptop CPUs. They are grossly overpowered for many users. Why wouldn't iot devices follow a similar pattern if the costs are right?

    • by Ungrounded Lightning ( 62228 ) on Tuesday June 16, 2015 @10:43PM (#49926721) Journal

      If the CPU in the IoT Device is powerful enough to make offloading actually worthwhile, isn't that CPU way overkill for the IoT Device's primary function?

      Not at all. The CPU is fast to reduce latency. This not only meets response targets, but it also means the CPU can shut down after a very short time, saving power.

      This is especially important on battery powered devices. If the CPU is off except for a couple of milliseconds every few seconds, a battery can last for years.
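      A quick duty-cycle estimate (illustrative numbers, not taken from any particular datasheet) shows how a few milliseconds of work every couple of seconds turns into years of battery life:

      ```python
      # Illustrative duty-cycle arithmetic for the "battery can last for years" point.
      battery_mah = 220.0   # coin-cell class battery
      active_ma   = 8.0     # CPU + radio awake
      sleep_ua    = 2.0     # deep-sleep current
      awake_ms    = 3.0     # awake for a few milliseconds...
      period_s    = 2.0     # ...every couple of seconds

      duty = (awake_ms / 1000.0) / period_s
      avg_ma = active_ma * duty + (sleep_ua / 1000.0) * (1 - duty)
      years = battery_mah / avg_ma / 24 / 365
      print(f"average draw ~{avg_ma * 1000:.0f} uA -> roughly {years:.1f} years")
      # Keeping the CPU busy with offloaded work pushes the duty cycle toward 1 and
      # erases exactly this advantage.
      ```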

      The CPU is also fast because it's made of small components close together. It's built using current large-chip fabrication technology. Making it physically small means many chips per wafer, which means low cost per chip. If that makes it fast, so much the better.

      As long as you're not using extra power to increase the speed further, there's no problem with a processor being "too fast". That just means it can go to sleep sooner. In fact, slowing it down can be expensive: slower means not only that the power is on longer, but it also usually means bigger components which require more electrons to change their voltage. The more electrons delivered by the battery, the more of it is used up. Oops!
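      The "more electrons" argument is the usual CMOS switching-energy relation; stated roughly (standard textbook form, not from the parent post):

      ```latex
      % Charge and energy to switch a node of capacitance C through voltage swing V,
      % and the resulting dynamic power at activity factor alpha and clock frequency f.
      \[
        Q \approx C V, \qquad
        E_{\text{switch}} \approx C V^{2}, \qquad
        P_{\text{dynamic}} \approx \alpha\, C\, V^{2} f .
      \]
      % Larger (higher-capacitance) devices cost more charge per transition, so for a
      % fixed amount of work the energy bill depends on C and V far more than on how
      % fast you clock it.
      ```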

      Granted, the processors are powerful and cheap, and have a lot of computational potential. But there are other downsides to trying to use IoT devices as a computing resource.

      One is that the volatile memory, which uses scarce power just holding its state, is very small, and the permanent memory, though it may be moderately large, is flash: VERY slow, VERY power-consuming to write (and the processor stops while you're writing flash, screwing things up for its primary purpose).

      Much of the current generation of IoT devices runs on either the Texas Instruments CC2541 (8051 processor, 8kB RAM, 256kB flash) and its relatives, or the Nordic nRF51822 (32-bit ARM Cortex-M0 CPU, 32kB/16kB RAM, 256kB/128kB flash) and its family, and the next generation is an incremental improvement rather than a breakthrough. You can do a lot in a quarter megabyte of code space (if you're willing to work at it a bit, like we did in the early days of computing). But there's not a lot of elbow room there.

      The tiny memories mean you don't have a lot of resource to throw at operating systems and extra work. In fact, though the communication stacks are pretty substantial (and use up a LOT of the flash!), the OSes are pretty rudimentary: Mostly custom event loop abstraction layers, talking to applications that are mostly event and callback handlers. Development environments encourage custom loads that don't have any pieces of libraries or system services that aren't actually used by the applications.

      Another downside is the lack of bandwidth for communicating between them. (Bluetooth Low Energy, for example, runs at one megaBIT per second, has a lot of overhead and tiny packets, and divides three "advertising" (connection establishment) channels, squeezed into the cracks between 2.4GHz WiFi channels, among ALL the machines in radio "earshot".) Maybe they can do a lot of deep thought - but getting the work to, and the results from, all those little guys will be a bottleneck.
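      To put the data-shipping bottleneck in rough numbers (the effective throughput below is an assumption about typical BLE 4.x application-level rates, not a protocol-exact figure):

      ```python
      # Crude illustration of the BLE bottleneck: the radio signals at 1 Mbit/s, but after
      # connection intervals, tiny packets and protocol overhead the usable application
      # throughput is commonly in the tens of kbit/s (assumed value below).
      payload_mb       = 10.0   # data to move to the device and back, in megabytes
      effective_kbit_s = 50.0   # assumed realistic application throughput

      seconds = payload_mb * 8 * 1000 / effective_kbit_s
      print(f"~{seconds / 60:.0f} minutes just to move {payload_mb:.0f} MB one way")
      # A desktop CPU would likely finish most jobs of that size before the transfer ends.
      ```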

      Maybe Moore's Law and the economic advantage of saving programmer time may make this change in the future. But I'm not holding my breath waiting for "smart" lightbulbs to have large, standardized, OSes making that "wasted" CPU power available to parasitic worms.

      • The CPU is also fast because it's made of small components close together. It's built using current large-chip fabrication technology.

        Re-optimized for low leakage, of course.

        When a substantial fraction of the target applications are intended to run for years on a fractional amp-hour lithium button or harvested ambient energy, power saving is critical.

    • These are university researchers. Just like most slashdot readers, they may not understand how the real world works.

  • April Fools? (Score:4, Insightful)

    by Bing Tsher E ( 943915 ) on Tuesday June 16, 2015 @09:32PM (#49926451) Journal

    Is this a misplaced April Fools post?

    CPUs don't 'have' power. They consume power. A powerful CPU is one that has the potential to consume a lot of power doing some form of calculation. The point of IoT embedded controllers is to consume as little power as possible. If they are loaded up with tasks that have nothing to do with their embedded purpose, they will consume more power (watts), and since they're not optimized for the task, they will do so inefficiently.

  • They said that they would build networks of cell processors in our homes that would cooperate on tasks. But the truth is that you need a really great network to make it worthwhile. IoT devices are likely to be on high-latency networks, and won't want to participate with one another. Most of them will have piddly little amounts of horsepower not really useful for anything compared even to a low-end cellphone of today. Someday this will make sense, but this is not that day.

    • PS3s are still better for this sort of thing. Or PS4s, or Xbones. Or your PC. Any of those devices has roughly a thousand times the computational power it typically needs when idling, or doing whatever lightweight tasks take up most of its time.

      The only way this would make any remote sort of sense is if you have far, far more IoT devices in your house, enough that you can outperform your PC or videogame console. I can't even imagine what you'd need all that sort of computational horsepower for...

  • by Snotnose ( 212196 ) on Tuesday June 16, 2015 @10:06PM (#49926591)
    I myself don't subscribe to the IoT model, mainly because I don't trust the security. Doing something like this on my thermostat? I trust the security even less.
    • by Megane ( 129182 )
      Just wait until people start pushing ads to these things. One day you'll open your refrigerator door and it'll play the Dr. Pepper jingle. Wouldn't you like to be a Pepper, too?
  • My god, we've come full circle.

    So 20 years ago or so, the concept of ubiquitous computing was floating around. You know, where everything follows you, and CPU loads could be pushed onto other idle machines because it all had excess capacity 90% of the time.

    And then the network was the computer. And then the computer was the cloud.

    And now we're back to offloading CPU work onto a bunch of ubiquitous devices.

    What next, client server computing, mainframes, and dumb terminals?

    It's like some strange time loop.

    • I don't get the rant.

      What I got from the article is that they want to offload computing to people's handheld devices.

      Which is stupid. Why should anyone burn up their data limit for shared computing?

  • I don't want various and sundry crapware running on my refrigerator, TV set, phone, or anything else, thankyouverymuch.
  • short lived hack (Score:5, Insightful)

    by liquid_schwartz ( 530085 ) on Tuesday June 16, 2015 @11:22PM (#49926829)
    Many, perhaps even most, of the IoT devices are battery powered. Mostly CR2032 coin cells. These have ~150 mAh to 240 mAh depending on how you use them. Your nodes will die off in about a day of running non-stop. This website mostly thinks in terms of embedded==(Arduino || Raspberry Pi), when in reality most IoT devices will be ARM Cortex M0+/M3/M4 parts that spend the vast majority of their lives in low-power sleep modes drawing a microamp or two.
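    A quick sanity check of the "about a day" figure (assumed full-load current, not a measurement):

    ```python
    # Back-of-the-envelope check of the "die off in about a day" estimate above:
    # a Cortex-M class MCU plus radio running flat out typically draws a few mA.
    capacity_mah = (150, 240)   # CR2032 range quoted above
    active_ma = 8.0             # assumed continuous full-load draw

    for c in capacity_mah:
        print(f"{c} mAh / {active_ma:.0f} mA ~= {c / active_ma:.0f} hours non-stop")
    # -> roughly 19 to 30 hours, i.e. about a day, versus years when mostly asleep.
    ```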
  • A person at a meeting with only a smartphone could offload to Aura the process of recalculating a spreadsheet for a presentation, eliminating the need for a laptop

    This is what I love about all the buzzword enriched nonsense. Use cases presented are not only completely worthless but so half baked and nonsensical that they are actually funny.

    Hasan's plan, of course, anticipates a world with a vast number of Internet of Things devices, where lightbulbs, refrigerators, thermostats and other products will come with small processors and network connectivity.

    Oh the dreams of marketeers...

    By 2020, the world will have 26 billion such devices in operation, according to technology analyst firm Gartner.

    More likely they spend $26 billion in advertising to get people to care about their worthless and annoying gimmicks and still fail.

  • by Anonymous Coward

    "Low powered" CPUs tend to burn more energy per cycle than high performance CPUs when forced to run at full load. Combined with the overhead of distributing tasks over the internet you'd be spending much more money on power compared to doing it in a datacenter.

  • Every day we could all send one dollar to one new person, until we're all billionaires!
  • If your IOT devices actually have spare processing power, and using it would have no impact on their primary function or, more importantly, their power usage, then they are poorly designed. You know what kinds of tasks IOT devices would be great for? DDoSing someone. That's about it.
  • Millions of computers that would have been considered supercomputers a couple of decades ago are connected by high-speed connections, yet there is no real market for unused CPU cycles even though there have been several attempts. Most of these computers are probably under 5% utilization unless they have a virus.

    I am beginning to think IOT stands for 'idiot on tranquilizers'.

  • Maybe it makes sense to offload that stuff to the monster gaming rig in the corner doing nothing most of the time, but to have several smartphones, tablets, and other devices trying to run things, that is dumb. 1st off, no matter how cool phones and tablets have become, they are lightweights when it comes to crunching data. It would take 5-10 devices to begin to be useful, and when they are doing that they are not doing what you wanted them for in the first place. 2nd off, the companies that provide services for phones tend to...
