The Industry IoT Consortium has published a whitepaper advising developers on when to use cloud computing and when to use edge computing for AI-based industrial applications.
Advances in technology mean developers can now perform industrial AI vision analysis directly on a camera, on a nearby computer, on an on-premise server or at a remote data centre.
In its whitepaper, entitled Optimal Use of Cloud and Edge in Industrial Machine-Vision, the organisation spells out the benefits of both cloud and edge computing, suggesting which approach suits which application.
Daniel Young, co-chair of the IIC technology working group and a senior manager at Toshiba, explained:
“Understanding where image processing should occur is an engineering decision based on many different factors.
“For example, cloud computing offers industrial applications flexibility and scalability for machine learning models, while edge computing is best for real-time industrial tasks.”
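That trade-off can be expressed as a simple placement heuristic. The sketch below is illustrative only and is not taken from the whitepaper; the latency threshold, the workload fields and the placement tiers are assumptions chosen to mirror the cloud-versus-edge reasoning Young describes.

```python
# Illustrative placement heuristic (assumed, not from the IIC whitepaper):
# route a machine-vision workload to the edge or the cloud based on its
# latency deadline and its model-training needs.
from dataclasses import dataclass


@dataclass
class VisionWorkload:
    name: str
    max_latency_ms: float   # hard deadline for a result
    needs_retraining: bool  # model must be updated with fresh data


def placement(w: VisionWorkload) -> str:
    """Suggest where a vision workload should run.

    Assumed rule of thumb: hard real-time deadlines favour the edge,
    since round trips to a remote data centre add network latency,
    while model training favours the cloud, where compute scales on demand.
    """
    if w.max_latency_ms < 100:   # assumed real-time cutoff
        return "edge"            # keep inference local to meet the deadline
    if w.needs_retraining:
        return "cloud"           # elastic compute for (re)training the model
    return "on-premise server"   # middle ground for less time-critical analysis


# Example: a defect-detection camera on a production line
line_inspection = VisionWorkload("line-inspection",
                                 max_latency_ms=20,
                                 needs_retraining=False)
print(placement(line_inspection))  # -> "edge"
```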
The organisation, whose aim is to transform business by accelerating the Industrial Internet of Things (IIoT), has split the paper into four sections: industrial vision applications; edge computing in industrial machine vision; cloud computing in industrial machine vision; and deciding where the edge lies in industrial machine vision.