Next: “OpenFog” Computing Swarms


This is coming up fast: OpenFog computing.

It is positioned between distant, wide-area cloud services and in-house IoT devices. For a number of reasons this zone at the edge of networks can be made to work much better, and it can make use of the millions of PCs, laptops, tablets and smartphones already out there. The big question is whether we can organize it in such a way that the many different requirements of very diverse user groups can be met in a stable and scalable way.

  1. it’s Fog, as in a cloud that is closer to the ground (closer to where the data is generated)

  2. As the OpenFog Consortium puts it: fog computing is a horizontal architecture that bridges the cloud-to-thing continuum (see their definition).

IMHO OpenFog is crucial for 5G and IoT, as well as for fast-moving vehicles and drones. I am curious whether the dis-intermediated telcos will get their brains around this, or whether they will watch this train too pass by in the distance and miss it.

jaap van till, TheConnectivist


The following article from VanillaPlus.com gives more info about what the OpenFog Consortium is doing.

====================================

Fog matters

14 December, 2016 at 9:00 AM

Posted by: Sheetal Kumbhar


Lynne Canavan, executive director, OpenFog Consortium

The decade of fog computing has begun. Ushered in by a specific set of high-velocity digital business problems and growth opportunities, fog computing is rapidly gaining traction.

For those involved in the Internet of Things (IoT), 5G, artificial intelligence and virtual reality, fog computing is more than an interesting approach: It’s a necessary one.

Why? In today’s digital world, you can’t run everything in the cloud. There are latency, mobility, geographic, network bandwidth, reliability, security and privacy challenges. Nor can you run everything at the edge with intelligent endpoints, due to energy, space, capacity, environmental, reliability, modularity, and security challenges, says Lynne Canavan, executive director, OpenFog Consortium.

Fog computing addresses these gaps by bridging the continuum from cloud to things. It distributes compute, communication, control, storage and decision making closer to where the data is originated, enabling dramatically faster processing time and lowering network costs. Fog is an extension of the traditional cloud-based computing model where implementations of the architecture can reside in multiple layers of a network’s topology. By adding layers of fog nodes, applications can be partitioned to run at the optimal network level.
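To make the partitioning idea concrete, here is a minimal sketch (the tier names and latencies are illustrative assumptions; the OpenFog architecture does not prescribe a placement algorithm) of how an application task could be assigned to the most centralized network layer that still meets its latency requirement:

```python
# Minimal sketch: assign a task to the most centralized tier that still
# meets its latency bound. Tier names and latencies are illustrative
# assumptions, not OpenFog-specified values.

TIERS = [                      # ordered from edge to cloud
    ("edge device", 1),        # assumed round-trip latency in ms
    ("fog node", 5),
    ("regional fog", 20),
    ("cloud", 80),
]

def place_task(max_latency_ms: float) -> str:
    """Prefer the cloud when latency allows; fall back toward the edge."""
    chosen = TIERS[0][0]       # worst case: run right at the edge
    for name, latency_ms in TIERS:
        if latency_ms <= max_latency_ms:
            chosen = name      # keep moving toward the cloud while we can
    return chosen

print(place_task(2))     # -> edge device
print(place_task(50))    # -> regional fog
print(place_task(500))   # -> cloud
```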

In particular, fog computing supports time-critical applications that require sub-millisecond reaction time. Autonomous vehicles, emergency response, drones and virtual reality are among the dozens of applications that require very low latency. For example, a drone can travel at 100 miles per hour, or roughly 147 feet per second.

During its journey, it requires continuous software updates and produces massive amounts of data that require computation and communication. If you consider that the best cloud round-trip latency is around 80 milliseconds, the drone would fly about 12 feet between cloud messages. Fog nodes can reduce the latency to such a degree that the drone will only travel about two inches before the next update is delivered.
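The arithmetic is easy to check. A quick sketch, where the ~1.1 ms fog latency is back-solved from the article's "two inches" figure rather than a measured value:

```python
# Reproducing the drone arithmetic: distance travelled between updates
# at a given round-trip latency. The ~1.1 ms fog figure is back-solved
# from the article's "two inches" claim, not a measured value.

SPEED_MPH = 100
FEET_PER_MILE = 5280
speed_fps = SPEED_MPH * FEET_PER_MILE / 3600   # ~146.7 ft/s, as stated

def drift(latency_ms: float) -> float:
    """Feet flown blind between two consecutive updates."""
    return speed_fps * latency_ms / 1000

print(f"cloud (80 ms): {drift(80):.1f} ft")        # ~11.7 ft, "about 12 feet"
print(f"fog (1.1 ms): {drift(1.1) * 12:.1f} in")   # ~1.9 in, "two inches"
```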

Fog computing also supports data-intensive, remote operations. In oil and gas exploration, real-time subsurface imaging and monitoring reduces the drilling of exploratory wells, saving money and minimising environmental damage. Thousands of seismic sensors generate the high-resolution imaging required to discover risks and opportunities.

Fog computing manages the energy, bandwidth and computing needed for timely risk and opportunity analysis in this geographically-challenged, disruption-prone and data-intensive process. Instead of collecting data in the cloud for post-processing, fog nodes form mesh networks to stream data processing tasks and communicate with each other to compute the subsurface image in the network.
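The article does not spell out the imaging algorithm, but the in-network pattern it describes — each fog node reducing raw data from its local sensors and merging partial results peer-to-peer — can be sketched as follows (the "imaging" below is a stand-in aggregation, not real seismic processing):

```python
# Sketch of the in-network processing pattern described above: each fog
# node reduces its local sensors' raw samples to a partial result, and
# partials are merged node-to-node instead of shipping raw data to the
# cloud. The aggregation here is a placeholder for seismic imaging.

from statistics import mean

def local_reduce(sensor_readings: list[float]) -> dict:
    """Each fog node condenses its sensors' raw samples to a summary."""
    return {"count": len(sensor_readings), "mean": mean(sensor_readings)}

def merge(a: dict, b: dict) -> dict:
    """Combine two partial results as they travel across the mesh."""
    n = a["count"] + b["count"]
    m = (a["mean"] * a["count"] + b["mean"] * b["count"]) / n
    return {"count": n, "mean": m}

# Three fog nodes, each attached to a cluster of seismic sensors:
partials = [local_reduce(readings) for readings in (
    [0.12, 0.15, 0.11],
    [0.31, 0.29],
    [0.08, 0.09, 0.10, 0.07],
)]

result = partials[0]
for p in partials[1:]:
    result = merge(result, p)
print(result)  # aggregate computed entirely inside the network
```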

The fog computing algorithm is resilient to network disruption and adapts to energy and bandwidth changes. To work, fog computing must enable rapid, trusted and secure transmissions.

This requires an open, interoperable architecture that will ultimately enable end users to choose interoperable solutions from a diverse, vibrant supplier ecosystem. Creating the open architecture, and testing it via fog computing use cases and testbeds, is the work of the OpenFog Consortium.

[Figure: pillar chart of the eight foundational pillars of the OpenFog reference architecture]

The OpenFog Consortium was founded in November 2015 by ARM, Cisco, Dell, Intel, Microsoft and Princeton University, based on the shared vision that an open fog computing architecture is necessary in today’s increasingly connected world.

Our mission is to drive industry and academic leadership in fog computing architecture, testbed development, and a variety of interoperability and composability deliverables that seamlessly leverage cloud and edge architectures to enable end-to-end scenarios. Through an open membership ecosystem of industry, end users, universities and research organisations, OpenFog is collectively applying a broad coalition of knowledge to the technical and market challenges ahead.

Today, 51 member organisations from 14 countries are collectively working on the OpenFog reference architecture, to be released in early 2017. This framework is based on eight foundational pillars: security, scalability, openness, autonomy, RAS (reliability, availability and serviceability), agility, hierarchy, and programmability.

From the silicon layer through to the operating system, OpenFog members are defining and testing functional and component level interoperability for fog-to-fog communication, by applying the architecture to specific use cases. Standards development organisations will then use this architecture to create the specific standards.

Driving business growth through fog-enabled applications is the ultimate goal of our work. The OpenFog architecture is the underlying framework to build and test new concepts and products in real-world use cases and testbeds. Smarter cities, drone-enabled supply chains, remote energy extraction and exploration, smart traffic, video surveillance, virtual reality, environmental conservation and emergency response are just a sample of the emerging use cases that are enabled and improved through fog computing.

The future is looking very foggy indeed.

The author of this blog is Lynne Canavan, executive director, OpenFog Consortium.

==========================================================

More info can be found on:

https://www.openfogconsortium.org/

https://www.openfogconsortium.org/wp-content/uploads/OpenFog-Architecture-Overview-WP-2-2016.pdf

And on Wikipedia:

https://en.wikipedia.org/wiki/OpenFog_Consortium which states:

===

The OpenFog Consortium (sometimes stylized as Open Fog Consortium) is a consortium of high-tech industry companies and academic institutions across the world aimed at the standardization and promotion of fog computing in various capacities and fields.

The consortium was founded by Cisco Systems, Intel, Microsoft, Princeton University, Dell, and ARM Holdings in 2015 and now has 47 members across North America, Asia, and Europe, including Forbes 500 companies and noteworthy academic institutions.[1]

======= Another article of Lynne Canavan reblogged here============

Fog in the Factory

The fog is starting to roll in on the factory floor.

Manufacturers who have embraced the Industrial Internet of Things (IIoT) have made gains in performance and operational efficiencies, thanks to connected sensors that measure and manage equipment performance throughout the factory. Tiny sensors stream performance data to the cloud, helping to troubleshoot before equipment fails.

In these digital factories, IIoT is exposing the performance gaps of traditional cloud-only systems. IIoT requires a connected factory that is capable of providing reliable, real-time accessibility to the data being captured and analyzed by automated systems and processes. The amount of data from industrial manufacturing sites that operate drones, industrial robots and industrial control systems is now measured in petabytes, dwarfing all previous networking demands.

Given the requirements for real-time communication flows throughout – and beyond – the factory floor, it’s becoming clear that cloud-only approaches can no longer keep up with the volume, latency, mobility, reliability, security, privacy and network bandwidth challenges involved. Streaming the massive amounts of data to the cloud and back, under the assumption of 100% availability of broadband service with sub-millisecond latency, is not possible in today’s cloud-only computing models.

Ushering in the fog

Fog computing is the horizontal architecture that distributes compute, communication, control and storage closer to where the data is originated, enabling dramatically faster processing time and lowering network costs. It augments, not replaces, investments in cloud to enable an efficient, cost-effective, and constructive use of IIoT in manufacturing environments. Fog extends the traditional cloud-based computing model where implementations of the architecture can reside in multiple layers of a network’s topology. Fog scales both horizontally (fog node to fog node) and vertically (between fog layers), and can dynamically respond to network load changes and failures more effectively than a simple extension of the cloud.

While many use the term ‘fog’ and ‘edge’ interchangeably, there are key differences. Fog computing always uses edge computing, but edge does not always require fog capabilities. Fog is inclusive of cloud, whereas edge is defined by the exclusion of cloud. Fog pools the resources and data sources between devices that reside at the edge in north-south, east-west hierarchies, where edge tends to be limited to a small number of layers. Any device with computing, storage, and network connectivity can be a fog node. Examples include industrial controllers, switches, routers, embedded servers, and video surveillance cameras.
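As an illustrative, non-normative data model of that definition — any device with compute, storage and connectivity can act as a fog node, linked east-west to peers and north-south to other layers — consider this sketch:

```python
# Illustrative (non-normative) model of the definition above: any device
# with compute, storage and connectivity can act as a fog node, linked
# east-west to peers and north-south to other layers.

from dataclasses import dataclass, field

@dataclass
class FogNode:
    name: str
    cpu_cores: int
    storage_gb: int
    connected: bool
    peers: list["FogNode"] = field(default_factory=list)  # east-west links
    uplink: "FogNode | None" = None                       # north-south link

    def can_serve_as_fog_node(self) -> bool:
        # The three capabilities named in the text.
        return self.cpu_cores > 0 and self.storage_gb > 0 and self.connected

router = FogNode("industrial router", cpu_cores=2, storage_gb=8, connected=True)
camera = FogNode("surveillance camera", cpu_cores=1, storage_gb=32, connected=True)
camera.uplink = router                                  # reports north to the router layer
router.peers.append(FogNode("embedded server", 4, 256, True))  # east-west peer
print(camera.can_serve_as_fog_node())                   # -> True
```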

At the OpenFog Consortium, we sum up the advantages of fog computing with the acronym SCALE: Security; Cognition; Agility; Latency; and Efficiency:

Security – Most cloud-based security services focus on providing perimeter-based protection for threat detection to industrial control systems. New control requests are redirected to the cloud for authentication and authorization processing. Security can be further compromised by the common practice of installing hardware and software updates during a system’s scheduled down time, rather than any time a security compromise is detected. If threats breach the protections, the common response has been to manually take the system offline to be cleaned up. These processes are inadequate for today’s connected factories. Fog addresses these IIoT security challenges through its distributed architecture. Factories can protect the fog nodes using the same corporate IT policy, controls, and procedures. The fog nodes can quickly identify unusual activity and can mitigate threats or attacks before they are passed through to the system, without disruption of service.
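The text leaves open how a fog node spots "unusual activity"; as one hedged illustration (not an OpenFog-specified mechanism), a node could run a simple rolling z-score check on a control-system metric and act locally:

```python
# One minimal, assumed illustration of local anomaly detection on a fog
# node: flag readings more than `threshold` standard deviations from the
# recent rolling window. Not an OpenFog-specified mechanism.

from collections import deque
from statistics import mean, stdev

class AnomalyWatch:
    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value: float) -> bool:
        """Return True if the new reading looks anomalous."""
        suspicious = False
        if len(self.history) >= 10:   # wait for a minimal baseline
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                suspicious = True     # mitigate locally before it propagates
        self.history.append(value)
        return suspicious

watch = AnomalyWatch()
for v in [1.0, 1.1, 0.9, 1.0] * 5 + [9.7]:
    if watch.observe(v):
        print(f"unusual reading: {v}")   # -> unusual reading: 9.7
```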

Cognition – The fog architecture can best determine where to carry out the computing, storage and control functions along the cloud-to-thing continuum. Decisions can be made on the device or via a nearby fog node, avoiding the need to transport the data solely to the cloud for decision making. Smart sensors can make autonomous decisions and trade-offs regarding manufacturing execution. Multiple connected machines can communicate within production environments and learn from their decisions, improving performance over time.

Agility – Sensors and systems generate data that is turbulent, bursty, and often created in huge volumes, and must be rapidly interpreted and turned into actionable insights and decisions. The predicted scale of the data generated by automated systems means that it is highly unlikely that humans alone can make the astute operational decisions for the benefit of the business. In addition, the architecture and connectivity must be flexible. Fog works over wireline/optical and wireless networks and also inside these networks. This makes it ideally suited for industrial elements based on wired SCADA systems, OPC-UA interfaces, Modbus, and so on, which can be connected to fog nodes.

Latency – Latency is the most cited benefit of a fog-based approach. Many industrial control systems require end-to-end latencies within a few milliseconds, falling outside the range of mainstream cloud services. While a one-second delay is merely annoying in gaming or in talking with Siri, it is an eternity to an idled piece of manufacturing equipment. Milliseconds matter when you are trying to prevent manufacturing line shutdowns, avert accidents or restore electrical service. Analyzing IoT data close to where it is collected minimizes latency, supporting the time-critical processes of the connected factory in averting disaster or shutdowns.

Efficiency – Fog pools resources to allow applications to leverage idling computing, storage and networking resources to enhance overall system efficiency and performance. The fog architecture takes an ‘immersive distribution’ approach. If a network goes down, it immediately switches to other fog resources which are available throughout the network. This enables flexibility of management and ease of integration with existing IoT environments. 
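"Immersive distribution" is not formally defined in the article; as a hedged illustration of the failover behaviour it describes, here is a sketch in which tasks are reassigned round-robin over whichever fog nodes are still up:

```python
# Hedged sketch of the failover behaviour described above: when a node
# drops out, its tasks shift to the fog resources still available.
# Node and task names are invented for illustration.

def assign(tasks: list[str], nodes: dict[str, bool]) -> dict[str, str]:
    """Spread tasks round-robin over the nodes currently available."""
    up = [n for n, alive in nodes.items() if alive]
    if not up:
        raise RuntimeError("no fog resources available")
    return {task: up[i % len(up)] for i, task in enumerate(tasks)}

nodes = {"fog-a": True, "fog-b": True, "fog-c": True}
tasks = ["vibration-analysis", "video-inspect", "line-telemetry"]
print(assign(tasks, nodes))

nodes["fog-b"] = False          # a node goes down...
print(assign(tasks, nodes))     # ...its work shifts to the remaining nodes
```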

Fog computing relies on rapid, trusted and secure transmissions between devices and systems – in other words, it needs interoperability based on open standards. Developing this open reference architecture framework is the work of the OpenFog Consortium. Eight pillars, shown in the pillar chart above, represent the foundational elements of the OpenFog reference architecture. Members of the Consortium from universities, startups and industry giants are collaborating on it to ensure an open, interoperable architecture that will eventually enable manufacturers to choose fog-based solutions for their smart factories from a diverse, vibrant supplier ecosystem.

For manufacturers, letting the fog into the factory is not just a promising approach – it’s a necessary one to enable the full potential of the Industrial Internet of Things.

###

This blog was written by Lynne Canavan, Executive Director, OpenFog Consortium.

About your Guest Blogger:

Lynne Canavan is the Executive Director of the OpenFog Consortium, the global organization formed to accelerate the adoption of fog computing in order to solve the bandwidth, latency and communications challenges associated with IoT. OpenFog was founded by ARM, Cisco, Dell, Microsoft, Intel and Princeton University Edge Computing Laboratory in November 2015, and has over 50 members today. Lynne can be reached at lynne_canavan@openfogconsortium.org, on LinkedIn at linkedin.com/in/lynnecanavan or on Twitter at @lfcanavan

==================end of reblog============================
