

Team

 

Name              Organization   Email
Shravan Ambati    Calix Inc      shravan.ambati@calix.com
Sanjana Agarwal   On.Lab         sanjana@onlab.us

 

 

OVERVIEW

SDN applications that want to receive notifications from ONOS must be written as native ONOS Java applications.

Applications written in any other programming language must use the REST APIs of ONOS.

This project aims to provide the same level of visibility to external non-Java applications,

thereby opening ONOS up to a wide range of non-Java applications and significantly expanding its scope of control.


This feature will enable ONOS to publish events to a Kafka topic for all other applications subscribing to that topic.  

The contribution will be in the form of an app called the Kafka Integration Application.

The application will subscribe to events via Java APIs on ONOS and publish those events to a Kafka server.

Apache Kafka is a distributed messaging system that supports a pub/sub mechanism among other messaging models.

More information about Apache Kafka can be found here - http://kafka.apache.org/documentation.html .

 

SYSTEM ARCHITECTURE 

The diagram below shows the overall architecture of the system.

 


    1. The external application registers for the Kafka service by making a POST call with the application name.
    2. The response could contain the server IP, port, etc., needed for connectivity, along with the consumer group id generated for this app.
    3. The external application makes a second POST REST call to the Kafka Integration App, this time to subscribe to an event.
      The contents of the REST call could be the app name, the event type, and the allocated consumer group id.
    4. If there is no listener for this event type, the Event Manager module within the app will register a listener for the specific event with ONOS.
      If a listener already exists, the Event Manager will not create another one.
    5. The response to the POST will be 200 OK instead of 201 Created, because the resource is created on an
      external entity. The response could contain the Kafka topic information or just a plain message.
    6. The external app would then use this information to connect to the Kafka server and register itself as a consumer for the topic it subscribed to earlier.
    7. At some later point in time, the first event arrives from ONOS. The Event Manager module will pick up the event,
      convert it to the common GPB (Google Protocol Buffers) format, and marshal it using the GPB library. The data is then passed along to the Kafka Manager.
    8. The Kafka Manager module will publish the event to the topic.
    9. The external app will receive the event in the GPB-defined message format.
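The registration and subscription steps above can be sketched as a minimal in-memory simulation. All names here (the `KafkaIntegrationApp` class, the group-id scheme, the event type) are illustrative assumptions, not the actual app's API:

```python
# Sketch of the register/subscribe flow described above. The class name,
# group-id format, and response fields are hypothetical illustrations.
import itertools

class KafkaIntegrationApp:
    def __init__(self):
        self._group_ids = itertools.count(1)
        self.registered = {}    # app name -> consumer group id
        self.listeners = set()  # event types with an ONOS listener

    def register(self, app_name):
        # Steps 1-2: POST (register) returns connectivity info + group id.
        group_id = "group-%d" % next(self._group_ids)
        self.registered[app_name] = group_id
        return {"group_id": group_id, "kafka": "host:9092"}

    def subscribe(self, app_name, event_type):
        # Steps 3-5: POST (subscribe); an ONOS listener is created only
        # if none exists yet for this event type (step 4).
        created = event_type not in self.listeners
        self.listeners.add(event_type)
        # The topic name is the event type (see FAQ); respond 200 OK.
        return {"status": 200, "topic": event_type,
                "listener_created": created}

app = KafkaIntegrationApp()
reg = app.register("ext-app-1")
first = app.subscribe("ext-app-1", "DEVICE_EVENT")
second = app.subscribe("ext-app-2", "DEVICE_EVENT")
```

Note how the second subscription to the same event type reuses the existing listener, matching step 4 above.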

 

REST APIs 

  1. POST (register) - Used by non-native apps to register for the Kafka service.
  2. POST (subscribe) - Used by non-native apps to subscribe to a specific event type.
  3. GET - Returns the list of event types supported by the Kafka Integration App.
  4. DELETE (subscribe) - Used by non-native apps to unsubscribe from a specific event type.
  5. DELETE (register) - Used by non-native apps to deregister from the Kafka service.
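The five calls above could be viewed as a small routing table. The paths, payload shapes, and handler bodies below are assumptions for illustration; the page does not specify the actual URLs:

```python
# Hypothetical mapping of the five REST calls to handlers. All paths
# and response shapes are illustrative assumptions.
SUPPORTED_EVENTS = ["DEVICE_EVENT", "LINK_EVENT"]

def register(body):    return {"group_id": "group-1"}
def subscribe(body):   return {"status": 200, "topic": body["event_type"]}
def list_events(body): return {"events": SUPPORTED_EVENTS}
def unsubscribe(body): return {"status": 200}
def deregister(body):  return {"status": 200}

ROUTES = {
    ("POST",   "/kafka/register"):  register,
    ("POST",   "/kafka/subscribe"): subscribe,
    ("GET",    "/kafka/events"):    list_events,
    ("DELETE", "/kafka/subscribe"): unsubscribe,
    ("DELETE", "/kafka/register"):  deregister,
}

def dispatch(method, path, body=None):
    # Look up the handler for a (method, path) pair and invoke it.
    return ROUTES[(method, path)](body or {})
```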

CONFIGURATION

The application loads a JSON config file at startup. This config file will contain Kafka configuration information.
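The exact schema of the config file is not specified here; a plausible shape, with every key an assumed example, might be loaded like this:

```python
# Sketch of loading the startup config. Every key below is a
# hypothetical example; the real schema is not defined on this page.
import json

SAMPLE_CONFIG = """
{
  "bootstrapServers": "localhost:9092",
  "numPartitions": 2,
  "retentionMs": 604800000
}
"""

config = json.loads(SAMPLE_CONFIG)
# A real deployment would read the file from disk instead:
# with open("kafka-config.json") as f: config = json.load(f)
```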

CLUSTERING SUPPORT

1. In a cluster scenario, only the primary and backup controllers will be responsible for publishing events to the Kafka server.

2. One of the core assumptions is that the order of the event stream received from network devices is the same across all ONOS controllers.

3. The Leader/Primary will publish the event stream to a local store. There will be a shared counter indicating the last event published to the Kafka server. After publishing an event to the Kafka server, the leader will update the counter accordingly. All other members will update their local buffers to remove events that have been published, based on the shared counter value.

4. In case of leader failure, the backup will try to post the events that were not sent to the Kafka server. These are the events still present in its local store.

5. As a mechanism to detect duplicate events, every event that gets posted could carry a seq_id number. This way the external app can detect duplicates. The sequence number could be a simple timestamp carried by every event.
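The shared-counter pruning described in points 3-5 can be sketched as follows. The `Controller` class and the event names are illustrative assumptions; in the real system the counter would live in a distributed store:

```python
# Sketch of the clustering scheme: each controller buffers events
# locally, and the shared counter records the seq_id of the last event
# the leader published, so followers can prune their buffers to match.
class Controller:
    def __init__(self):
        self.buffer = []  # (seq_id, event) pairs, in arrival order

    def receive(self, seq_id, event):
        self.buffer.append((seq_id, event))

    def prune(self, last_published_seq):
        # Drop events the shared counter says were already published.
        self.buffer = [(s, e) for s, e in self.buffer
                       if s > last_published_seq]

leader, backup = Controller(), Controller()
for seq, ev in enumerate(["DEVICE_ADDED", "LINK_ADDED", "DEVICE_REMOVED"]):
    leader.receive(seq, ev)  # core assumption: the same ordered stream
    backup.receive(seq, ev)  # arrives at every controller

# The leader publishes the first two events and advances the shared
# counter to the seq_id of the last event sent to the Kafka server.
shared_counter = 1
leader.prune(shared_counter)
backup.prune(shared_counter)
# On leader failure, the backup re-posts whatever remains in its local
# store; the external app detects any duplicates via the per-event seq_id.
```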

DESIGN DECISIONS

  1. There will be one topic per event type. Each external app will be given a unique consumer group id.
  2. Event subscription by external apps is a two-step process: they must first register and then subscribe to a specific event type.
  3. As a first step we will only export device events and link events to consumers, and worry about packet-ins and packet-outs later.
  4. Once the framework is in place, it should be relatively easy to add support for other event types.
  5. In the scenario where the external app loses connectivity with the Kafka server and does not come back up within the retention period (the time duration for which Kafka retains messages), the onus is on the non-native app to rebuild its current view of the network state via the existing ONOS REST APIs.
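Decision 1 above (one topic per event type, one consumer group per app) can be illustrated with a simplified in-memory stand-in for a Kafka topic: every group receives the full message stream, but within a group each message is delivered only once. The `Topic` class is an assumption for illustration, not real Kafka client code:

```python
# Sketch of consumer-group semantics: two apps in distinct groups both
# see every message published to the shared per-event-type topic.
from collections import defaultdict

class Topic:
    def __init__(self, name):
        self.name = name
        self.messages = []
        self.offsets = defaultdict(int)  # group id -> next unread offset

    def publish(self, msg):
        self.messages.append(msg)

    def poll(self, group_id):
        # Deliver all messages this group has not yet consumed.
        start = self.offsets[group_id]
        self.offsets[group_id] = len(self.messages)
        return self.messages[start:]

topic = Topic("DEVICE_EVENT")  # one topic per event type
topic.publish("DEVICE_ADDED")
topic.publish("DEVICE_REMOVED")

app_a = topic.poll("group-1")  # external app A's consumer group
app_b = topic.poll("group-2")  # external app B's consumer group
```

This is why each external app must get its own group id: if two apps shared a group, the messages would be split between them instead of each app seeing the whole stream.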

FAQ

 

  1. Is it possible that the external app tries to subscribe to a topic before the Kafka Integration App creates it?
    Yes, this is possible. In such a scenario Kafka will create the topic automatically, just as it does when a
    producer publishes a message to a nonexistent topic.

  2. What is the purpose of having a group id and app name in the POST REST call?
    The group id is necessary to make sure there is no conflict among multiple external apps. When the external apps consume data,
    each app should be in a separate consumer group; this way the message is delivered to all the external apps from a single topic. The
    app name is primarily for keeping track of which apps have registered. Maybe we could have a GET REST call to show the currently registered apps.

  3. How does the external app know what event types are supported?
    A GET REST API will be provided to show the list of supported events.

  4. How does the external app know how many partitions were created for the topic?
    This will likely be sent as part of the response to the POST call. The external
    app needs this information so it can spin up the appropriate number of threads.
    (Num of consumer threads = num of partitions)

  5. What is the topic name that gets created?
    The topic name is the same as the event type.
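FAQ 4 above, where the external app spins up one consumer thread per partition, can be sketched with partition queues standing in for real Kafka partitions. The event names and round-robin placement are illustrative assumptions:

```python
# Sketch of one consumer thread per partition (FAQ 4). Queues stand in
# for Kafka partitions; a real consumer would loop on poll() instead.
import queue
import threading

num_partitions = 2  # assumed to come from the POST response
partitions = [queue.Queue() for _ in range(num_partitions)]
for i, ev in enumerate(["EV0", "EV1", "EV2", "EV3"]):
    partitions[i % num_partitions].put(ev)  # round-robin for the sketch

consumed = [[] for _ in range(num_partitions)]

def consume(idx):
    # Drain one partition; each thread owns exactly one partition.
    while not partitions[idx].empty():
        consumed[idx].append(partitions[idx].get())

threads = [threading.Thread(target=consume, args=(i,))
           for i in range(num_partitions)]  # threads == partitions
for t in threads:
    t.start()
for t in threads:
    t.join()
```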

