WSO2 APIM Generating and Retrieving Custom Statistics
  1. Introduction

    APIM comes with a statistics dashboard that covers the most common usage statistics. However, when we go beyond these statistics and the basic functions of APIM, we may need additional statistics. For example, when integrating APIM with a billing engine for monetization, we may need more detailed information than the existing dashboard provides.

    This document describes how to extend or customize the APIM usage statistics. To learn more about the APIM analytics model and the currently available statistics, please refer to this document. In APIM there are two ways to collect statistics: the DAS REST client or an external RDBMS, so the data retrieval process may differ depending on the statistics configuration. In this document, we discuss retrieving data through the DAS REST API. The DAS internal summary tables and the RDBMS summary data are identical; there is no difference in the data itself, but the retrieval query and response types differ (Lucene versus SQL). Here we explain how to use Lucene to retrieve data from the DAS REST API.

  2. DAS REST API

    Data is retrieved from the WSO2 DAS REST API, with WSO2 APIM configured to use WSO2 DAS through the REST usage client. To configure APIM, please follow this document for REST and this document for RDBMS.

    The DAS REST API consists of several APIs for specific operations, e.g. getting the table count, checking the existence of a table, searching records, and aggregate functions. Here, we focus on the search and aggregate APIs; for more about the REST API, please refer to the DAS REST API documents. The search REST API is used for Lucene-based searching but does not provide aggregate functions, while the aggregate REST API provides both aggregation and search functionality. Thus the following examples are based on the aggregate API.

    Here is an overview of the Aggregate API.

    Resource Path: /analytics/aggregates
    HTTP Method: POST
    Request/Response Format: application/json
    Authentication: Basic

    Hence the API endpoint will be https://localhost:9444/analytics/aggregates
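
    The following is a minimal sketch of posting an aggregate query to this endpoint from Python, assuming the requests library, the default admin/admin credentials, and the self-signed certificate that DAS ships with; the helper name run_aggregate_query is only illustrative, and the actual query payloads are shown in the examples below.

    import json
    import requests  # assumed to be installed: pip install requests

    DAS_AGGREGATES_URL = "https://localhost:9444/analytics/aggregates"
    DAS_USER, DAS_PASS = "admin", "admin"  # adjust to your DAS credentials

    def run_aggregate_query(payload):
        """POST a Lucene aggregate query to DAS and return the parsed JSON response."""
        response = requests.post(
            DAS_AGGREGATES_URL,
            json=payload,
            auth=(DAS_USER, DAS_PASS),
            verify=False,  # DAS uses a self-signed certificate by default
        )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        query = {
            "query": "api:\"CalculatorAPI\"",
            "aggregateLevel": 0,
            "tableName": "API_REQUEST_SUMMARY",
            "groupByField": "api_version_userId_facet",
            "aggregateFields": [
                {"fieldName": "total_request_count", "aggregate": "SUM", "alias": "count"}
            ],
        }
        print(json.dumps(run_aggregate_query(query), indent=2))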

  3. Retrieving data from the existing tables

    APIM summarizes event stream data into the following summary tables.

    • API_REQUEST_SUMMARY
    • API_VERSION_USAGE_SUMMARY
    • API_Resource_USAGE_SUMMARY
    • API_RESPONSE_SUMMARY
    • API_FAULT_SUMMARY
    • API_DESTINATION_SUMMARY
    • API_LAST_ACCESS_TIME_SUMMARY
    • API_THROTTLED_OUT_SUMMARY
    Example scenarios.
    1. Ex: 1

      Calculating a given API's usage per month

      In this case, when you want to find the usage of a particular API, you can retrieve it from DAS directly. API usage is recorded in API_REQUEST_SUMMARY, and the request count needs to be calculated with an aggregate function, so the DAS REST aggregate API has to be used.

        • constraints

      API = CalculatorAPI

      max_request_time = Jan 1-Jan 31 (as epoch milliseconds; see the conversion sketch at the end of this example)

      table : API_REQUEST_SUMMARY

      In this case, we need the summation of total_request_count grouped by the api column. In Lucene, grouping requires a facet column whose first element is the API, so here we use the column api_version_userId_facet with aggregate level 0.

        • equivalent Lucene query
      {
      	"query": "api:\"CalculatorAPI\" AND  max_request_time: [1451606400000 TO 1454284799000]",
      	"aggregateLevel": 0,
      	"tableName": "API_REQUEST_SUMMARY",
      	"groupByField": "api_version_userId_facet",
      	"aggregateFields": [{
      		"fieldName": "total_request_count",
      		"aggregate": "SUM",
      		"alias": "count"
      	}]
      }
      
        • response
      [{
         "tableName": "API_REQUEST_SUMMARY",
         "timestamp": 1452697208765,
         "values":    {
            "count": 4,
            "api_version_userId_facet": ["CalculatorAPI"]
         }
      }]
      
        • equivalent SQL query if an RDBMS is used
      select sum(total_request_count) from API_REQUEST_SUMMARY where api = "CalculatorAPI" AND time BETWEEN "2016-01-01 00:00" AND "2016-01-31 23:59:59" group by api;
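
      The max_request_time bounds above are epoch milliseconds. A minimal sketch of computing them for January 2016, assuming the timestamps are interpreted in UTC (the helper to_epoch_millis is only illustrative):

      from datetime import datetime, timezone

      def to_epoch_millis(dt):
          # Convert a timezone-aware datetime to epoch milliseconds.
          return int(dt.timestamp() * 1000)

      start = to_epoch_millis(datetime(2016, 1, 1, 0, 0, 0, tzinfo=timezone.utc))
      end = to_epoch_millis(datetime(2016, 1, 31, 23, 59, 59, tzinfo=timezone.utc))
      print(start, end)  # 1451606400000 1454284799000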
      
    2. Ex: 2
          The second scenario is when we need to know how many times a particular method of a particular API has been accessed by each application (consumer key). A small sketch of parsing the grouped response appears at the end of this example.
        • constraints

          API = CalculatorAPI

          max_request_time = Jan 1-Jan 31 (as epoch milliseconds)

          table : API_Resource_USAGE_SUMMARY

        • equivalent Lucene query
      {
      	"query": "api:\"CalculatorAPI\" AND  max_request_time: [1451606400000 TO 1454284799000]",
      	"aggregateLevel": 2,
      	"tableName": "API_Resource_USAGE_SUMMARY",
      	"groupByField": "key_api_method_path_facet",
      	"aggregateFields": [{
      		"fieldName": "total_request_count",
      		"aggregate": "SUM",
      		"alias": "count"
      	}]
      }
      
        • response
      [
            {
            "tableName": "API_RESOURCE_USAGE_SUMMARY",
            "timestamp": 1452762105710,
            "values":       {
               "key_api_method_path_facet":          [
                  "_JCMHhhFOL0e18EuAkufXuTYzHga",
                  "CalculatorAPI",
                  "GET"
               ],
               "count": 4
            }
         },
            {
            "tableName": "API_RESOURCE_USAGE_SUMMARY",
            "timestamp": 1452762105740,
            "values":       {
               "key_api_method_path_facet":          [
                  "WHC1fv3nHqhjCldwszKWXEDfb30a",
                  "CalculatorAPI",
                  "GET"
               ],
               "count": 1
            }
         }
      ]
      
        • equivalent SQL query if an RDBMS is used
      select consumerKey,api,method,sum(total_request_count) from API_Resource_USAGE_SUMMARY where api = "CalculatorAPI" AND time BETWEEN "2016-01-01 00:00" AND "2016-01-31 23:59:59" group by consumerKey,api,method;
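
      Each entry in the aggregate response above carries the facet values (consumer key, API, method) together with the summed count. A minimal parsing sketch in Python, assuming the response JSON has already been fetched (for example with the helper shown in section 2):

      # 'records' stands for the parsed JSON list returned by /analytics/aggregates
      records = [
          {"tableName": "API_RESOURCE_USAGE_SUMMARY",
           "values": {"key_api_method_path_facet": ["_JCMHhhFOL0e18EuAkufXuTYzHga", "CalculatorAPI", "GET"],
                      "count": 4}},
      ]

      for record in records:
          consumer_key, api, method = record["values"]["key_api_method_path_facet"][:3]
          count = record["values"]["count"]
          print(f"{api} {method} via {consumer_key}: {count} requests")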
      
  4. Retrieving data from derived tables

    Say we need more fine-grained data, such as the number of API accesses during a particular time window, say 10 am to 3 pm. With the current APIM statistics implementation and the existing tables, we cannot derive such statistics, because the APIM summaries assume the smallest granularity is one day. For this specific case, we need data at hourly granularity.

    Since we have no summary table with hourly granularity, we need to create new tables and generate the summary data ourselves.

    Examples:
    1. Getting a particular API's usage between 9 am and 3 pm in January 2016
        • creating the summary table
      CREATE TEMPORARY TABLE API_PEAK_ACCESS_TIME_SUMMARY USING CarbonAnalytics OPTIONS (tableName "API_PEAK_ACCESS_TIME_SUMMARY",
        schema "tenantDomain string -i,
        api string -i,
        consumerKey string -i,
        total_request_count int -i,
        max_request_time long -i,
        hours int -i,
        consumerKey_api_facet facet -i",
        primaryKeys "tenantDomain,consumerKey,api"
        );
      
      
        • writing the summarization rule (the hour extraction is sanity-checked in a short note at the end of this example)
        create temporary table APIRequestData USING CarbonAnalytics OPTIONS(tableName "ORG_WSO2_APIMGT_STATISTICS_REQUEST");
      
        insert into table API_PEAK_ACCESS_TIME_SUMMARY select tenantDomain,api,COALESCE(consumerKey,'-'),
        sum(request) as total_request_count,max(requestTime) as max_request_time,
        substring(cast(requestTime/1000 as timestamp),12,2),facet2(consumerKey,api)
        from APIRequestData where context is not NULL group by tenantDomain,api,consumerKey,
        substring(cast(requestTime/1000 as timestamp),0,4),
        substring(cast(requestTime/1000 as timestamp),6,2),
        substring(cast(requestTime/1000 as timestamp),9,2),
        substring(cast(requestTime/1000 as timestamp),12,2);
      
        • retrieving data
      Lucene Request:
      {
      	"query": "api:\"CalculatorAPI\" AND  max_request_time: [1451606400000 TO 1454284799000] AND hours : [9 TO 15]",
      	"aggregateLevel": 1,
      	"tableName": "API_PEAK_ACCESS_TIME_SUMMARY",
      	"groupByField": "consumerKey_api_facet",
      	"aggregateFields": [{
      		"fieldName": "total_request_count",
      		"aggregate": "SUM",
      		"alias": "count"
      	}]
      }
      
      Json Response:
      [{
         "tableName": "API_PEAK_ACCESS_TIME_SUMMARY",
         "timestamp": 1452751664298,
         "values":    {
            "consumerKey_api_facet":       [
               "_JCMHhhFOL0e18EuAkufXuTYzHga",
               "CalculatorAPI"
            ],
            "count": 4
         }
      }]
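
      The summarization rule above derives the hour bucket from characters 12-13 of the formatted timestamp (Spark SQL substring is 1-based). As a quick sanity check of that position arithmetic, here is a small Python sketch, assuming the timestamp is interpreted in UTC:

      from datetime import datetime, timezone

      request_time = 1452853425000  # example requestTime in epoch milliseconds
      formatted = datetime.fromtimestamp(request_time / 1000, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
      print(formatted)         # 2016-01-15 10:23:45
      # substring(..., 12, 2) in the Spark script corresponds to formatted[11:13] here
      print(formatted[11:13])  # '10' -> the value stored in the 'hours' column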
      
    2. Getting the total requests for a specific application
        • creating the summary table
      CREATE TEMPORARY TABLE APP_REQUEST_COUNT_SUMMARY USING CarbonAnalytics OPTIONS (tableName "APP_REQUEST_COUNT_SUMMARY6",
        schema "tenantDomain string -i,
        applicationName string -i,
        consumerKey string -i,
        total_request_count int -i,
        max_request_time long -i,
        applicationName_consumerKey_facet facet -i",
        primaryKeys "tenantDomain,applicationName,consumerKey"
        );
      
        • writing the summarization rule
        create temporary table APIRequestData USING CarbonAnalytics OPTIONS(tableName "ORG_WSO2_APIMGT_STATISTICS_REQUEST");
      
        insert into table APP_REQUEST_COUNT_SUMMARY
        select tenantDomain,applicationName,COALESCE(consumerKey,'-'),
        sum(request) as total_request_count,max(requestTime) as max_request_time,
        facet2(applicationName,consumerKey)
        from APIRequestData where context is not NULL group by tenantDomain,applicationName,consumerKey,
        substring(cast(requestTime/1000 as timestamp),0,4),
        substring(cast(requestTime/1000 as timestamp),6,2),
        substring(cast(requestTime/1000 as timestamp),9,2);
      
        • retrieving data
      Lucene Request:
      {
      	"query": "max_request_time: [1451606400000 TO 1454284799000] AND applicationName:\"DefaultApplication\"",
      	"aggregateLevel": 1,
      	"tableName": "APP_REQUEST_COUNT_SUMMARY6",
      	"groupByField": "applicationName_consumerKey_facet",
      	"aggregateFields": [{
      		"fieldName": "total_request_count",
      		"aggregate": "SUM",
      		"alias": "count"
      	}]
      }
      
      Json Response:
      [{
         "tableName": "APP_REQUEST_COUNT_SUMMARY6",
         "timestamp": 1452758140301,
         "values":    {
            "count": 4,
            "applicationName_consumerKey_facet":       [
               "DefaultApplication",
               "_JCMHhhFOL0e18EuAkufXuTYzHga"
            ]
         }
      }]
      
