Publishing APIM Runtime Statistics to DAS
  1. Prerequisites

    The following are the prerequisites, with links to download them:
    • WSO2 API Manager 1.9.1 from here
    • WSO2 DAS 3.0.0 from here
    • An RDBMS instance (MySQL, MSSQL, H2, etc.)
    • API_Manager_Analytics.car from here
    • DB scripts from here
  2. Configure WSO2 DAS

    If APIM and DAS run on the same machine, increase the default service ports of DAS by setting an offset value in <DAS_HOME>/repository/conf/carbon.xml:

        <Offset>1</Offset>
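
    With an offset of 1, every default Carbon port shifts up by one; assuming the stock defaults, the HTTPS management console moves from 9443 to 9444 and the Thrift data receiver from 7611 to 7612.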
        

    Define the datasource declaration for your RDBMS in <DAS_HOME>/repository/conf/datasources/master-datasources.xml. DAS pushes the summarised data to this DB once analysis is done; APIM later reads the summary data from it and displays it on the APIM dashboard. Here we use a MySQL database as an example, but you can configure it with H2, Oracle, etc. Note that you should always use WSO2AM_STATS_DB as the datasource name.

    Also note that the auto-commit option should be disabled when working with DAS. You can set this in the JDBC URL or by adding the line <defaultAutoCommit>false</defaultAutoCommit> to the datasource's <configuration> tag.

      <datasource>
        <name>WSO2AM_STATS_DB</name>
        <description>The datasource used for setting statistics to API Manager</description>
        <jndiConfig>
          <name>jdbc/WSO2AM_STATS_DB</name>
        </jndiConfig>
        <definition type="RDBMS">
          <configuration>
            <url>jdbc:mysql://localhost:3306/TestStatsDB</url>
            <username>db_username</username>
            <password>db_password</password>
            <driverClassName>com.mysql.jdbc.Driver</driverClassName>
            <maxActive>50</maxActive>
            <maxWait>60000</maxWait>
            <testOnBorrow>true</testOnBorrow>
            <validationQuery>SELECT 1</validationQuery>
            <validationInterval>30000</validationInterval>
            <defaultAutoCommit>false</defaultAutoCommit>
          </configuration>
        </definition>
      </datasource>
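
    The database itself must exist before DAS starts up. A minimal sketch of creating it, assuming a local MySQL 5.x instance and the TestStatsDB / db_username names used above (adjust to your environment):

      # Create the stats DB and grant access to the datasource user
      # (names match the datasource above; change them to suit yours).
      mysql -u root -p -e "CREATE DATABASE TestStatsDB;
        GRANT ALL PRIVILEGES ON TestStatsDB.* TO 'db_username'@'localhost'
          IDENTIFIED BY 'db_password';"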
    
  3. Configure the MySQL database

    If you use MySQL as the database, download the MySQL driver from here and copy it to <DAS_HOME>/repository/components/lib. As with the earlier BAM-based APIM stat publishing and analysis, DAS does not create the table structure in the database automatically; you have to do it manually. Find the correct schema declaration script under the dbscripts folder and import it into the above database.
            e.g., use mysql.sql to create the schema in the above DB
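
    For instance, a sketch of the import from the command line, assuming the database and user created earlier and the mysql.sql script from the DB scripts download:

      # Import the APIM statistics schema into the stats DB.
      mysql -u db_username -p TestStatsDB < mysql.sql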
            
  4. Install the CApp

    DAS uses Spark SQL to analyze data. All the definitions of the data published from APIM, and the logic Spark should use to analyze it, are shipped to DAS as a .car file.

    • Download the CApp.
    • Start DAS using bin/wso2server.sh
    • Go to the DAS admin console and log in
    • Go to the Carbon Applications section under Manage and click Add
    • Point to the downloaded CApp and upload it (or hot-deploy it as sketched below).
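
    Alternatively, a sketch assuming the standard Carbon hot-deployment directory: the .car file can be dropped straight into the DAS deployment folder instead of uploading it through the console.

      # Hot-deploy the analytics CApp; DAS picks it up automatically.
      cp API_Manager_Analytics.car <DAS_HOME>/repository/deployment/server/carbonapps/
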
  5. Configure WSO2 API Manager
    • Start APIM
    • Go to the admin dashboard at https://localhost:9443/admin-dashboard/
    • Log in and click Configure Analytics under the Settings section
    • Select Enable; the settings to configure analytics will appear
    • Set the Event Receiver Configurations according to your DAS setup (see the example after this list)
    • Then click the Add URL group button to save it
    • Do not enter anything in the Data Analyzer Configurations; leave those fields empty
    • For the Statistics Summary Datasource, add the same DB configuration you used for the WSO2AM_STATS_DB datasource in DAS
    • Once done, click Save
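
    For example, assuming DAS runs on the same machine with the port offset of 1 set earlier and the default admin credentials, the Event Receiver Configurations would look like:

        Event Receiver URL : tcp://localhost:7612
        Username           : admin
        Password           : admin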

    Note: if you are using MySQL, copy the MySQL driver library to <AM_HOME>/repository/components/lib as well.

    DAS configuration overview

  6. Invoke a Sample API and Get the Statistics

    Let's invoke an API to generate traffic and see the statistics.

    Deploy the Sample Weather API

    Deploy the sample WeatherAPI by logging in to the APIM Publisher

    Sample Weather API

    Then log in to the Store and subscribe to the API you created

    Invoke the API using the Store's API Console or curl

    Invoke the API
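
    For example, a minimal curl sketch; the gateway port 8243 is the APIM HTTPS default, while the context path /weather/1.0.0 and the access token are placeholders to be taken from your own subscription:

      # Call the published API through the gateway with an OAuth bearer token.
      curl -k -H "Authorization: Bearer <access_token>" \
        https://localhost:8243/weather/1.0.0/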

    Then wait a few minutes (< 5 mins) for the analytics to be generated

    Then navigate to the Publisher's Statistics section and click API Usage

    WeatherAPI usage

  7. Data Purge (Optional)

    Data purging is one option for removing historical data in DAS. Since DAS does not allow you to delete table data or drop tables, this option is very important. With data purging, you can keep data analysis performing well without removing the analyzed summary data. Here we purge only the stream data fired by APIM, which is contained in the following tables.

        ORG_WSO2_APIMGT_STATISTICS_DESTINATION
        ORG_WSO2_APIMGT_STATISTICS_FAULT
        ORG_WSO2_APIMGT_STATISTICS_REQUEST
        ORG_WSO2_APIMGT_STATISTICS_RESPONSE
        ORG_WSO2_APIMGT_STATISTICS_WORKFLOW
        ORG_WSO2_APIMGT_STATISTICS_THROTTLE
            

    Make sure not to purge data from any tables other than the above; doing so will wipe out your summarized historical data. There are two ways to purge data in DAS.

    • Using the admin console
        • Go to the Data Explorer and select one of the above tables at a time.
        • Then click the Schedule Data Purge button.
        • Then set the time and the number of days of data you need to purge.
        • Repeat this for all of the above tables and wait for the data purging to run.

      Data Purge Dialog box

    • Global method

      Note that this will affect all the tenants

        • Open <DAS_HOME>/repository/conf/analytics/analytics-config.xml
        • Change the content of the <analytics-data-purging> tag as below
        <analytics-data-purging>
          <!-- Indicates whether purging is enabled. To enable data purging for a cluster, enable this property on all nodes -->
          <purging-enable>true</purging-enable>
          <cron-expression>0 0 12 * * ?</cron-expression>
          <!-- Tables to include in purging, as regex patterns. This pattern matches only the six APIM statistics tables listed above -->
          <purge-include-table-patterns>
            <table>ORG_WSO2_APIMGT_STATISTICS_.*</table>
          </purge-include-table-patterns>
          <!-- All records inserted before the retention period become eligible for purging -->
          <data-retention-days>365</data-retention-days>
        </analytics-data-purging>
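
      With this configuration, the purge task runs daily at 12:00 noon (the Quartz cron expression 0 0 12 * * ?) and removes records older than 365 days from the matched tables. Restart DAS for the change to take effect.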
      
