Analysis: Azure Analyze

Description

The /analysis/cloud/azure/analyze resource is used to collect Microsoft Azure infrastructure data and initiate optimization analysis with the collected data. The following processes occur when the first /analysis/cloud/azure/analyze request is triggered:

  1. Set up and initiate data collection of the specified Azure subscription and schedule it to run automatically on a nightly basis.
  2. Initiate analysis on the data collected using the default policy.
    • Subsequent analysis is scheduled to run on a nightly basis after the completion of data collection.
    • Optionally, you can configure the results to be sent to a webhook URI upon analysis completion. See Add webhook to an analysis for details.
  3. While data collection or analysis is in progress, you can check the status (using the /analysis/cloud/azure/<subscriptionId>/status resource, as shown in the sketch after this list) or wait for the results to be published to an optional webhook URI.

  4. The reporting database update is scheduled to run automatically on a nightly basis after the completion of the analysis. This process produces reports for each instance recommendation, which are useful for analysts or application owners. These reports are only created after the scheduled analysis is completed, and may therefore only be available on the following day for a new analysis. Exact timing depends on the size of your environment.
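
For example, a minimal status-check sketch in Python (using the requests library) might look like the following. The base URL and API token are placeholders that you would replace with your own Densify instance details, and the authentication scheme shown is an assumption:

import requests

BASE_URL = "https://<your-densify-instance>/api"      # placeholder: your Densify API base URL (assumption)
HEADERS = {"Authorization": "Bearer <api-token>"}     # placeholder: the auth scheme shown is an assumption

subscription_id = "00000000-0000-0000-0000-000000000000"   # the Azure subscription being collected

# Poll the status resource while data collection or analysis is in progress.
response = requests.get(
    f"{BASE_URL}/analysis/cloud/azure/{subscription_id}/status",
    headers=HEADERS,
)
response.raise_for_status()
print(response.json())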

The /analysis/cloud/azure resource is also used to return a list of the Microsoft Azure optimization analyses currently in Densify.

Ad-Hoc Tasks

Generally, you do not need to run one-off tasks, as both data collection and analysis are scheduled automatically. If you need to make an ad-hoc request in addition to the scheduled jobs, this endpoint provides that functionality.

Historical Data Collection

When Densify initiates data collection, normally audits collect only the last 24 hours of data. You can optionally collect up to 30 days of historical data. The historical data provides a more representative set of data on which to base resizing and optimization recommendations. You can run an ad-hoc task to collect the historical data.

Note:  Collection of historical data can take a significant amount of time, depending on the number of instances from which Densify is collecting data. Contact [email protected] to enable historical data collection and for details of the performance impact.

The following settings define the range of historical data to be collected:

  • Start date offset—This is the number of days from the 30-day maximum, used to define the start of the range.
  • End date offset—This is the number of days back from yesterday, used to define the end of the range of data collected.

These parameters allow you to reduce the number of days of historical data to be collected. If, for example, the daily audit has been running for a few days before the historical audit can be executed, you can set the end offset to exclude the days that have already been collected. Thirty days is the maximum number of days of historical data that you can collect for Azure and GCP environments.

A connection to the specified cloud account must already exist before you can run an ad-hoc audit. When you execute an ad-hoc refresh, an audit task is configured, but a new connection is not created. If the cloud connection does not already exist and the API POST contains triggerAdhocAudit=true, an error message is returned.

If there is more than one account associated with the specified account ID (i.e. a payer account with many linked accounts), the Densify API handles it in the same way that analyses are currently rerun using the POST operation.

Once the audit is complete, you can either rerun the associated analyses as indicated below, or wait for the next scheduled execution of the analyses and the reporting database update.
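
The following is a minimal sketch of such an ad-hoc historical audit request in Python, assuming the connection and daily scheduled audit already exist. The base URL and token are placeholders, and the offset values are illustrative only:

import requests

BASE_URL = "https://<your-densify-instance>/api"      # placeholder: your Densify API base URL (assumption)
HEADERS = {"Authorization": "Bearer <api-token>"}     # placeholder: the auth scheme shown is an assumption

# Trigger a one-off historical audit for a subscription that already has a
# connection and a scheduled daily audit. The offsets narrow the 30-day window.
payload = {
    "subscriptionId": "00000000-0000-0000-0000-000000000000",
    "triggerAdhocAudit": "true",
    "startDayOffset": "0",    # start at the 30-day maximum
    "endDayOffset": "5",      # exclude the most recent 5 days already collected by the daily audit
}

response = requests.post(f"{BASE_URL}/analysis/cloud/azure/analyze", json=payload, headers=HEADERS)
print(response.status_code, response.json())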

Analysis Update

You can make an ad-hoc request to refresh an existing analysis, outside of the scheduled nightly run, using /analysis/cloud/<aws|azure|gcp>/analyze. This manual, ad-hoc analysis request does not perform data collection or reporting database updates. It only runs the analysis on the existing collected data, with the following behavior:

  • The analysis uses the policy that is configured for the analysis. Contact [email protected] to change the configured policy.
  • If a new webhook is provided, the analysis will send results to the new webhook URI. If no webhook is provided, the analysis will send results to the existing webhook, if configured.
  • If the same analysis is already running, the request does not proceed and an appropriate message is returned.
  • If the specified analysis has data collection scheduled within 30 minutes, the request does not proceed and an appropriate message is returned. For example, if data collection is scheduled to run at 12:05 AM, and you initiate a manual, ad hoc analyze request at 11:45 PM, then the analysis will not proceed and an error message is returned.

Prerequisite Configuration

Before you can collect Azure cloud infrastructure data in Densify, you need to create a service principal and configure a secret key. See Microsoft Azure Data Collection Prerequisites for a Service Principal for details on creating and configuring the service principal.

Note:  When using the Densify API, only the Azure "Service Principal" can be used to connect to your Azure subscriptions.

If you are using the API, data collection and analysis are created and then refreshed daily on a per-subscription basis (1-to-1). You can associate many subscriptions with a service principal, but when using the API to initiate data collection, you must specify a subscription ID, and the audit and analysis are created for each subscription separately.

When using the Connection Wizard in the Densify UI, you do not need the subscription ID, as all subscriptions that are associated with the service principal are collected and listed once the connection has been verified. You can then select one or more of the subscriptions that you want to analyze (1-to-many). When using the Connection Wizard, data collection and analysis are created and then refreshed daily for all of the subscriptions that you selected when you created the connection.

Note:  When using the Densify API, only one subscription will be processed per API request. This is the case even if more than one subscription is associated with the service principal.
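
If you prefer to supply the credentials as a single file via the serviceAcctJSON parameter (see Parameters below), the following Python sketch builds such a file. The field names are assumptions based on the individual request body parameters described on this page, so confirm the expected format with [email protected]:

import json

# Hypothetical credentials file: field names are assumed from the request body
# parameters (subscriptionId, applicationID, secretKey, tenantID) described below.
service_account = {
    "subscriptionId": "00000000-0000-0000-0000-000000000000",
    "applicationID": "<app-registration-application-id>",
    "secretKey": "<client-secret>",
    "tenantID": "<azure-ad-tenant-id>",
}

with open("azure_service_account.json", "w") as credentials_file:
    json.dump(service_account, credentials_file, indent=2)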

Changing Credentials

When you need to change the credentials for the subscription or the service principal, you need to delete the data collection audit and recreate it. When you delete the audit, only the audit and all associated scheduler entries are removed, so you can recreate the audit with the new credentials and continue without any loss of data.

Resource

/analysis/cloud/azure/analyze

/analysis/cloud/azure

Supported Operations

Table: Azure Analysis Supported Operations

Operation

HTTP Method

Input

Output

Description

Run Azure data collection and analysis

POST /analysis/cloud/azure/analyze

Use this resource to:

  1. Collect Azure cloud data via API for the specified subscription.
  2. Run the analyses if data has been collected. An analysis is created for the specified subscription.
  3. (Optional) Send results to a webhook receiving application.
  4. Data collection and analysis processes are scheduled to run nightly, after the initial request.

Rerun Azure data analysis

POST /analysis/cloud/azure/analyze

This resource operation is used to re-run an analysis that already exists.

You can specify an updated policy and/or webhook to use for the analysis.

Data collection is not run. Data collection only occurs during the first /analyze request, and is then scheduled to run nightly.

The updated webhook is saved and will be used in the next scheduled analyses.

You cannot initiate a request if data collection or the analyses are in progress or within 30 minutes of the time that these tasks are scheduled to run.

Run the historical audit

POST /analysis/cloud/azure/analyze

Request Body Parameters: triggerAdhocAudit, with optional startDayOffset and endDayOffset (see Parameters below).

This resource operation is used to re-run an audit for which a connection and a daily scheduled audit already exist.

You can optionally specify the number of days of historical data to collect. If not specified, the previous 30 days, ending at yesterday's date, are collected.

If you initiate an audit request when data collection or analysis is already running or within 30 minutes of the time that these tasks are scheduled to run, then the request will fail and an error message is returned.

List all generated analyses

GET /analysis/cloud/azure/

Path Parameter:

  • N/A

Request Body Parameter:

  • N/A

Lists all analyses that have been created, with their details.

This resource operation is used to obtain the analysis ID that is required for other operations.

Parameters

Azure Path Parameters

Table: Azure Analysis Path Parameters

analysisId (string)

The unique referenced ID of the Azure analysis.

Azure Request Body Parameters

Table: Azure Analysis Parameters

subscriptionId (string)

The subscription ID of the Azure subscription from which to collect data.

Note: When using the Densify API, only one subscription will be processed per API request. This is the case even if more than one subscription is associated with the service principal.

If this is a new subscription ID (i.e. one that has not been audited before), the POST request triggers the initial 30-day audit of historical data, creates and enables the schedule for the daily audit, creates the corresponding cloud environment, and analyzes the data once data collection completes.

If this subscription ID has been audited and analyzed previously, the POST request triggers an ad-hoc environment analysis refresh.

applicationID (string)

The ID of the application created within your Active Directory. This application is associated with the user that creates it. It also contains the subscription ID for reference by other methods.

See Microsoft Azure Data Collection Prerequisites for a Service Principal for details on setting up the service principal.

secretKey (string)

The client secret for the above listed application.

See Microsoft Azure Data Collection Prerequisites for a Service Principal for details on setting up the service principal.

tenantID (string)

The tenant ID corresponding to your Azure Active Directory (AD).

See Microsoft Azure Data Collection Prerequisites for a Service Principal for details on setting up the service principal.

serviceAcctJSON (file)

The credentials and details as listed above, provided in a single JSON file.

To update the JSON file, refer to Analysis: Azure Analyze.

connectionName (optional, string)

Use the connection name to clearly identify this connection within Densify. This name will appear in the Saved Connections list in the Densify UI. By default, the connection name is set to the Subscription ID.

The connection name must be unique within the Azure connection type section, so if the name is already in use, the request fails with an error message.

This connection name can be used for filtering.

Note: The Connection Name is limited to 32 characters.

endDayOffset (optional, string)

Historical data end day offset.

This parameter is optional and is used to configure the range of the historical audit. It is used in conjunction with the triggerAdhocAudit and startDayOffset parameters to set the end day of the range of historical data to collect.

If no value is specified and triggerAdhocAudit is enabled, the end date is set to yesterday.

If you specify any number other than 0, that number is used to offset the range's end date from yesterday, e.g. if endDayOffset=5 and yesterday was Dec 1, then the end date will be Nov 25.

When AFv2 is enabled, this setting is not used and the end date is always "yesterday". Do not pass this parameter in your API call.

startDayOffset (optional, string)

Historical data start day offset.

This parameter is optional and is used to configure the historical audit. It is used in conjunction with the triggerAdhocAudit and endDayOffset parameters to set the start day of the range of historical data to collect.

If no value is specified and triggerAdhocAudit is enabled, the start date is set to 30 days before yesterday's date.

If you specify a number less than 30, that number is used to offset the start date from 30 days in the past, e.g. if startDayOffset=10 then the start date is offset by 10 days from the 30-day maximum, i.e. 20 days before yesterday.

You can use these settings to define a smaller range (e.g. 20 days). You cannot collect more than 30 days of historical data.

triggerAdhocAudit (optional, string)

The flag to trigger an ad-hoc 30-day historical audit.

This parameter is optional and used to run the historical audit immediately and once only.

Typically, the historical audit is run first, when data collection is initiated for the specified subscription/project. If the historical audit has been disabled for performance reasons, it can be run once to collect the historical data as compute resources become available.

A connection to the specified subscription or project must already exist before you can use this flag.

A one-off audit task is configured, but a new connection is not created. If the connection does not already exist and the API POST contains triggerAdhocAudit=true, an error message is returned.

webHook (optional): uri, authType, authValue

The webhook definition for an external application.

Optimization results are sent to the webhook-defined application when the analysis is complete. See Parameters for details of each parameter in the webhook definition.

Response

Table: Azure Analysis Response Elements

analysisName (string)

The analysis name corresponds to the Azure subscription ID for the collected infrastructure data. An analysis is created for each subscription associated with the service principal provided.

href (string)

The referenced resource to the recommendations of the analysis.

See Analysis: Azure Recommendations for details on Azure analysis recommendations.

Note: The Impact Analysis and Recommendation Report is not currently available for VM Scale Sets with a maximum size greater than 1.

completed (string)

The date and time (in milliseconds) when processing of the last analysis completed.

analysisId (string)

The Densify internal ID of the analysis entity.

phase (string)

The current phase of the specified analysis.

Possible phases include:

  • analyzing : <percent completed>%
  • not analyzing

message (string)

The message for the analysis status.

For errors, the message corresponding to the status response below is returned.

status (number)

The HTTP response code of the request. Possible status values include:

  • 200—success with request;
  • 400—invalid parameters;
  • 401—authentication failed;
  • 404—resource not found;
  • 500—internal server error.
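
As an illustration of how these elements might be consumed, the following Python sketch lists the generated analyses and prints a few of the response elements described above. The base URL and token are placeholders, and the exact response shape should be confirmed against your Densify instance:

import requests

BASE_URL = "https://<your-densify-instance>/api"      # placeholder: your Densify API base URL (assumption)
HEADERS = {"Authorization": "Bearer <api-token>"}     # placeholder: the auth scheme shown is an assumption

# List all generated Azure analyses and print selected response elements.
response = requests.get(f"{BASE_URL}/analysis/cloud/azure", headers=HEADERS)
response.raise_for_status()

for analysis in response.json():
    print(analysis.get("analysisName"), analysis.get("analysisId"), analysis.get("phase"))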

Examples

Example: Creating New Azure Data Collection and Analysis

The following example shows you how to initiate Azure data collection and analysis, and send the results to a webhook.
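
A hedged sketch of such a request in Python follows. The base URL, token, and all parameter values are placeholders, and the webhook authType and authValue values are illustrative only (see Parameters for the supported values):

import requests

BASE_URL = "https://<your-densify-instance>/api"      # placeholder: your Densify API base URL (assumption)
HEADERS = {"Authorization": "Bearer <api-token>"}     # placeholder: the auth scheme shown is an assumption

# First analyze request for a subscription: sets up data collection, schedules the
# nightly audit and analysis, and registers a webhook for the results.
payload = {
    "subscriptionId": "00000000-0000-0000-0000-000000000000",
    "applicationID": "<app-registration-application-id>",
    "secretKey": "<client-secret>",
    "tenantID": "<azure-ad-tenant-id>",
    "connectionName": "azure-production-subscription",
    "webHook": {
        "uri": "https://hooks.example.com/densify-results",
        "authType": "<auth-type>",      # placeholder: see the webhook parameter details
        "authValue": "<auth-value>",    # placeholder
    },
}

response = requests.post(f"{BASE_URL}/analysis/cloud/azure/analyze", json=payload, headers=HEADERS)
print(response.status_code, response.json())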

Example: Refreshing an Existing Analysis

The following example shows you how to run an ad-hoc analysis using the last set of collected data and the default policy, sending the results to a webhook.
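
A hedged sketch of the refresh request in Python follows, assuming only the subscription ID (and, optionally, a webhook) needs to be supplied because the connection and collected data already exist. The base URL and token are placeholders:

import requests

BASE_URL = "https://<your-densify-instance>/api"      # placeholder: your Densify API base URL (assumption)
HEADERS = {"Authorization": "Bearer <api-token>"}     # placeholder: the auth scheme shown is an assumption

# Re-run the analysis on the data already collected for this subscription.
# Data collection and the reporting database update are not triggered.
payload = {
    "subscriptionId": "00000000-0000-0000-0000-000000000000",
    "webHook": {
        "uri": "https://hooks.example.com/densify-results",
    },
}

response = requests.post(f"{BASE_URL}/analysis/cloud/azure/analyze", json=payload, headers=HEADERS)
print(response.status_code, response.json())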