Analysis: GCP Analyze
Description
The /analysis/gcp resource is used to return a list of GCP optimization analyses currently in the Densify system.
An analysis is generated for each GCP project associated with the provided GCP service account used during data collection.
The /analysis/gcp/analyze resource is used to collect Google Cloud Platform infrastructure data and to initiate optimization analysis of the collected data. The following processes occur when the initial /analysis/gcp/analyze request is triggered (a request sketch follows this list):
- Set up and initiate data collection of the GCP projects based on the associated GCP service account and schedule it to run automatically on a nightly basis.
- Initiate analysis on the data collected using the default policy.
- Subsequent analysis is scheduled to run on a nightly basis after data collection.
- You can optionally configure the results to be sent to a webhook URI upon analysis completion. See Add webhook to an analysis for details.
- While data collection or analysis is in progress, you can check for status (using /analysis/gcp/<projectId>/status resource) or wait for the results to be published to an optional webhook URI.
- The reporting database update is scheduled to run automatically on a nightly basis after the completion of the analysis. This process produces reports for each instance recommendation, which is useful for analysts or application owners. These reports are only created after the scheduled analysis is completed, and may therefore only be available on the following day for a new analysis. Exact timing depends on the size of your environment.
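As an illustration of the initial request, here is a minimal sketch in Python, assuming a placeholder Densify instance URL and API token. The request-body field names shown (serviceAccountInfo, projectId, connectionName) are assumptions for illustration only; the literal field names are not shown in this document (see the parameter tables below).

```python
# A minimal sketch of the initial /analysis/gcp/analyze request, assuming
# a placeholder Densify instance and API token. Body field names are
# illustrative, not confirmed API names.
import json
import requests

BASE_URL = "https://<instance>.densify.com/api/v2"  # placeholder instance
HEADERS = {
    "Authorization": "Bearer <api-token>",          # placeholder token
    "Content-Type": "application/json",
}

# The GCP service account credential details are passed inline from the
# downloaded .JSON credential file.
with open("service-account.json") as f:
    credentials = json.load(f)

body = {
    "serviceAccountInfo": credentials,  # assumed field name
    "projectId": "my-gcp-project",      # one project is processed per request
    "connectionName": "gcp-prod",       # optional; limited to 32 characters
}

resp = requests.post(f"{BASE_URL}/analysis/gcp/analyze",
                     headers=HEADERS, json=body)
print(resp.status_code, resp.json())
```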
Ad-Hoc Tasks
Generally you do not need to run once-off tasks, as both data collection and analysis tasks are scheduled automatically. In cases where you need to make an ad-hoc request in addition to the scheduled job, this endpoint provides that functionality.
Historical Data Collection
When Densify initiates data collection, normally audits collect only the last 24 hours of data. You can optionally collect up to 30 days of historical data. The historical data provides a more representative set of data on which to base resizing and optimization recommendations. You can run an ad-hoc task to collect the historical data.
Note: Collection of historical data can take a significant amount of time, depending on the number of instances from which Densify is collecting data. Contact Support@Densify.com to enable historical data collection and for details of the performance impact.
The following settings define the range of historical data to be collected:
- Start date offset—This is the number of days from the 30-day maximum, used to define the start of the range.
- End date offset—This is the number of days from yesterday, used to define the end of the range.
These parameters allow you to reduce the number of days of historical data to be collected. If, for example, the daily audit has been running for a few days before the historical audit can be executed, you can set the end offset to exclude the days that have already been collected. Thirty days is the maximum number of days of historical data that you can collect for Azure and GCP environments.
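As a worked illustration of the offsets, the following sketch computes the resulting collection window, assuming the end offset is subtracted from yesterday's date and the start offset is subtracted from the 30-day maximum:

```python
# Worked illustration of the historical-audit offsets, assuming the end
# offset counts back from yesterday and the start offset counts in from
# the 30-day maximum.
from datetime import date, timedelta

MAX_HISTORY_DAYS = 30
start_day_offset = 10  # start 30 - 10 = 20 days before yesterday
end_day_offset = 5     # end 5 days before yesterday

yesterday = date.today() - timedelta(days=1)
range_start = yesterday - timedelta(days=MAX_HISTORY_DAYS - start_day_offset)
range_end = yesterday - timedelta(days=end_day_offset)

# With these values the window runs from 20 days before yesterday to
# 5 days before yesterday.
print(f"Collect history from {range_start} to {range_end}")
```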
A connection to the specified cloud account must already exist before you can run an ad hoc audit. When you execute an ad hoc refresh, an audit task is configured but a new connection is not created. If the cloud connection does not already exist and the API POST contains triggerAdhocAudit=true, then you will get an error message.
If there is more than one account associated with the specified account ID (i.e. a payer account with many linked accounts), the Densify API handles it in the same way that analyses are currently rerun using the POST operation.
Once the audit is complete, you need to rerun the associated analyses as indicated below, or you can wait for the next scheduled execution of the analyses and the reporting database update.
Analysis Update
You can make an ad-hoc request to refresh an existing analysis, outside of the scheduled nightly run, using /analysis/cloud/<aws|azure|gcp>/analyze. This manual, ad-hoc analysis request does not perform data collection or reporting database updates. It only runs the analysis on the existing collected data, with the following behavior (a request sketch follows this list):
- The analysis uses the policy that is configured for the analysis. Contact Support@Densify.com to change the configured policy.
- If a new webhook is provided, the analysis will send results to the new webhook URI. If no webhook is provided, the analysis will send results to the existing webhook, if configured.
- If the same analysis is already running, the request does not proceed and an appropriate message is returned.
- If the specified analysis has data collection scheduled within 30 minutes, the request does not proceed and an appropriate message is returned. For example, if data collection is scheduled to run at 12:05 AM, and you initiate a manual, ad hoc analyze request at 11:45 PM, then the analysis will not proceed and an error message is returned.
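A hedged sketch of such an ad-hoc refresh request follows; the body field names and webhook shape are assumptions for illustration, since the webhook parameters are documented separately:

```python
# Sketch of an ad-hoc analysis refresh for an existing analysis. No data
# collection or reporting database update is performed. Body field names
# and the webhook shape are illustrative only.
import requests

BASE_URL = "https://<instance>.densify.com/api/v2"  # placeholder
HEADERS = {
    "Authorization": "Bearer <api-token>",          # placeholder token
    "Content-Type": "application/json",
}

body = {
    "projectId": "my-gcp-project",  # assumed field name
    # Optional: a new webhook replaces the saved one and is also used by
    # subsequent scheduled analyses.
    "webhook": {"uri": "https://hooks.example.com/densify"},  # illustrative
}

resp = requests.post(f"{BASE_URL}/analysis/cloud/gcp/analyze",
                     headers=HEADERS, json=body)
# Expect an error message if the analysis is already running or data
# collection is scheduled within the next 30 minutes.
print(resp.status_code, resp.json())
```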
Prerequisite Configuration
Before you can collect GCP cloud infrastructure data in Densify, you need to create a GCP service account with the required services enabled and permissions configured.
Note: When using the Densify API, only one project is processed per API request, even if more than one project is associated with the service account.
Resource
Supported Operations
Table: GCP Analysis Supported Operations

| Operation | HTTP Method | Input | Output | Description |
|---|---|---|---|---|
| Create a new analysis | POST /analysis/gcp/analyze | Request body parameters (see the parameters table below) | | Use this resource to set up and initiate data collection for the specified GCP project, schedule data collection to run nightly, and initiate analysis of the collected data using the default policy. |
| Refresh an existing analysis | POST /analysis/gcp/analyze | Request body parameters (see the parameters table below) | | Re-runs an analysis that already exists. You can specify an updated policy and/or webhook to use for the analysis. Data collection is not run; it only occurs during the first /analyze request and is then scheduled to run nightly. The updated webhook is saved and used in the next scheduled analyses. You cannot initiate a request while data collection or analysis is in progress, or within 30 minutes of the time that these tasks are scheduled to run. |
| Run the historical audit | POST /analysis/cloud/gcp/analyze | Request body parameters (see the parameters table below) | | Re-runs an audit for which a connection and a daily, scheduled audit already exist. You can optionally specify the number of days of historical data to collect; if not specified, the previous 30 days from yesterday's date are collected. If you initiate an audit request while data collection or analysis is already running, or within 30 minutes of the time that these tasks are scheduled to run, the request fails and an error message is returned. |
| List all generated analyses | GET /analysis/cloud/gcp | Path parameter and request body parameters (see the tables below) | | Lists all analyses that have been created, with details. This operation is used to obtain the analysis ID that is required for other operations. |
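As an illustration, the list operation is the usual way to obtain the analysis ID needed by other operations. A minimal sketch, assuming a placeholder instance and token, and assuming the response body is a JSON array of analyses:

```python
# Sketch: list all generated GCP analyses to obtain the analysis ID.
import requests

BASE_URL = "https://<instance>.densify.com/api/v2"  # placeholder
HEADERS = {"Authorization": "Bearer <api-token>"}   # placeholder token

resp = requests.get(f"{BASE_URL}/analysis/cloud/gcp", headers=HEADERS)
for analysis in resp.json():
    # Exact element names are listed in the response elements table below.
    print(analysis)
```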
Table: GCP Analysis Path Parameters

| Parameter Name | Type | Description |
|---|---|---|
| | string | The unique referenced ID of the GCP analysis. |
Table: GCP Analysis Parameters

| Parameter Name | Type | Description |
|---|---|---|
| | string | The GCP service account credential details. You must specify the details from the .JSON credential file inline when making this request. |
| | string | The project ID for which to create the environment. You must provide the project ID even if the credential file is linked to multiple projects; a multi-project list is not derived from the credential file. |
| (optional) | string | The connection name clearly identifies this connection within Densify and appears in the Saved Connections list in the Densify UI. By default, the connection name is set to the GCP project. The connection name must be unique within the GCP connection type section; if the name is already in use, the request fails with an error message. The connection name can be used for filtering. Note: the connection name is limited to 32 characters. |
| (optional) | string | Historical data end day offset. This optional parameter configures the range of the historical audit. It is used in conjunction with the trigger ad-hoc audit and start day offset parameters to set the end day of the range of historical days of data to collect. If no value is specified and trigger ad-hoc audit is enabled, the end date is set to yesterday. If you specify any number other than 0, that number offsets the range's end date from yesterday; e.g., if the end day offset is 5 and yesterday was Dec 1, the end date is Nov 26. When AFv2 is enabled, this setting is not used and the end date is always yesterday; do not pass this parameter in your API call. |
| (optional) | string | Historical data start day offset. This optional parameter configures the historical audit. It is used in conjunction with the trigger ad-hoc audit and end day offset parameters to set the start day of the range of historical days of data to collect. If no value is specified and trigger ad-hoc audit is enabled, the start date is set to 30 days before yesterday's date. If you specify a number less than 30, that number offsets the start date from 30 days in the past; e.g., if the start day offset is 10, the start date is 20 days before yesterday. You can use these settings to define a smaller range (e.g., 20 days). You cannot collect more than 30 days of historical data. |
| (optional) | string | The flag to trigger an ad-hoc 30-day historical audit. This optional parameter runs the historical audit immediately, once only. Typically the historical audit runs first, when data collection is initiated for the specified subscription/project. If the historical audit has been disabled for performance reasons, it can be run once to collect the historical data as compute resources become available. A connection to the specified subscription or project must already exist before you can use this flag; a once-off task is configured but a new connection is not created. If the connection does not already exist and the API POST contains triggerAdhocAudit=true, an error message is returned. |
| (optional) | | The webhook definition for an external application. Optimization results are sent to the webhook-defined application when the analysis is complete. See Parameters for details of each parameter in the webhook definition. |
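To tie the historical-audit parameters together, here is a sketch of a request body; only triggerAdhocAudit appears verbatim in this document, and the other field names are assumptions for illustration:

```python
# Sketch of a historical-audit request body. A connection for the project
# must already exist. Only triggerAdhocAudit is confirmed by this document;
# the other field names are assumptions.
body = {
    "projectId": "my-gcp-project",  # assumed field name
    "triggerAdhocAudit": True,      # run the once-off historical audit
    "startDayOffset": 10,           # assumed name: start 20 days before yesterday
    "endDayOffset": 5,              # assumed name: end 5 days before yesterday
}
```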
Table: GCP Analysis Response Elements

| Element | Type | Filter/Sort | Description |
|---|---|---|---|
| | string | | The analysis name corresponds to the GCP project ID for the collected infrastructure data. An analysis is created for each project associated with the service account credential provided. |
| | string | | The reference to the recommendations of the analysis. See Analysis: GCP Recommendations for details on GCP analysis recommendations. |
| | string | | The date and time (in milliseconds) when the last analysis completed. |
| | string | | The Densify internal ID of the analysis entity. |
| | string | | The current phase of the specified analysis. |
| | string | | The message for the analysis status. For errors, the message corresponding to the returned status code. |
| | number | | The HTTP response code of the request. |
Examples
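The following end-to-end sketch triggers an analysis and then polls the per-project status resource described above; the instance URL, token, body field names, and status payload fields are illustrative assumptions:

```python
# End-to-end sketch: trigger the analysis, then poll
# /analysis/gcp/<projectId>/status until it completes. Status payload
# fields are illustrative, not confirmed element names.
import time
import requests

BASE_URL = "https://<instance>.densify.com/api/v2"  # placeholder
HEADERS = {
    "Authorization": "Bearer <api-token>",          # placeholder token
    "Content-Type": "application/json",
}
PROJECT_ID = "my-gcp-project"

resp = requests.post(f"{BASE_URL}/analysis/gcp/analyze",
                     headers=HEADERS,
                     json={"projectId": PROJECT_ID})  # plus credentials on first use
print("analyze:", resp.status_code, resp.json())

# Poll the status resource instead of, or as well as, waiting for the
# optional webhook callback.
for _ in range(60):
    status = requests.get(f"{BASE_URL}/analysis/gcp/{PROJECT_ID}/status",
                          headers=HEADERS).json()
    print("status:", status)
    if status.get("status") == 200:  # illustrative completion check
        break
    time.sleep(60)
```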