Pivotal Greenplum Command Center 2.4.0 Release Notes
Greenplum Command Center version: 2.4.0
Greenplum Workload Manager version: 1.5.0
Published: September 2016
Pivotal Greenplum Command Center release 2.4.0 contains Greenplum Command Center release 2.4.0 and Greenplum Workload Manager release 1.5.0.
See About Greenplum Command Center 2.4.0 for information about features and improvements in the Command Center 2.4.0 release.
See About Greenplum Workload Manager Release 1.5.0 for information about features and improvements in the Workload Manager 1.5.0 release.
Greenplum Command Center may be installed on the following platforms:
- Red Hat Enterprise Linux 5 or 6, 64-bit
- CentOS 5 or 6, 64-bit
- Greenplum Database 4.3.x
Greenplum Workload Manager may be installed on the following platforms:
- Red Hat Enterprise Linux 5.5+ or 6, 64-bit
- CentOS 5.5+ or 6, 64-bit
- Greenplum Database 4.3.x
Greenplum Command Center includes:
- OpenSSL version 1.0.1o
- lighttpd web server version 1.4.35
Pivotal Greenplum Command Center and Pivotal Greenplum Workload Manager documentation is available on the Command Center documentation site at http://gpcc.docs.pivotal.io/.
Pivotal Greenplum Command Center is a management tool for Pivotal Greenplum Database. It monitors system performance metrics and system health, and provides administrators the ability to perform management tasks such as starting, stopping, and recovering Greenplum Database systems. Pivotal Greenplum Command Center is an interactive graphical web application that can be installed on the master host and used to view and interact with system data collected from Greenplum Database and, optionally, from the EMC Data Computing Appliance (DCA).
Greenplum Database is a requirement for operating Command Center because Command Center relies on information stored in the Greenplum gpperfmon database. Greenplum Database includes data collection agents that run on the Greenplum Database master host and each segment host. The agents collect data about queries and system utilization and send them to the Greenplum master host at regular intervals. Data stored in the gpperfmon database can be accessed through the Command Center web application or with SQL queries.
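As noted above, the data in the gpperfmon database can be queried directly with SQL. The following is a minimal sketch; connect with `psql -d gpperfmon` on the master host, and verify the `queries_history` column names against the gpperfmon reference for your release:

```sql
-- List the ten longest-running completed queries recorded in gpperfmon
-- (table and column names assumed from the gpperfmon schema):
SELECT username, db, tstart, tfinish,
       tfinish - tstart AS runtime, status
FROM   queries_history
ORDER  BY runtime DESC
LIMIT  10;
```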
Pivotal Greenplum Command Center is currently certified for the EMC Data Computing Appliance (DCA) and Greenplum Database software-only environments. Command Center monitors the following for each environment:
Greenplum Data Computing Appliance
- Greenplum Database Module 4.3.x
- Greenplum Data Integration Accelerator (DIA) Module
- Greenplum Data Computing Appliance Hardware (V1.2.x and V2.x)
If you have been using Greenplum’s earlier monitoring tool, Performance Monitor, with an older DCA release, we recommend you upgrade to a supported version of DCA.
Greenplum Database (Software-only Environments)
- Greenplum Database 4.3.x
Greenplum Command Center release 2.4.0 includes a beta release of the new Greenplum Command Center User Interface.
The GPCC installer has an option to include the beta UI (default yes), and the gpcmdr command has an option to run the beta (default yes).
The beta UI is installed by default on port 28090. It uses the same login credentials as the existing Command Center Console.
The beta Command Center Console includes the following improvements:
- A non-Flash interface—no dependency on a third-party browser plugin.
- Improved Active Queries list.
- Query Details view with SQL text and explain plan.
- Cluster Metrics view with aggregated charts across all hosts excluding master.
- Host Metrics view with detailed performance metrics for each server.
- Segment Status view with role and health-related information.
- Storage Status view with disk capacity information and history.
- History view with integrated cluster metrics and query list for user-definable timeframe.
- Each view has a unique URL. Navigating forward and backward in the browser works as expected and URLs can be bookmarked.
- URLs for individual query details can be shared and accessed directly.
See Known Issues in the Beta Command Center Console for a list of known issues in the beta UI.
The new Command Center UI is in active development. New features coming soon are:
- Multi-cluster dashboard
- Installation improvements
Note: Greenplum Command Center requires Adobe Flash Player version 11 or higher. If this requirement is not met, the Command Center Console displays a screen prompting you to install Flash Player.
Pivotal Greenplum Command Center is already installed on the DCA appliance (versions 1.2.x and 2.x).
For more information about setting up, upgrading, and configuring Pivotal Greenplum Command Center on an EMC Greenplum DCA, refer to the appropriate versions of the Greenplum Data Computing Appliance Software Upgrade Guide and Greenplum Data Computing Appliance Installation and Configuration Guide.
Instructions for installing, configuring, and upgrading your system for Pivotal Greenplum Command Center are provided in the latest Pivotal Greenplum Command Center 2.4 Administrator Guide.
For enhanced security, beginning with Greenplum Command Center release 2.1.0, the gpadmin user is not permitted to log in to the Command Center. The Command Center also does not accept logins from any user configured with trust authentication in the pg_hba.conf file on the host running GPCC.
You should create separate administrative, operator, and regular Command Center users. To create a new Command Center user, first create a Greenplum Database role, then edit the pg_hba.conf file to give that role access to Command Center. An example of this procedure is provided in the Greenplum Command Center Administrator Guide, and more detailed information can be found in the Greenplum Database Administration Guide.
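The procedure above might look like the following sketch. The role name cc_admin is an illustrative placeholder, and the pg_hba.conf entry should match your site's network and authentication policy:

```sql
-- As gpadmin, create a Greenplum Database role for the new Command Center user
-- (cc_admin is a placeholder name):
CREATE ROLE cc_admin WITH LOGIN PASSWORD 'changeme';

-- Then add an entry for the role to $MASTER_DATA_DIRECTORY/pg_hba.conf,
-- for example:
--   host  gpperfmon  cc_admin  127.0.0.1/28  md5
-- and reload the configuration with:
--   gpstop -u
```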
The following table lists issues that were resolved in Pivotal Greenplum Command Center.
| Issue | Description | Resolved In |
|---|---|---|
| — | Disk Usage is not appearing in the Dashboard or Administration tabs. | 2.2.0 |
This section lists the known issues in Pivotal Greenplum Command Center 2.4.0. A workaround is provided where applicable.
Heavy workload on the Greenplum Database can affect responsiveness.
The refresh rate of the Command Center user interface can be adversely affected by simultaneous heavy workload on the underlying Greenplum Database.
| Issue | Description |
|---|---|
| PT-87736724 | The Command Center installer does not support upgrading existing Command Center instances when upgrading to a new release. |
In Beta Query Details:
- When viewing a running query that completes with a total runtime under 20 seconds, on the next screen refresh the query is not found, because it is below the threshold execution time for queries to be written to the gpperfmon database.
In Beta Active and Recent Queries:
- Canceled queries show a status of “Canceling” until the next 15-second data refresh, even if the cancel completed sooner.
- There is a delay of several seconds after a query finishes before it is displayed in the Recent Queries list.
- The refresh timer may not begin counting down until the first data refresh.
Greenplum Workload Manager 1.5.0 is included in the Pivotal Greenplum Command Center 2.4.0 package, which may be downloaded from http://network.pivotal.io.
Greenplum Workload Manager services collect query execution data and real-time system statistics on each segment host. The Workload Manager rules engine allows you to create rules that specify criteria that trigger an action, for example terminating a query that runs longer than a specified time or consumes too many resources on a segment host.
Real-time query performance can be viewed in the Workload Manager gptop curses-based GUI.
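A rule of the kind described above might look like the following sketch. The action and datum names here (host:pg_terminate_backend, session_id:host:pid:runtime) are assumptions; check the rules reference in the Greenplum Workload Manager User Guide for the exact names in your release:

```
host:pg_terminate_backend() when session_id:host:pid:runtime > 120
```

This would terminate any query that has been running longer than 120 seconds on a segment host.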
The following are enhancements and changes in Greenplum Workload Manager 1.5.0.
Removed Option to Install Workload Manager with gpcmdr Command
The Greenplum Command Center 2.4.0 gpcmdr --setup utility no longer offers to install Greenplum Workload Manager. Workload Manager must be installed using its own installer, gp-wlm.bin, which is located in the Command Center installation directory. See Installing Greenplum Workload Manager for installation instructions.
Connections and Sessions
By default, Workload Manager now publishes information about idle Greenplum Database sessions. An idle session represents a client connection without an active or queued query.
To create a rule that is triggered by idle sessions, compare session_id:host:pid:current_query to "<IDLE>" in the when clause. For example, the following rule records a message when an idle psql session is detected:
gpdb_record(message="Idle PSQL session") when host:pid:current_query = "<IDLE>" and host:pid:name = "psql"
In Workload Manager 1.0 through 1.4, information about idle sessions was never published, so they could not be detected with a rule. With 1.5.0, idle sessions are published by default.
You can revert to the previous behavior by setting the following configuration values to false:

| Plugin | Setting | Description |
|---|---|---|
| gpdb_stats | publish_idle_sessions | Publish information about idle Greenplum Database sessions |
| systemdata | publish_idle_processes | Publish information about idle GPDB processes |
In addition to the new publish_idle_sessions settings in the table above, the following new configuration setting is available in the systemdata plugin:

| Plugin | Setting | Description |
|---|---|---|
| systemdata | logging:log_level | Configure logging verbosity for the systemdata plugin |
See Configuring Workload Manager Components in the Greenplum Workload Manager User Guide for information about setting configuration values.
Including Additional Datums in Action Scope
Rules can now be created or modified to include datums in the action scope in addition to the datums that trigger the rule action. Previously, only values of datums present in the conditional expression that triggered the rule were captured when the rule was triggered.
In Workload Manager 1.5.0, additional datums specified using the including keyword are also captured. The context columns in the gp_wlm_record table or gp_wlm_events view contain values for the datums that triggered the rule as well as values for datums specified using the including keyword.
See Adding Rules in the Greenplum Workload Manager User Guide for information about this new feature.
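Building on the idle-session example earlier in these notes, a rule using the new keyword might look like the following sketch; the host:pid:usename datum is an illustrative assumption:

```
gpdb_record(message="Idle PSQL session") when host:pid:current_query = "<IDLE>" and host:pid:name = "psql" including host:pid:usename
```

When this rule fires, the context columns of gp_wlm_record would then include the session's user name in addition to the datums in the when clause.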
Application Name Field is Set for Workload Manager GPDB Sessions
Greenplum Database sessions initiated by Workload Manager components now set the application_name field to 'gp-wlm'. This field is available in the pg_stat_activity system view.
The Greenplum Workload Manager installer is in the Greenplum Command Center home directory. See “Installing Greenplum Workload Manager” in the Greenplum Workload Manager User Guide for command syntax and usage instructions.
The following are enhancements and changes in Greenplum Workload Manager 1.4.0:
New Features and Improvements
- A new host:pg_cancel_backend action is added. This action calls the PostgreSQL pg_cancel_backend() function, which sends a SIGINT signal to the backend process, cancelling the current query. This differs from the pg_terminate_backend() function, which sends a SIGTERM signal and terminates the session. host:pg_cancel_backend events are logged to the gp_wlm_events view.
- A new set of datums is added to provide Greenplum segment virtual memory (vmem) statistics.
- The timing values that determine how frequently agents publish datums and rules are evaluated have been changed from integer to float values, allowing sub-second times to be specified.
- A new configuration manager allows viewing, describing, and modifying user-settable configuration values for Workload Manager components. In release 1.4.0, the configuration manager supports viewing, describing, and setting various configuration values for Workload Manager.
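The distinction between cancelling a query and terminating a session, described in the first item above, can be seen directly in SQL using the standard PostgreSQL administration functions (the pid value here is a placeholder for a real backend process ID):

```sql
-- Cancel only the current query on backend 12345 (sends SIGINT);
-- the session remains connected:
SELECT pg_cancel_backend(12345);

-- Terminate the entire session for backend 12345 (sends SIGTERM):
SELECT pg_terminate_backend(12345);
```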
Fixed a rare bug where some queries could not be ruled upon if the session ID of the query was not present in the list_backend_priorities() recordset.
In certain environments, environment variables were not being set correctly.
This section lists the known issues in Pivotal Greenplum Workload Manager. A work-around is provided where applicable.
|—||In rare cases it is possible for installations or upgrades to fail at the