Pivotal Greenplum® Command Center 3.0.1 Release Notes
- About This Release
- Supported Platforms
- Pivotal Documentation
- Greenplum Command Center 3.0.1
- Greenplum Workload Manager 1.6.0
- About Pivotal Greenplum Workload Manager
- Enhancements and Changes in Greenplum Workload Manager Release 1.6.0
- Enhancements and Changes in Greenplum Workload Manager Release 1.5.0
- Greenplum Workload Manager Installation and Upgrade Instructions
- Enhancements in Greenplum Workload Manager Release 1.4.0
- Greenplum Workload Manager Known Issues
Greenplum Command Center version: 3.0.1
Greenplum Workload Manager version: 1.6.0
Published: November 2016
About This Release
Pivotal Greenplum Command Center release 3.0.1 contains Greenplum Command Center release 3.0.1 and Greenplum Workload Manager release 1.6.0.
Greenplum Command Center release 3.0.1 addresses the following issues in Greenplum Command Center release 3.0.0:
- Hosts with hostnames changed since the initial Greenplum Database installation not appearing in host and cluster metrics
- Manual 4-digit entry into History time fields
- Temporary display of zero value for cluster metrics when under high load
- Count and sort in History queries grid
- `gpcmdr --restart` not reliably bringing gpmonws back up
- `gpcmdr --version`
- History charts can cut off edges of timespan
See About Greenplum Workload Manager Release 1.6.0 for information about features and improvements in the Workload Manager 1.6.0 release.
Supported Platforms
Greenplum Command Center may be installed on the following platforms:
- Red Hat Enterprise Linux 5, 6, or 7, 64-bit
- CentOS 5, 6, or 7, 64-bit
- SUSE Linux Enterprise 11 SP4, 64-bit
- Greenplum Database 4.3.x
Greenplum Workload Manager may be installed on the following platforms:
- Red Hat Enterprise Linux 5.5+ or 6, 64-bit
- CentOS 5.5+ or 6, 64-bit
- Greenplum Database 4.3.x
Greenplum Command Center includes:
- OpenSSL version 1.0.1o.
Pivotal Documentation
Pivotal Greenplum Command Center and Pivotal Greenplum Workload Manager documentation is available on the Pivotal Documentation site at http://gpcc.docs.pivotal.io/.
Greenplum Command Center 3.0.1
The Greenplum Command Center 3.0.1 release contains the Command Center 3.0.1 software release and the Greenplum Workload Manager 1.6.0 release.
About Pivotal Greenplum Command Center
Pivotal Greenplum Command Center (GPCC) is a management tool for Pivotal Greenplum Database. GPCC monitors query activity and system performance metrics. Pivotal Greenplum Command Center is an interactive graphical web application that can be installed on the master host and used to view and interact with the collected system data from Greenplum Database.
Greenplum Database is a requirement for operating Command Center because Command Center relies on information stored in the Greenplum gpperfmon database. Greenplum Database includes data collection agents that run on the Greenplum Database master host and each segment host. The agents collect data about queries and system utilization and send them to the Greenplum master host at regular intervals. Data stored in the gpperfmon database can be accessed through the Command Center web application or with SQL queries.
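For example, recent query history can be inspected directly with SQL. This is a minimal sketch assuming the standard gpperfmon `queries_history` table and its usual columns; verify the column names against your installation.

```sql
-- Run against the gpperfmon database on the Greenplum master.
-- Lists the ten most recently finished queries.
SELECT username, db, tsubmit, tfinish, status
FROM queries_history
ORDER BY tfinish DESC
LIMIT 10;
```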
Greenplum Command Center Compatibility
Pivotal Greenplum Command Center is currently certified for the EMC Data Computing Appliance (DCA) and Greenplum Database software-only environments. Command Center monitors the following for each environment:
Greenplum Data Computing Appliance
- Greenplum Database Module 4.3.x
If you have been using Greenplum’s earlier monitoring tool, Performance Monitor, with an older DCA release, we recommend you upgrade to a supported version of DCA.
Greenplum Database (Software-only Environments)
- Greenplum Database 4.3.x
Command Center Installation and Upgrade Instructions
There is now an upgrade path for all 1.x, 2.x, and 3.x versions. See `gpcmdr --migrate` for details.
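A migration run might look like the following sketch; the interactive details are an assumption, so consult the Administrator Guide for the exact procedure.

```sh
# Run as gpadmin on the Greenplum master host. The utility locates
# instances created by an earlier Command Center release and recreates
# them for the new release.
$ gpcmdr --migrate
```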
DCA and Greenplum Database Software-only
Instructions for installing, configuring, and upgrading your system for Pivotal Greenplum Command Center are provided in the latest Pivotal Greenplum Command Center 3.0 Administrator Guide.
Accessing Command Center
For enhanced security, beginning with Greenplum Command Center release 1.3.0.1, the `gpadmin` user is not permitted to log in to the Command Center. The Command Center also rejects logins from any user configured with `trust` authentication in the `pg_hba.conf` file on the host running GPCC.
You should create new administrative, operator, and regular Command Center users. To create a new Command Center user, first create a Greenplum Database role, then edit the `pg_hba.conf` file to give that role access to Command Center. An example of this procedure is provided in the Greenplum Command Center Administrator Guide, and more detailed information can be found in the Greenplum Database Administration Guide.
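As a minimal sketch, the procedure might look like the steps below. The role name `cc_admin` and the `pg_hba.conf` entry are illustrative assumptions; follow the Administrator Guide for the exact steps.

```sql
-- 1. In Greenplum Database, create a login role (name is illustrative):
CREATE ROLE cc_admin WITH LOGIN PASSWORD 'changeme';

-- 2. In pg_hba.conf on the master, give the role password-based access
--    (the entry below is an assumption; adjust database, address, method):
--      host  gpperfmon  cc_admin  127.0.0.1/28  md5

-- 3. Reload the server configuration:
--      $ gpstop -u
```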
Greenplum Command Center Resolved Issues
The following table lists issues that were resolved in Pivotal Greenplum Command Center 3.x.
| Issue | Description | Fixed In |
|---|---|---|
| ZD-39405 | Systems with segment host names that changed since Greenplum Database initialization were not shown in cluster metrics. | 3.0.1 |
| CMDR-1999 | Failed to start an instance without PGHOST and PGPORT set. | 3.0.0 |
| CMDR-1910 | GPCC hangs on “Loading” because the request to “/api” does not respond. | 3.0.0 |
| CMDR-1996 | Orphaned instances after upgrade. | 3.0.0 |
| PT-87736724 | The Command Center installer did not support upgrading existing Command Center instances when upgrading to a new release. | 3.0.0 |
Greenplum Command Center Known Issues
This section lists the known issues in Pivotal Greenplum Command Center 3.x.
| Issue | Description |
|---|---|
| CMDR-297 | Heavy workload on the underlying Greenplum Database can adversely affect the refresh rate and responsiveness of the Command Center user interface. |
Greenplum Workload Manager 1.6.0
Greenplum Workload Manager 1.6.0 is included in the Pivotal Greenplum Command Center 3.0.1 package. Download the Greenplum Command Center installer distribution from http://network.pivotal.io.
About Pivotal Greenplum Workload Manager
Greenplum Workload Manager services collect query execution data and real-time system statistics on each segment host. The Workload Manager rules engine allows you to create rules that specify criteria that trigger an action, for example terminating a query that runs longer than a specified time or consumes too many resources on a segment host.
Real-time query performance can be viewed in the Workload Manager `gptop` curses-based GUI.
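For example, a rule pairs an action with a when clause built from published datums. In the sketch below, the `host:pg_terminate_backend()` action name follows the action naming used elsewhere in these notes, while the `session_id:host:pid:runtime` datum is an assumption; the Workload Manager User Guide documents the actual datums and rule syntax.

```
host:pg_terminate_backend()
when session_id:host:pid:runtime > 120
```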
Enhancements and Changes in Greenplum Workload Manager Release 1.6.0
The following are enhancements and changes in Greenplum Workload Manager 1.6.0.
Improved Hostname Handling
During startup, the runtime framework detects and ignores duplicate hostnames. Duplicate hostnames are hostnames that resolve to the same machine.
Improved Response to Cancel Query
The time Workload Manager takes to respond to a query cancellation request has been reduced, so the request is delivered to Greenplum Database without delay.
Enhancements and Changes in Greenplum Workload Manager Release 1.5.0
The following are enhancements and changes in Greenplum Workload Manager 1.5.0.
Removed Option to Install Workload Manager with gpcmdr Command
The Greenplum Command Center 2.4.0 `gpcmdr --setup` utility no longer offers to install Greenplum Workload Manager. Workload Manager must be installed using its own installer, `gp-wlm.bin`, which is located in the Command Center installation directory. See Installing Greenplum Workload Manager for installation instructions.
Publishing Connections and Sessions
By default, Workload Manager now publishes information about idle Greenplum Database sessions. An idle session represents a client connection without an active or queued query.
To create a rule that is triggered by idle sessions, compare `session_id:host:pid:current_query` to `"<IDLE>"` in the when clause. For example, the following rule records a message when an idle `psql` session is detected.

```
gpdb_record(message="Idle PSQL session")
when host:pid:current_query = "<IDLE>" and host:pid:name = "psql"
```
In Workload Manager 1.0 through 1.4, information about idle sessions was never published, so they could not be detected with a rule. With 1.5.0, idle sessions are published by default.
You can revert to the previous behavior by setting the following configuration settings to `false`.
| Module | Setting | Description |
|---|---|---|
| gpdb_stats | publish_idle_sessions | Publish information about idle Greenplum Database sessions |
| systemdata | publish_idle_processes | Publish information about idle GPDB processes |
In addition to the new `publish_idle_sessions` and `publish_idle_processes` settings in the table above, the following new configuration setting is available in the `gp-wlm` CLI.
| Module | Setting | Description |
|---|---|---|
| systemdata | logging:log_level | Configure logging verbosity for the systemdata plugin |
See Configuring Workload Manager Components in the Greenplum Workload Manager User Guide for information about setting configuration values.
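As a sketch, reverting to the pre-1.5.0 behavior might look like the session below. The `config modify` command shown is hypothetical; see Configuring Workload Manager Components for the actual command names.

```
# Hypothetical gp-wlm session; exact commands may differ.
gp-wlm
> config modify gpdb_stats:publish_idle_sessions=false
> config modify systemdata:publish_idle_processes=false
```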
Including Additional Datums in Action Scope
Rules can now be created or modified to include datums in the action scope in addition to the datums that trigger the rule action. Previously, only values of datums present in the conditional expression that triggered the rule were captured when the rule was triggered.
In Workload Manager 1.5.0, additional datums specified using the `including` keyword are also captured. The context columns in the `gp_wlm_record` table or `gp_wlm_events` view will contain values for the datums that triggered the rule as well as values for datums specified with the `including` keyword.
See Adding Rules in the Greenplum Workload Manager User Guide for information about this new feature.
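For illustration, a rule might capture an extra datum alongside its trigger, as in this sketch; the placement of the `including` clause is an assumption based on the description above, so see Adding Rules for the exact grammar.

```
gpdb_record(message="Idle PSQL session")
when host:pid:current_query = "<IDLE>"
including host:pid:name
```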
Application Name Field is Set for Workload Manager GPDB Sessions
Greenplum Database sessions initiated by Workload Manager components now set the `application_name` field to `gp-wlm`. This field is available in the `pg_stat_activity` system view.
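For example, sessions opened by Workload Manager components can be listed with a query like this sketch; `procpid` is the process ID column in the PostgreSQL 8.2-era catalog that Greenplum Database 4.3 is based on, so verify it against your catalog.

```sql
-- List Greenplum Database sessions opened by Workload Manager.
SELECT procpid, usename, application_name
FROM pg_stat_activity
WHERE application_name = 'gp-wlm';
```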
Greenplum Workload Manager Installation and Upgrade Instructions
The Greenplum Workload Manager installer is in the Greenplum Command Center home directory. See “Installing Greenplum Workload Manager” in the Greenplum Workload Manager User Guide for command syntax and usage instructions.
Enhancements in Greenplum Workload Manager Release 1.4.0
The following are enhancements and changes in Greenplum Workload Manager 1.4.0:
New Features and Improvements
- A `host:pg_cancel_backend` action is added. This action calls the PostgreSQL `pg_cancel_backend()` function, which sends a SIGINT signal to the backend process, cancelling the current query. This differs from the `pg_terminate_backend()` function, which sends a SIGTERM signal and terminates the session (see the SQL sketch following this list). `host:pg_cancel_backend` events are logged to the `wlm_event_log`.
- A new set of datums is added to provide Greenplum segment virtual memory (vmem) statistics.
- The timing values that determine how frequently agents publish datums and how frequently rules are evaluated have been changed from integer to float values, allowing sub-second times to be specified.
- A new configuration manager allows viewing, describing, and modifying user-settable configuration values for Workload Manager components.
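The difference between the two signals can be demonstrated directly in SQL, since `pg_cancel_backend()` and `pg_terminate_backend()` are standard PostgreSQL functions (the pid shown is a placeholder):

```sql
SELECT pg_cancel_backend(12345);    -- SIGINT: cancels the current query; the session survives
SELECT pg_terminate_backend(12345); -- SIGTERM: terminates the entire session
```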
Fixes
Fixed a rare bug where some queries could not be ruled upon if the session ID of the query was not present in the `list_backend_priorities()` recordset.
In certain environments, environment variables were not being set correctly.
Greenplum Workload Manager Known Issues
This section lists the known issues in Pivotal Greenplum Workload Manager. A work-around is provided where applicable.
| Issue | Description |
|---|---|
| — | In rare cases, an installation or upgrade can fail at the cluster-health-check stage. If the cluster is not healthy, re-run the Workload Manager installer with the `--force` option. See “Installing Greenplum Workload Manager” in the Pivotal Greenplum Workload Manager User Guide for instructions on running the installer at the command line. |