This document supports the version of each product listed and
supports all subsequent versions until the document is
replaced by a new edition. To check for more recent editions of
this document, see http://www.vmware.com/support/pubs.
EN-002055-00
vRealize Operations Manager Installation and Configuration Guide for Linux and Windows
You can find the most up-to-date technical documentation on the VMware Web site at:
http://www.vmware.com/support/
The VMware Web site also provides the latest product updates.
If you have comments about this documentation, submit your feedback to:
About Logging In to vRealize Operations Manager 87
Uninstall vRealize Operations Manager from Linux 88
vRealize Operations Manager Uninstallation from Windows Server 89
The Customer Experience Improvement Program 90
Join or Leave the Customer Experience Improvement Program for vRealize Operations Manager 90
9 Updating Your Software 91
Obtain the Software Update PAK File 91
Create a Snapshot as Part of an Update 92
Install a Software Update 92
Index 95
About Installation and Configuration for Linux and Windows
The vRealize Operations Manager Installation and Configuration Guide for Linux and Windows provides information about installing VMware® vRealize Operations Manager on the Linux or Windows operating system, including how to create and configure the vRealize Operations Manager cluster.
The vRealize Operations Manager installation process consists of running the vRealize Operations Manager Enterprise installer on each cluster node, and accessing the product to finish setting up the application.
Intended Audience
This information is intended for anyone who wants to install and configure vRealize Operations Manager on Linux or Windows machines. The information is written for experienced Linux or Windows system administrators who are familiar with enterprise management applications and datacenter operations.
VMware Technical Publications Glossary
VMware Technical Publications provides a glossary of terms that might be unfamiliar to you. For definitions of terms as they are used in VMware technical documentation, go to http://www.vmware.com/support/pubs.
Preparing for vRealize Operations Manager Installation 1
You prepare for vRealize Operations Manager installation by evaluating your environment and deploying
enough vRealize Operations Manager cluster nodes to support how you want to use the product.
This chapter includes the following topics:
- “About vRealize Operations Manager Linux and Windows Installation,” on page 8
- “Complexity of Your Environment,” on page 9
- “vRealize Operations Manager Cluster Nodes,” on page 11
- “Using IPv6 with vRealize Operations Manager,” on page 14
- “Sizing the vRealize Operations Manager Cluster,” on page 15
- “Custom vRealize Operations Manager Certificates,” on page 16
- “How vRealize Operations Manager Uses Network Ports,” on page 19
- “vRealize Operations Manager Platform Requirements for Linux,” on page 21
- “Create a Node by Running the vRealize Operations Manager Linux Installer,” on page 23
- “vRealize Operations Manager Platform Requirements for Windows,” on page 25
- “Create a Node by Running the vRealize Operations Manager Windows Installer,” on page 26
Figure: Installation workflow. Run the installer to create the master node, and optionally run it again for master replica, data, or remote collector nodes. Run the master node setup, then optionally enable the master replica and run the data node and remote collector node setups. Log in to the product for the first time; select, license, and upload solutions; configure solutions and monitoring policies; join or decline the Customer Experience Improvement Program; optionally add more solutions; and monitor your environment.
About vRealize Operations Manager Linux and Windows Installation
The vRealize Operations Manager installation process consists of running the vRealize Operations Manager Enterprise installer on each cluster node, accessing the product to set up cluster nodes according to their role, and logging in to configure the installation.
Complexity of Your Environment
When you deploy vRealize Operations Manager, the number and nature of the objects that you want to monitor might be complex enough to recommend a Professional Services engagement.
Complexity Levels
Every enterprise is different in terms of the systems that are present and the level of experience of deployment personnel. The following table presents a color-coded guide to help you determine where you are on the complexity scale.
- Green: Your installation only includes conditions that most users can understand and work with, without assistance. Continue your deployment.
- Yellow: Your installation includes conditions that might justify help with your deployment, depending on your level of experience. Consult your account representative before proceeding, and discuss using Professional Services.
- Red: Your installation includes conditions that strongly recommend a Professional Services engagement. Consult your account representative before proceeding, and discuss using Professional Services.
Note that these color-coded levels are not firm rules. Your product experience, which increases as you work with vRealize Operations Manager and in partnership with Professional Services, must be taken into account when deploying vRealize Operations Manager.
Table 1-1. Effect of Deployment Conditions on Complexity
Complexity Level | Current or New Deployment Condition | Additional Notes
Green | You run only one vRealize Operations Manager deployment. | Lone instances are usually easy to create in vRealize Operations Manager.
Green | Your deployment includes a management pack that is listed as Green according to the compatibility guide on the VMware Solutions Exchange Web site. | The compatibility guide indicates whether the supported management pack for vRealize Operations Manager is a compatible 5.x one or a new one designed for this release. In some cases, both might work but produce different results. Regardless, users might need help in adjusting their configuration so that associated data, dashboards, alerts, and so on appear as expected. Note that the terms solution, management pack, adapter, and plug-in are used somewhat interchangeably.
Yellow | You run multiple instances of vRealize Operations Manager. | Multiple instances are typically used to address scaling or operator use patterns.
Table 1-1. Effect of Deployment Conditions on Complexity (Continued)
Complexity Level | Current or New Deployment Condition | Additional Notes
Yellow | Your deployment includes a management pack that is listed as Yellow according to the compatibility guide on the VMware Solutions Exchange Web site. | The compatibility guide indicates whether the supported management pack for vRealize Operations Manager is a compatible 5.x one or a new one designed for this release. In some cases, both might work but produce different results. Regardless, users might need help in adjusting their configuration so that associated data, dashboards, alerts, and so on appear as expected.
Yellow | You are deploying vRealize Operations Manager remote collector nodes. | Remote collector nodes gather data but leave the storage and processing of the data to the analytics cluster.
Yellow | You are deploying a multiple-node vRealize Operations Manager cluster. | Multiple nodes are typically used for scaling out the monitoring capability of vRealize Operations Manager.
Yellow | Your new vRealize Operations Manager instance will include a Linux or Windows based deployment. | Linux and Windows deployments are not as common as vApp deployments and often need special consideration.
Yellow | Your vRealize Operations Manager instance will use high availability (HA). | High availability and its node failover capability is a unique multiple-node feature that you might want additional help in understanding.
Yellow | You want help in understanding the new or changed features in vRealize Operations Manager and how to use them in your environment. | vRealize Operations Manager is different than vCenter Operations Manager in areas such as policies, alerts, compliance, custom reporting, or badges. In addition, vRealize Operations Manager uses one consolidated interface.
Red | You run multiple instances of vRealize Operations Manager, where at least one includes virtual desktop infrastructure (VDI). | Multiple instances are typically used to address scaling, operator use patterns, or because separate VDI (V4V monitoring) and non-VDI instances are needed.
Red | Your deployment includes a management pack that is listed as Red according to the compatibility guide on the VMware Solutions Exchange Web site. | The compatibility guide indicates whether the supported management pack for vRealize Operations Manager is a compatible 5.x one or a new one designed for this release. In some cases, both might work but produce different results. Regardless, users might need help in adjusting their configuration so that associated data, dashboards, alerts, and so on appear as expected.
Red | You are deploying multiple vRealize Operations Manager clusters. | Multiple clusters are typically used to isolate business operations or functions.
Table 1-1. Effect of Deployment Conditions on Complexity (Continued)
Complexity Level | Current or New Deployment Condition | Additional Notes
Red | Your current vRealize Operations Manager deployment required a Professional Services engagement to install it. | If your environment was complex enough to justify a Professional Services engagement in the previous version, it is possible that the same conditions still apply and might warrant a similar engagement for this version.
Red | Professional Services customized your vRealize Operations Manager deployment. Examples of customization include special integrations, scripting, nonstandard configurations, multiple level alerting, or custom reporting. | If your environment was complex enough to justify a Professional Services engagement in the previous version, it is possible that the same conditions still apply and might warrant a similar engagement for this version.

vRealize Operations Manager Cluster Nodes
All vRealize Operations Manager clusters consist of a master node, an optional replica node for high availability, optional data nodes, and optional remote collector nodes.
When you install vRealize Operations Manager, you use a vRealize Operations Manager vApp deployment,
Linux installer, or Windows installer to create role-less nodes. After the nodes are created and have their
names and IP addresses, you use an administration interface to configure them according to their role.
You can create role-less nodes all at once or as needed. A common as-needed practice might be to add nodes
to scale out vRealize Operations Manager to monitor an environment as the environment grows larger.
The following node types make up the vRealize Operations Manager analytics cluster:
Master Node
The initial, required node in vRealize Operations Manager. All other nodes
are managed by the master node.
In a single-node installation, the master node manages itself, has adapters
installed on it, and performs all data collection and analysis.
Data Node
In larger deployments, additional data nodes have adapters installed and
perform collection and analysis.
Larger deployments usually include adapters only on the data nodes so that
master and replica node resources can be dedicated to cluster management.
Replica Node
To use vRealize Operations Manager high availability (HA), the cluster
requires that you convert a data node into a replica of the master node.
The following node type is a member of the vRealize Operations Manager cluster but not part of the
analytics cluster:
Remote Collector Node
Distributed deployments might require a remote collector node that can navigate firewalls, interface with a remote data source, reduce bandwidth across data centers, or reduce the load on the vRealize Operations Manager analytics cluster. Remote collectors only gather objects for the inventory, without storing data or performing analysis. In addition, remote collector nodes may be installed on a different operating system than the rest of the cluster.
General vRealize Operations Manager Cluster Node Requirements
When you create the cluster nodes that make up vRealize Operations Manager, you have general
requirements that you must meet.
General Requirements
- vRealize Operations Manager Version. All nodes must run the same vRealize Operations Manager version. For example, do not add a version 6.1 data node to a cluster of vRealize Operations Manager 6.2 nodes.
- Analytics Cluster Deployment Type. In the analytics cluster, all nodes must be the same kind of deployment: vApp, Linux, or Windows. Do not mix vApp, Linux, and Windows nodes in the same analytics cluster.
- Remote Collector Deployment Type. A remote collector node does not need to be the same deployment type as the analytics cluster nodes. When you add a remote collector of a different deployment type, the following combinations are supported:
  - vApp analytics cluster and Windows remote collector
  - Linux analytics cluster and Windows remote collector
- Analytics Cluster Node Sizing. In the analytics cluster, CPU, memory, and disk size must be identical for all nodes. Master, replica, and data nodes must be uniform in sizing.
- Remote Collector Node Sizing. Remote collector nodes may be of different sizes from each other or from the uniform analytics cluster node size.
- Geographical Proximity. You may place analytics cluster nodes in different vSphere clusters, but the nodes must reside in the same geographical location. Different geographical locations are not supported.
- Virtual Machine Maintenance. When any node is a virtual machine, you may only update the virtual machine software by directly updating the vRealize Operations Manager software. For example, going outside of vRealize Operations Manager to access vSphere to update VMware Tools is not supported.
- Redundancy and Isolation. If you expect to enable HA, place analytics cluster nodes on separate hosts. See “About vRealize Operations Manager High Availability,” on page 33.
Requirements for Solutions
Be aware that solutions might have requirements beyond those for vRealize Operations Manager itself. For example, vRealize Operations Manager for Horizon View has specific sizing guidelines for its remote collectors.
See your solution documentation, and verify any additional requirements before installing solutions. Note that the terms solution, management pack, adapter, and plug-in are used somewhat interchangeably.
vRealize Operations Manager Cluster Node Networking Requirements
When you create the cluster nodes that make up vRealize Operations Manager, the associated setup within your network environment is critical to inter-node communication and proper operation.
Networking Requirements
IMPORTANT vRealize Operations Manager analytics cluster nodes need frequent communication with one another. In general, your underlying vSphere architecture might create conditions where some vSphere actions affect that communication. Examples include, but are not limited to, vMotions, storage vMotions, HA events, and DRS events.
- The master and replica nodes must be addressed by static IP address, or fully qualified domain name (FQDN) with a static IP address. Data and remote collector nodes may use dynamic host control protocol (DHCP).
- You must be able to successfully reverse-DNS all nodes, including remote collectors, to their FQDN, currently the node hostname. Nodes deployed by OVF have their hostnames set to the retrieved FQDN by default.
- All nodes, including remote collectors, must be bidirectionally routable by IP address or FQDN.
- Analytics cluster nodes must not be separated by network address translation (NAT), load balancer, firewall, or a proxy that inhibits bidirectional communication by IP address or FQDN.
- Analytics cluster nodes must not have the same hostname.
- Place analytics cluster nodes within the same data center and connect them to the same local area network (LAN).
- Place analytics cluster nodes on the same Layer 2 network and IP subnet. A stretched Layer 2 or routed Layer 3 network is not supported.
- Do not span the Layer 2 network across sites, which might create network partitions or network performance issues.
- One-way latency between analytics cluster nodes must be 5 ms or lower.
- Network bandwidth between analytics cluster nodes must be 1 gbps or higher.
- Do not distribute analytics cluster nodes over a wide area network (WAN). To collect data from a WAN, a remote or separate data center, or a different geographic location, use remote collectors.
- Remote collectors are supported through a routed network but not through NAT.
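To confirm the name resolution and latency requirements above before installation, you can run a few standard commands from one node against the others. This is a minimal sketch; the node name vrops-data-01.example.com and the address 192.0.2.21 are placeholders for your own nodes.

# Forward and reverse DNS lookups for a hypothetical node (repeat for every node, including remote collectors)
nslookup vrops-data-01.example.com
nslookup 192.0.2.21
# Rough latency check between analytics cluster nodes; round-trip times should stay near or below
# 10 ms to meet the 5 ms one-way requirement
ping -c 20 vrops-data-01.example.com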
vRealize Operations Manager Cluster Node Best Practices
When you create the cluster nodes that make up vRealize Operations Manager, additional best practices
improve performance and reliability in vRealize Operations Manager.
Best Practices
- Deploy vRealize Operations Manager analytics cluster nodes in the same vSphere cluster.
- If you deploy analytics cluster nodes in a highly consolidated vSphere cluster, you might need resource reservations for optimal performance. Determine whether the virtual to physical CPU ratio is affecting performance by reviewing CPU ready time and co-stop.
- Deploy analytics cluster nodes on the same type of storage tier.
- To continue to meet analytics cluster node size and performance requirements, apply storage DRS anti-affinity rules so that nodes are on separate datastores.
- To prevent unintentional migration of nodes, set storage DRS to manual.
- To ensure balanced performance from analytics cluster nodes, use ESXi hosts with the same processor frequencies.
- To avoid a performance decrease, vRealize Operations Manager analytics cluster nodes need guaranteed resources when running at scale. The vRealize Operations Manager Knowledge Base includes sizing spreadsheets that calculate resources based on the number of objects and metrics that you expect to monitor, use of HA, and so on. When sizing, it is better to over-allocate than under-allocate resources. See Knowledge Base article 2093783.
- Because nodes might change roles, avoid machine names such as Master, Data, Replica, and so on. Examples of changed roles might include making a data node into a replica for HA, or having a replica take over the master node role.
- The NUMA placement is removed in vRealize Operations Manager 6.3 and later. Procedures related to the NUMA settings from the OVA file follow:

Table 1-2. NUMA Setting
Action | Description
Set the vRealize Operations Manager cluster status to offline | 1 Shut down the vRealize Operations Manager cluster. 2 Right-click the cluster and click Edit > Options > Advanced General. 3 Click Parameters. In the vSphere Client, repeat these steps for each VM.
Remove the NUMA setting | 1 From the Configuration Parameters, remove the setting numa.vcpu.preferHT and click OK. 2 Click OK. 3 Repeat these steps for all the VMs in the vRealize Operations cluster. 4 Power on the cluster.

NOTE To ensure the availability of adequate resources and continued product performance, monitor vRealize Operations performance by checking its CPU usage, CPU ready, and CPU contention time.
Using IPv6 with vRealize Operations Manager
vRealize Operations Manager supports Internet Protocol version 6 (IPv6), the network addressing
convention that will eventually replace IPv4. Use of IPv6 with vRealize Operations Manager requires that
certain limitations be observed.
Using IPv6
- All vRealize Operations Manager cluster nodes, including remote collectors, must have IPv6 addresses. Do not mix IPv6 and IPv4.
- All vRealize Operations Manager cluster nodes, including remote collectors, must be vApp or Linux based. vRealize Operations Manager for Windows does not support IPv6.
- Use global IPv6 addresses only. Link-local addresses are not supported.
- If any nodes use DHCP, your DHCP server must be configured to support IPv6.
- DHCP is only supported on data nodes and remote collectors. Master nodes and replica nodes still require fixed addresses, which is true for IPv4 as well.
- Your DNS server must be configured to support IPv6.
- When adding nodes to the cluster, remember to enter the IPv6 address of the master node.
- When registering a VMware vCenter® instance within vRealize Operations Manager, place square brackets around the IPv6 address of your VMware vCenter Server® system if vCenter is also using IPv6. For example: [2015:0db8:85a3:0042:1000:8a2e:0360:7334]
  Note that, even when vRealize Operations Manager is using IPv6, vCenter Server may still have an IPv4 address. In that case, vRealize Operations Manager does not need the square brackets.
- You cannot register an Endpoint Operations Management agent in an environment that supports both IPv4 and IPv6. In the event that you attempt to do so, the following error appears:
  Connection failed. Server may be down (or wrong IP/port were used). Waiting for 10 seconds before retrying.
Sizing the vRealize Operations Manager Cluster
The resources needed for vRealize Operations Manager depend on how large of an environment you expect
to monitor and analyze, how many metrics you plan to collect, and how long you need to store the data.
It is difficult to broadly predict the CPU, memory, and disk requirements that will meet the needs of a particular environment. There are many variables, such as the number and type of objects collected, which includes the number and type of adapters installed, the presence of HA, the duration of data retention, and the quantity of specific data points of interest, such as symptoms, changes, and so on.
VMware expects vRealize Operations Manager sizing information to evolve, and maintains Knowledge Base
articles so that sizing calculations can be adjusted to adapt to usage data and changes in versions of
vRealize Operations Manager.
See Knowledge Base article 2093783.
The Knowledge Base articles include overall maximums, plus spreadsheet calculators in which you enter the
number of objects and metrics that you expect to monitor. To obtain the numbers, some users take the
following high-level approach, which uses vRealize Operations Manager itself.
1 Review this guide to understand how to deploy and configure a vRealize Operations Manager node.
2 Deploy a temporary vRealize Operations Manager node.
3 Configure one or more adapters, and allow the temporary node to collect overnight.
4 Access the Cluster Management page on the temporary node.
5 Using the Adapter Instances list in the lower portion of the display as a reference, enter object and metric totals of the different adapter types into the appropriate sizing spreadsheet from Knowledge Base article 2093783.
6 Deploy the vRealize Operations Manager cluster based on the spreadsheet sizing recommendation. You can build the cluster by adding resources and data nodes to the temporary node or by starting over.
If you have a large number of adapters, you might need to reset and repeat the process on the temporary node until you have all the totals you need. The temporary node will not have enough capacity to simultaneously run every connection from a large enterprise.
Another approach to sizing is through self monitoring. Deploy the cluster based on your best estimate, but
create an alert for when capacity falls below a threshold, one that allows enough time to add nodes or disk
to the cluster. You also have the option to create an email notication when thresholds are passed.
Add Data Disk Space to a vRealize Operations Manager Linux or Windows Node
You add to the data disk of vRealize Operations Manager Linux or Windows nodes when space for storing
the collected data runs low.
The following example is for a Linux system. The Windows process is similar, but with Windows
characteristics such as backward slashes instead of forward slashes.
Prerequisites
Note the disk size of the analytics cluster nodes. When adding disk, you must maintain uniform size across
analytics cluster nodes.
Procedure
1 Add a new disk to the system, and partition and format the disk as needed.
2 Use the vRealize Operations Manager administration interface to take the cluster offline.
3 Stop the vmware-casa service.
4 Move the contents of /storage/db into a directory on the new disk.
5 Create a symbolic link from the new directory back to /storage/db, so that /storage/db now references the new disk.
6 Start the vmware-casa service.
7 Bring the cluster online.
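The Linux steps above map to a short sequence of shell commands. The following is a minimal sketch that assumes the new disk is already mounted at a hypothetical /data/vrops path and that the cluster has been taken offline in the administration interface.

# Stop the cluster administration service before touching the data directory
service vmware-casa stop
# Move the existing data onto the new disk and point /storage/db at the new location
mv /storage/db /data/vrops/db
ln -s /data/vrops/db /storage/db
# Start the service again, then bring the cluster online from the administration interface
service vmware-casa start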
Custom vRealize Operations Manager Certificates
By default, vRealize Operations Manager includes its own authentication certificates. The default certificates cause the browser to display a warning when you connect to the vRealize Operations Manager user interface.
Your site security policies might require that you use another certificate, or you might want to avoid the warnings caused by the default certificates. In either case, vRealize Operations Manager supports the use of your own custom certificate. You can upload your custom certificate during initial master node configuration or later.
A certificate used with vRealize Operations Manager must conform to certain requirements. Using a custom certificate is optional and does not affect vRealize Operations Manager features.
Requirements for Custom Certificates
Custom vRealize Operations Manager certificates must meet the following requirements.
- The certificate file must include the terminal (leaf) server certificate, a private key, and all issuing certificates if the certificate is signed by a chain of other certificates.
- In the file, the leaf certificate must be first in the order of certificates. After the leaf certificate, the order does not matter.
- In the file, all certificates and the private key must be in PEM format. vRealize Operations Manager does not support certificates in PFX, PKCS12, PKCS7, or other formats.
- In the file, all certificates and the private key must be PEM-encoded. vRealize Operations Manager does not support DER-encoded certificates or private keys.
  PEM encoding is base-64 ASCII and contains legible BEGIN and END markers, while DER is a binary format. Also, the file extension might not match the encoding. For example, a generic .cer extension might be used with PEM or DER. To verify the encoding format, examine a certificate file using a text editor.
- The file extension must be .pem.
- The private key must be generated by the RSA or DSA algorithm.
- The private key must not be encrypted by a pass phrase if you use the master node configuration wizard or the administration interface to upload the certificate.
- The REST API in this vRealize Operations Manager release supports private keys that are encrypted by a pass phrase. Contact VMware Technical Support for details.
- The vRealize Operations Manager Web server on all nodes will have the same certificate file, so it must be valid for all nodes. One way to make the certificate valid for multiple addresses is with multiple Subject Alternative Name (SAN) entries.
- SHA1 certificates create browser compatibility issues. Therefore, ensure that all certificates that are created and being uploaded to vRealize Operations Manager are signed using SHA2 or newer.
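If your certificate arrives in a format other than PEM, a tool such as openssl can usually convert it before upload. The file names below are placeholders, and the exact chain assembly depends on how your certificate authority delivered the files; this is a sketch, not a VMware-supplied procedure.

# Convert a PKCS#12 (PFX) bundle to an unencrypted PEM file containing the key and chain
openssl pkcs12 -in vrops.pfx -out vrops.pem -nodes
# Convert a DER-encoded certificate to PEM
openssl x509 -inform der -in vrops.cer -out vrops-cert.pem
# If the pieces arrive separately, concatenate them with the leaf certificate first
cat leaf.pem key.pem intermediate.pem root.pem > vrops.pem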
Sample Contents of Custom vRealize Operations Manager Certificates
For troubleshooting purposes, you can open a custom certificate file in a text editor and inspect its contents.
PEM Format Certificate Files
A typical PEM format certificate file resembles the following sample.
Private keys can appear in different formats but are enclosed with clear BEGIN and END markers.
Valid PEM sections begin with one of the following markers.
-----BEGIN RSA PRIVATE KEY-----
-----BEGIN PRIVATE KEY-----
Encrypted private keys begin with the following marker.
-----BEGIN ENCRYPTED PRIVATE KEY-----
Bag Attributes
Microsoft certificate tools sometimes add Bag Attributes sections to certificate files.
vRealize Operations Manager safely ignores content outside of BEGIN and END markers, including Bag Attributes sections.
Bag Attributes
Microsoft Local Key set: <No Values>
localKeyID: 01 00 00 00
Microsoft CSP Name: Microsoft RSA SChannel Cryptographic Provider
Verifying a Custom vRealize Operations Manager Certificate
When you upload a custom certificate file, the vRealize Operations Manager interface displays summary information for all certificates in the file.
For a valid custom certificate file, you should be able to match issuer to subject, issuer to subject, back to a self-signed certificate where the issuer and subject are the same.
In the following example, OU=MBU,O=VMware\, Inc.,CN=vc-ops-slice-32 is issued by OU=MBU,O=VMware\, Inc.,CN=vc-ops-intermediate-32, which is issued by OU=MBU,O=VMware\, Inc.,CN=vc-ops-cluster-ca_33717ac0-ad81-4a15-ac4e-e1806f0d3f84, which is issued by itself.
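You can perform the same issuer-to-subject check before uploading the file by printing every certificate it contains with openssl. This is a sketch; vrops.pem is a placeholder file name.

# Print the subject and issuer of each certificate in the file so you can follow the chain
openssl crl2pkcs7 -nocrl -certfile vrops.pem | openssl pkcs7 -print_certs -noout
# Show the subject, issuer, and validity dates of the first (leaf) certificate in the file
openssl x509 -in vrops.pem -noout -subject -issuer -dates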
How vRealize Operations Manager Uses Network Ports
vRealize Operations Manager uses network ports to communicate with a VMware vCenter Server system
and vRealize Operations Manager components.
In Linux and Windows deployments, you must manually verify or configure ports.
IMPORTANT vRealize Operations Manager does not support the customization of server ports.
Network Ports
Configure firewalls so that the following ports are open for bidirectional traffic.
Table 1-3. Network Port Access Requirements for vRealize Operations Manager
Port Number | Description
22 (TCP) | Used for SSH access to the vRealize Operations Manager cluster.
80 (TCP) | Redirects to port 443.
123 (UDP) | Used by vRealize Operations Manager for Network Time Protocol (NTP) synchronization to the master node.
443 (TCP) | Used to access the vRealize Operations Manager product user interface and the vRealize Operations Manager administrator interface.
10443 (TCP) | Used by vRealize Operations Manager to communicate with the vCenter Server Inventory service.
1235 (TCP) | Used by all nodes in the cluster to transmit object data and key-value data for the Global xDB database instance.
3091–3094 (TCP) | When Horizon View (V4V) is installed, used to access data for vRealize Operations Manager from V4V.
5433 (TCP) | When high availability is enabled, used by the master and replica nodes to replicate the global database.
6061 (TCP) | Used by clients to connect to the GemFire Locator to get connection information to servers in the distributed system. Also monitors server load to send clients to the least-loaded servers.
7001 (TCP) | Used by Cassandra for secure inter-node cluster communication.
9042 (TCP) | Used by Cassandra for secure client related communication amongst nodes.
10000–10010 (TCP and UDP) | GemFire Server ephemeral port range used for unicast UDP messaging and for TCP failure detection in the peer-to-peer distributed system.
20000–20010 (TCP and UDP) | GemFire Locator ephemeral port range used for unicast UDP messaging and for TCP failure detection in the peer-to-peer distributed system.
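If your site keeps the host firewall enabled rather than turning it off during installation, rules along the following lines would open the externally facing ports from the table on a RHEL 6 node. This is an illustrative iptables sketch, not a VMware-supplied script; adjust it to your firewall tool and site policy.

# Open the main vRealize Operations Manager ports (run as root), then persist the rules
iptables -A INPUT -p tcp --dport 22 -j ACCEPT     # SSH access to the cluster
iptables -A INPUT -p tcp --dport 80 -j ACCEPT     # redirects to 443
iptables -A INPUT -p tcp --dport 443 -j ACCEPT    # product and administrator interfaces
iptables -A INPUT -p udp --dport 123 -j ACCEPT    # NTP synchronization
iptables -A INPUT -p tcp --dport 1235 -j ACCEPT   # xDB object and key-value data between nodes
iptables -A INPUT -p tcp --dport 5433 -j ACCEPT   # global database replication when HA is enabled
service iptables save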
Localhost Ports
Verify that your port configuration allows localhost access to the following ports. You may restrict off-host access to these ports if site policies are a concern.
Table 1‑4. Localhost Port Access Requirements for vRealize Operations Manager
vRealize Operations Manager Platform Requirements for Linux
vRealize Operations Manager requires the following hardware and software when you install on Linux.
CPU and Memory Requirements
vRealize Operations Manager is supported for installation with the following CPU and memory.
Table 1-5. vRealize Operations Manager Linux Virtual CPU and Memory Requirements
Node Size | Virtual CPU and Memory
Small | 4 vCPU, 16 GB vRAM
Medium | 8 vCPU, 32 GB vRAM
Large | 16 vCPU, 48 GB vRAM
Standard Remote Collector | 2 vCPU, 4 GB vRAM
Large Remote Collector | 4 vCPU, 16 GB vRAM
Disk Requirements
Disk space for vRealize Operations Manager is not driven solely by how much space the application needs
in order to successfully install. In addition, you must consider data collection and retention requirements,
which might vary from site to site.
See “Sizing the vRealize Operations Manager Cluster,” on page 15.
The default disk requirement for a new, single-node cluster is 250 GB. Thereafter, one approach to prevent
disk capacity shortages is by using vRealize Operations Manager for self monitoring and by adding disk or
data nodes as needed.
Software Version Requirements
vRealize Operations Manager is supported for installation on the following Linux versions.
- Red Hat Enterprise Linux (RHEL) 6, starting with version 6.5.
Required Linux Packages for vRealize Operations Manager
vRealize Operations Manager requires that certain Linux packages be installed before running the product
installer. Also, vRealize Operations Manager installs additional packages.
Prerequisite Linux Packages
The following packages must be present before running the vRealize Operations Manager installer.
Furthermore, if a package is a Linux default, it must not be removed after installation.
- bash
- chkconfig
- coreutils
- db4
- expat
- glibc
- initscripts
- libaio
- libselinux
- libstdc++
- libuuid
- mailcap
- openldap
- pcre
- python
- sudo
- redhat-logos
- rpm-libs
- shadow-utils
- zlib
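One way to confirm that the prerequisite packages above are present before running the installer is to query the RPM database for each of them. A minimal sketch for RHEL 6:

# Report any prerequisite package that is not installed
for pkg in bash chkconfig coreutils db4 expat glibc initscripts libaio libselinux \
    libstdc++ libuuid mailcap openldap pcre python sudo redhat-logos rpm-libs \
    shadow-utils zlib; do
  rpm -q "$pkg" > /dev/null 2>&1 || echo "missing: $pkg"
done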
Packages that vRealize Operations Manager Installs
vRealize Operations Manager installs its own copies of the following packages.
- apr
- apr-util
- apr-util-ldap
- httpd
- httpd-tools
- mod_ssl
- openssl
- python
- VMware-Postgres-libs
- VMware-Postgres-osslibs
- VMware-Postgres-osslibs-server
- VMware-Postgres-server
Create a Node by Running the vRealize Operations Manager Linux Installer
vRealize Operations Manager consists of one or more nodes, in a cluster. To create nodes, you download
and run the vRealize Operations Manager Enterprise installer for Linux.
Prerequisites
- Plan to use the system only as a vRealize Operations Manager node. Do not host other applications on the same machine.
- Verify that vRealize Operations Manager ports are open at the firewall. See “How vRealize Operations Manager Uses Network Ports,” on page 19.
- Verify that prerequisite packages are installed. See “Required Linux Packages for vRealize Operations Manager,” on page 21.
- If this node is to be the master node, reserve a static IP address for the virtual machine, and know the associated domain name server, default gateway, and network mask values. Plan to keep the IP address because it is difficult to change the address after installation.
- If this node is to be a data node that will become the HA replica node, reserve a static IP address for the virtual machine, and know the associated domain name server, default gateway, and network mask values. Plan to keep the IP address because it is difficult to change the address after installation.
  In addition, familiarize yourself with HA node placement as described in “About vRealize Operations Manager High Availability,” on page 33.
- Preplan your domain and machine naming so that the Linux machine name will begin and end with alphabet (a–z) or digit (0–9) characters, and will only contain alphabet, digit, or hyphen (-) characters. The underscore character (_) must not appear in the host name or anywhere in the fully qualified domain name (FQDN). Plan to keep the name because it is difficult to change the name after installation. For more information, review the host name specifications from the Internet Engineering Task Force. See www.ietf.org.
- Preplan node placement and networking to meet the requirements described in “General vRealize Operations Manager Cluster Node Requirements,” on page 12 and “vRealize Operations Manager Cluster Node Networking Requirements,” on page 13.
- If you expect the vRealize Operations Manager cluster to use IPv6 addresses, review the IPv6 limitations described in “Using IPv6 with vRealize Operations Manager,” on page 14.
- Be aware that vRealize Operations Manager uninstalls httpd if it is installed, because vRealize Operations Manager installs its version of Apache. If vRealize Operations Manager uninstalls httpd, it backs up the /etc/httpd configuration directory.
- Uninstall any existing copies of PostgreSQL, and remove PostgreSQL directories and data. vRealize Operations Manager must install its own copy of PostgreSQL.
- Verify that all machines in the file ntp.conf are resolvable. If you are unsure about the contents of ntp.conf, make a backup copy of the file, and overwrite the original with the default version from a new machine installation.
- Locate your copy of the vRealize Operations Manager Enterprise bin installer for Linux.
Procedure
1 Log in with an account that has root privileges.
2 Turn off the firewall.
If using IPv4:
# su -
# service iptables save
iptables: Saving firewall rules to /etc/sysconfig/iptables: [ OK ]
# service iptables stop
iptables: Flushing firewall rules: [ OK ]
iptables: Setting chains to policy ACCEPT: filter [ OK ]
iptables: Unloading modules: [ OK ]
# chkconfig iptables off
# service iptables status
iptables: Firewall is not running.
If using IPv6:
# su -
# service ip6tables save
ip6tables: Saving firewall rules to /etc/sysconfig/ip6tables: [ OK ]
# service ip6tables stop
ip6tables: Flushing firewall rules: [ OK ]
ip6tables: Setting chains to policy ACCEPT: filter [ OK ]
ip6tables: Unloading modules: [ OK ]
# chkconfig ip6tables off
# service ip6tables status
ip6tables: Firewall is not running.
3 Ensure that the open file limit is appropriate by configuring the required minimum.
4 Set SELinux to permissive mode.
sed -i "s/SELINUX=[^ ]*/SELINUX=permissive/g" /etc/selinux/config
5 Ensure that the node hostname is resolvable.
6 Run the vRealize Operations Manager Enterprise bin installer, and follow the prompts.
Add -i console, -i silent, or -i gui to set the installation mode. The default mode conforms to your
session type, for example, console for terminal connections or gui for X-Windows.
cd /tmp
sh ./vRealize_Operations_Manager_Enterprise.bin -i gui
7 If you are creating a multiple-node vRealize Operations Manager cluster, repeat Step 1 through Step 6 on each Linux machine that will serve as a node in your vRealize Operations Manager cluster.
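For steps 3 and 5, checks along the following lines can confirm the node before you run the installer. The 65536 open-file value is a placeholder, not a documented minimum; confirm the value required for your release.

# Show the current open file limit, and raise it persistently for root if it is too low (placeholder value)
ulimit -n
echo "root - nofile 65536" >> /etc/security/limits.conf
# Confirm that the node hostname resolves
hostname -f
nslookup "$(hostname -f)"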
What to do next
Use a Web browser client to configure a newly added node as the vRealize Operations Manager master node, a data node, a high-availability master replica node, or a remote collector node. The master node is required first.
CAUTION For security, do not access vRealize Operations Manager from untrusted or unpatched clients, or from clients using browser extensions.
vRealize Operations Manager Platform Requirements for Windows
vRealize Operations Manager requires the following hardware and software when you install on Windows.
CPU and Memory Requirements
vRealize Operations Manager is supported for installation with the following CPU and memory.
Table 1-6. vRealize Operations Manager Windows Virtual CPU and Memory Requirements
Node Size | Virtual CPU and Memory
Extra Small | 2 vCPU, 8 GB vRAM
Small | 4 vCPU, 16 GB vRAM
Medium | 8 vCPU, 32 GB vRAM
Large | 16 vCPU, 48 GB vRAM
Standard Remote Collector | 2 vCPU, 4 GB vRAM
Large Remote Collector | 4 vCPU, 16 GB vRAM
Disk Requirements
Disk space for vRealize Operations Manager is not driven solely by how much space the application needs
in order to successfully install. In addition, you must consider data collection and retention requirements,
which might vary from site to site.
See “Sizing the vRealize Operations Manager Cluster,” on page 15.
The default disk requirement for a new, single-node cluster is 250 GB. Thereafter, one approach to prevent
disk capacity shortages is by using vRealize Operations Manager for self monitoring and by adding disk or
data nodes as needed.
Software Version Requirements
vRealize Operations Manager is supported for installation on the following Windows versions.
- Windows Server 2008 R2 Service Pack 1 (SP1)
- Windows Server 2008 R2 Enterprise Service Pack 1 (SP1) when configuring the Large node size
- Windows Server 2008 R2 Service Pack 1 (SP1) configurations also require the updates found in the following Microsoft Knowledge Base articles:
  - http://support.microsoft.com/kb/2538243
  - http://support.microsoft.com/kb/2577795
- Windows Server 2012 R2
- Windows Server 2012 R2 Datacenter when configuring the Large node size
Create a Node by Running the vRealize Operations Manager Windows Installer
vRealize Operations Manager consists of one or more nodes, in a cluster. To create nodes, you download
and run the vRealize Operations Manager Enterprise installer for Windows.
Prerequisites
- Plan to use the system only as a vRealize Operations Manager node. Do not host other applications on the same machine.
- Verify that vRealize Operations Manager ports are open at the firewall. See “How vRealize Operations Manager Uses Network Ports,” on page 19.
- Verify that the partition on which you install vRealize Operations Manager is formatted as NTFS.
- If this node is to be the master node, reserve a static IP address for the virtual machine, and know the associated domain name server, default gateway, and network mask values. Plan to keep the IP address because it is difficult to change the address after installation.
- If this node is to be a data node that will become the HA replica node, reserve a static IP address for the virtual machine, and know the associated domain name server, default gateway, and network mask values. Plan to keep the IP address because it is difficult to change the address after installation.
  In addition, familiarize yourself with HA node placement as described in “About vRealize Operations Manager High Availability,” on page 33.
- Preplan your domain and machine naming so that the Windows machine name will begin and end with alphabet (a–z) or digit (0–9) characters, and will only contain alphabet, digit, or hyphen (-) characters. The underscore character (_) must not appear in the host name or anywhere in the fully qualified domain name (FQDN). Plan to keep the name because it is difficult to change the name after installation. For more information, review the host name specifications from the Internet Engineering Task Force. See www.ietf.org.
- Preplan node placement and networking to meet the requirements described in “General vRealize Operations Manager Cluster Node Requirements,” on page 12 and “vRealize Operations Manager Cluster Node Networking Requirements,” on page 13.
- If you expect the vRealize Operations Manager cluster to use IPv6 addresses, review the IPv6 limitations described in “Using IPv6 with vRealize Operations Manager,” on page 14.
- Verify that the Task Scheduler service has not been disabled. Task Scheduler is enabled by default.
- Uninstall any existing copies of Apache Tomcat.
- Uninstall any existing copies of PostgreSQL, and remove PostgreSQL folders and data. vRealize Operations Manager must install its own copy of PostgreSQL.
- Locate your copy of the vRealize Operations Manager Enterprise EXE installer for Windows.
Procedure
1 Start the installer by running the EXE file.
  A progress bar appears, followed by the installer wizard.
2 Select your language and click OK.
3 Read the introduction and click Next.
4 Read the patent notice and click Next.
5 Read and scroll to the bottom of the license notice, select the option to accept it, and click Next.
6 Accept or change the installation folder, and click Next.
7 Accept or change the data folder, and click Next.
8 Review your settings, and click Install.
  A progress bar appears. After a few moments, the installation finishes.
9 Click Done.
10 If you are creating a multiple-node vRealize Operations Manager cluster, repeat Step 1 through Step 9 on each Windows machine that will serve as a node in your vRealize Operations Manager cluster.
What to do next
Use a Web browser client to configure a newly added node as the vRealize Operations Manager master node, a data node, a high-availability master replica node, or a remote collector node. The master node is required first.
CAUTION For security, do not access vRealize Operations Manager from untrusted or unpatched clients, or from clients using browser extensions.
Creating the vRealize Operations Manager Master Node 2
All vRealize Operations Manager installations require a master node.
This chapter includes the following topics:
- “About the vRealize Operations Manager Master Node,” on page 29
- “Run the Setup Wizard to Create the Master Node,” on page 29
About the vRealize Operations Manager Master Node
The master node is the required, initial node in your vRealize Operations Manager cluster.
In single-node clusters, administration and data are on the same master node. A multiple-node cluster
includes one master node and one or more data nodes. In addition, there might be remote collector nodes,
and there might be one replica node used for high availability.
The master node performs administration for the cluster and must be online before you configure any new nodes. In addition, the master node must be online before other nodes are brought online. If the master node and replica node go offline together, bring them back online separately. Bring the master node completely online first, and then bring the replica node online. For example, if the entire cluster were offline for any reason, you would bring the master node online first.
Creating the Master Node (http://link.brightcove.com/services/player/bcpid2296383276001?bctid=ref:video_vrops_create_master_node)
Run the Setup Wizard to Create the Master Node
All vRealize Operations Manager installations require a master node. With a single node cluster,
administration and data functions are on the same master node. A multiple-node
vRealize Operations Manager cluster contains one master node and one or more nodes for handling
additional data.
Prerequisites
- Create a node by running the vRealize Operations Manager Enterprise installer for Linux or Windows.
- After it is deployed, note the fully qualified domain name (FQDN) or IP address of the node.
- If you plan to use a custom authentication certificate, verify that your certificate file meets the requirements for vRealize Operations Manager. See “Custom vRealize Operations Manager Certificates,” on page 16.
Procedure
1 Navigate to the name or IP address of the node that will be the master node of vRealize Operations Manager.
  The setup wizard appears, and you do not need to log in to vRealize Operations Manager.
2 Click New Installation.
3 Click Next.
4 Enter and confirm a password for the admin user account, and click Next.
  Passwords require a minimum of 8 characters, one uppercase letter, one lowercase letter, one digit, and one special character.
  The user account name is admin by default and cannot be changed.
5 Select whether to use the certificate included with vRealize Operations Manager or to install one of your own.
  a To use your own certificate, click Browse, locate the certificate file, and click Open to load the file in the Certificate Information text box.
  b Review the information detected from your certificate to verify that it meets the requirements for vRealize Operations Manager.
6 Click Next.
7 Enter a name for the master node.
  For example: Ops-Master
8 Enter the URL or IP address for the Network Time Protocol (NTP) server with which the cluster will synchronize.
  For example: time.nist.gov
9 Click Add.
  Leave the NTP blank to have vRealize Operations Manager manage its own synchronization by having all nodes synchronize with the master node and replica node.
10 Click Next, and click Finish.
  The administration interface appears, and it takes a moment for vRealize Operations Manager to finish adding the master node.
What to do next
After creating the master node, you have the following options.
- Create and add data nodes to the unstarted cluster.
- Create and add remote collector nodes to the unstarted cluster.
- Click Start vRealize Operations Manager to start the single-node cluster, and log in to finish configuring the product.
  The cluster might take from 10 to 30 minutes to start, depending on the size of your cluster and nodes. Do not make changes or perform any actions on cluster nodes while the cluster is starting.