Commit f67744a

rebranding [#171728965]

1 parent 11b9f1c

3 files changed (+34 −36 lines)

get-credhub-vars.html.md.erb (+8 −10)

@@ -3,9 +3,7 @@ title: Fetching Variable Names and Values
 owner: CredHub
 ---
 
-<strong><%= modified_date %></strong>
-
-## Overview
+## <a id="overview"></a> Overview
 
 CredHub has two API endpoints to identify and re-use variables. Operators who want to see all the credentials associated with their product, or support engineers who want to troubleshoot issues specific to one virtual machine (VM), can use these APIs for those purposes.
 
@@ -14,11 +12,11 @@ The API endpoints perform these functions:
 * Identifying and printing the name of a variable
 * Using the name of the variable to identify and print the value of the variable
 
-### Using the API Endpoints
+### <a id="use-api-endpoints"></a> Using the API Endpoints
 
 Use these endpoints to view variables for any product in Ops Manager, except the BOSH Director. These endpoints are read-only. You cannot use them to add, remove, or rotate variables.
 
-## Fetching Variables
+## <a id="fetch-variables"></a> Fetching Variables
 
 This endpoint returns the list of variables associated with a product that are stored in CredHub. Not all variables are stored in CredHub. If you call a variable that is not stored in CredHub, the call returns an empty value.
 
@@ -28,7 +26,7 @@ $ curl "http<span>s</span>://OPS-MAN-FQDN/api/v0/deployed/products/product-guid/
 -H "Authorization: Bearer EXAMPLE_UAA_ACCESS_TOKEN"
 </pre>
 
-### Example Response
+### <a id="example-response"></a> Example Response
 
 <pre class="terminal">
 HTTP/1.1 200 OK
@@ -38,7 +36,7 @@ HTTP/1.1 200 OK
 }
 </pre>
 
-### Query Parameters
+### <a id="query-params"></a> Query Parameters
 
 <table class="nice">
 <th>Parameter</th>
@@ -51,7 +49,7 @@ HTTP/1.1 200 OK
 
 This endpoint returns a variable's name. Use the name in the next endpoint to return the variable's value.
 
-## Fetching Variable Values
+## <a id="fetch-var-values"></a> Fetching Variable Values
 
 This endpoint returns the value of a variable stored in CredHub. Not all variables are stored in CredHub, so if you call a variable that isn't in CredHub, the call will return an empty value.
 
@@ -61,7 +59,7 @@ $ curl "http<span>s</span>://OPS-MAN-FQDN/api/v0/deployed/products/product-guid/
 -H "Authorization: Bearer UAA_ACCESS_TOKEN"
 </pre>
 
-### Example Response
+### <a id="example-response"></a> Example Response
 
 <pre class="terminal">
 HTTP/1.1 200 OK
@@ -71,7 +69,7 @@ HTTP/1.1 200 OK
 }
 </pre>
 
-### Query Parameters
+### <a id="query-params-values"></a> Query Parameters
 <table>
 <th>Parameter</th>
 <th>Description</th>
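The two read-only endpoints this file documents can be sketched as a small client helper. The full endpoint paths are truncated in the diff context, so the `/variables` and `/variables?name=` suffixes below follow the Ops Manager v0 API convention and should be treated as an assumption:

```python
from typing import Optional
from urllib.parse import urlencode


def variables_url(fqdn: str, product_guid: str, name: Optional[str] = None) -> str:
    """Build the URL to list variable names or, with `name`, fetch one value.

    The `/variables` path suffix is assumed from the Ops Manager v0 API;
    the diff shows only the truncated `/api/v0/deployed/products/...` prefix.
    """
    url = f"https://{fqdn}/api/v0/deployed/products/{product_guid}/variables"
    if name is not None:
        url += "?" + urlencode({"name": name})
    return url


def auth_header(uaa_token: str) -> dict:
    """Ops Manager expects a UAA bearer token, as in the curl examples above."""
    return {"Authorization": f"Bearer {uaa_token}"}
```

A broker or script would pair these with any HTTP client; remember the endpoints are read-only and return an empty value for variables not stored in CredHub.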

nozzle.html.md.erb (+16 −16)

@@ -4,13 +4,13 @@ owner: Services
 ---
 
 
-This topic explains how to integrate <%= vars.platform_name %> services with Cloud Foundry's logging system, _Loggregator_, by writing to and reading from its _Firehose_ endpoint.
+This topic explains how to integrate services with Cloud Foundry's logging system, _Loggregator_, by writing to and reading from its _Firehose_ endpoint.
 
 ## <a id="overview"></a> Overview
 
-Cloud Foundry's Loggregator logging system collects logs and metrics from <%= vars.platform_name %> apps and platform components and streams them to a single endpoint, Firehose. Your tile can integrate its service with Loggregator in two ways:
+Cloud Foundry's Loggregator logging system collects logs and metrics from apps and platform components and streams them to a single endpoint, Firehose. Your tile can integrate its service with Loggregator in two ways:
 
-* By sending your service component logs and metrics to the Firehose, to be streamed along with <%= vars.platform_name %> core platform component logs and metrics
+* By sending your service component logs and metrics to the Firehose, to be streamed along with core platform component logs and metrics
 
 * By installing a _nozzle_ on Firehose that directs Firehose data to be consumed by external services or apps -- a built-in nozzle can enable a service to:
 - Drain metrics to an external dashboard product for system operators
@@ -22,9 +22,9 @@ For a real world production example of a nozzle see [Firehose-to-syslog](https:/
 
 ## <a id="firehose"></a> Firehose Communication
 
-<%= vars.platform_name %> components publish logs and metrics to the Firehose through Loggregator agent processes that run locally on the component VMs. Loggregator agents input the data to the Loggregator system through a co-located Loggregator agent. To see how logs and metrics travel from <%= vars.platform_name %> system components to the Firehose, see the [Cloud Foundry documentation](https://docs.cloudfoundry.org/loggregator/architecture.html).
+<%= vars.app_runtime_full %> components publish logs and metrics to the Firehose through Loggregator agent processes that run locally on the component VMs. Loggregator agents input the data to the Loggregator system through a co-located Loggregator agent. To see how logs and metrics travel from <%= vars.app_runtime_abbr %> system components to the Firehose, see the [Cloud Foundry documentation](https://docs.cloudfoundry.org/loggregator/architecture.html).
 
-Component VMs running <%= vars.platform_name %> services can publish logs and metrics the same way, by including the same component, Loggregator Agent. Historically, components used Metron for this communication.
+Component VMs running <%= vars.app_runtime_abbr %> services can publish logs and metrics the same way, by including the same component, Loggregator Agent. Historically, components used Metron for this communication.
 
 ### <a id="https"></a> HTTPS Protocol
 
@@ -79,11 +79,11 @@ Do so with the following properties:
 ## <a id="nozzle"></a> Nozzles
 
 A nozzle is a component dedicated to reading and processing data that streams from Firehose.
-A service tile can install a nozzle as either a managed service, with package type `bosh-release`, or as an app pushed to Pivotal Application Service (PAS), with the package type `app`.
+A service tile can install a nozzle as either a managed service, with package type `bosh-release`, or as an app pushed to <%= vars.app_runtime_abbr %>, with the package type `app`.
 
 ### <a id="develop"></a> Develop a Nozzle
 
-Pivotal recommends developing a nozzle in Go to leverage the
+<%= vars.company_name %> recommends developing a nozzle in Go to leverage the
 [NOAA library](https://github.com/cloudfoundry/noaa).
 NOAA does the heavy lifting of establishing
 an authenticated websocket connection to the logging system
@@ -163,7 +163,7 @@ Once you have built a nozzle, you can deploy it as a managed service or as an ap
 Visit [managed service](managed.html) for more details on what it means to be a
 managed service. See also this [example nozzle BOSH release](https://github.com/cloudfoundry-incubator/example-nozzle-release).
 
-You can also deploy the nozzle as an app on PAS. Visit the Tile Generator's
+You can also deploy the nozzle as an app on <%= vars.app_runtime_abbr %>. Visit the Tile Generator's
 [section on pushed apps](tile-generator.html#pushed-applications)
 for more details.
 
@@ -192,22 +192,22 @@ for the access control list (ACL) app name, space UUID and name, and org UUID an
 packages this nozzle as a BOSH release.
 
 * [splunk-firehose-nozzle](https://github.com/cf-platform-eng/splunk-firehose-nozzle)
-has source code based on `firehose-to-syslog` and is packaged to run an app on <%= vars.platform_name %>.
+has source code based on `firehose-to-syslog` and is packaged to run an app on <%= vars.app_runtime_abbr %>.
 
 * [datadog-firehose-nozzle](https://github.com/DataDog/datadog-firehose-nozzle) is
 another real world implementation.
 
-## <a id="syslog-format-pcf"></a> Log Format for <%= vars.platform_name %> Components
+## <a id="syslog-format-pcf"></a> Log Format for <%= vars.app_runtime_abbr %> Components
 
-Pivotal's standard log format adheres to the [RFC-5424 syslog protocol](https://tools.ietf.org/html/rfc5424), with log messages formatted as follows:
+The standard log format for <%= vars.app_runtime_abbr %> adheres to the [RFC-5424 syslog protocol](https://tools.ietf.org/html/rfc5424), with log messages formatted as follows:
 
 `<${PRI}>${VERSION} ${TIMESTAMP} ${HOST_IP} ${APP_NAME} ${PROD_ID} ${MSG_ID} ${SD-ELEMENT-instance} ${MESSAGE}`
 
 The [Syslog Message Elements table](#syslog-elements) immediately below describes each element of the log, and the [Structured Instance Data Format](#sd-element) table describes the contents of the structured data element that carries Cloud Foundry VM instance information.
 
 ### <a id="syslog-elements"></a> Syslog Message Elements
 
-This table describes each element of a standard <%= vars.platform_name %> syslog message.
+This table describes each element of a standard <%= vars.app_runtime_abbr %> syslog message.
 
 <table id='syslog-elements-table' border="1" class="nice" >
 <tr>
@@ -216,7 +216,7 @@ This table describes each element of a standard <%= vars.platform_name %> syslog
 </tr><tr>
 <td><code>${PRI}</code></td>
 <td><p><a href="https://tools.ietf.org/html/rfc5424#section-6.2.1">Priority value (PRI)</a>, calculated as <code>8 × Facility Code + Severity Code</code></p>
-<p>Pivotal uses a Facility Code value of <code>1</code>, indicating a user-level facility. This adds <code>8</code> to the RFC-5424 Severity Codes, resulting in the numbers listed in the <a href="#severity-codes">table below</a>.</p>
+<p><%= vars.app_runtime_abbr %> uses a Facility Code value of <code>1</code>, indicating a user-level facility. This adds <code>8</code> to the RFC-5424 Severity Codes, resulting in the numbers listed in the <a href="#severity-codes">table below</a>.</p>
 <p>If in doubt, default to <code>13</code>, to indicate Notice-level severity.</p>
 </td>
 </tr><tr>
@@ -244,7 +244,7 @@ This table describes each element of a standard <%= vars.platform_name %> syslog
 <td>The <a href="https://tools.ietf.org/html/rfc5424#section-6.2.7">type</a> of log message. If this is not easily available, default to <code>-</code> (hyphen) to indicate unknown.</td>
 </tr><tr>
 <td><code>${SD-ELEMENT-instance}</code></td>
-<td>Structured data (SD) relevant to <%= vars.platform_name %> about the <a href="https://tools.ietf.org/html/rfc5424#section-6.3.1">source instance (VM)</a> that originates the log message. See the <a href="#sd-element">Structured Instance Data Format table</a> below for content and format.</td>
+<td>Structured data (SD) relevant to <%= vars.app_runtime_abbr %> about the <a href="https://tools.ietf.org/html/rfc5424#section-6.3.1">source instance (VM)</a> that originates the log message. See the <a href="#sd-element">Structured Instance Data Format table</a> below for content and format.</td>
 </tr><tr>
 <td><code>${MESSAGE}</code></td>
 <td>The log message itself, ideally in JSON</td>
@@ -253,7 +253,7 @@ This table describes each element of a standard <%= vars.platform_name %> syslog
 
 ### <a id="severity"></a> RFC-5424 Severity Codes
 
-<%= vars.platform_name %> components generate log messages with the following severity levels. The most common severity level is `13`.
+<%= vars.app_runtime_abbr %> components generate log messages with the following severity levels. The most common severity level is `13`.
 
 <table id='severity-table' border="1" class="nice" >
 <tr>
@@ -288,7 +288,7 @@ This table describes each element of a standard <%= vars.platform_name %> syslog
 
 ### <a id="sd-element"></a> Structured Instance Data Format
 
-The RFC-5424 syslog protocol includes a [structured data element](https://tools.ietf.org/html/rfc5424#section-6.3.1) that people can use as they see fit. Pivotal uses this element to carry VM instance information as follows:
+The RFC-5424 syslog protocol includes a [structured data element](https://tools.ietf.org/html/rfc5424#section-6.3.1) that people can use as they see fit. <%= vars.app_runtime_abbr %> uses this element to carry VM instance information as follows:
 
 <table id='sd-element-table' border="1" class="nice" >
 <tr>
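The PRI arithmetic this file documents (`8 × Facility Code + Severity Code`, with facility fixed at `1`) can be made concrete with a short sketch. It shows why the docs' recommended default of `13` corresponds to Notice-level severity (RFC-5424 severity code `5`); the formatting helper is a hypothetical illustration of the documented message template, not code from the tile SDK:

```python
def pri(severity: int, facility: int = 1) -> int:
    """RFC-5424 priority value: 8 x facility + severity.

    The docs fix the facility code at 1 (user-level), so each severity
    code is shifted up by 8.
    """
    if not 0 <= severity <= 7:
        raise ValueError("RFC-5424 severity codes range from 0 to 7")
    return 8 * facility + severity


NOTICE = 5  # RFC-5424 severity code for Notice


def format_syslog_line(timestamp, host_ip, app_name, prod_id, msg_id, sd, message,
                       severity=NOTICE, version=1):
    """Assemble a line matching the documented template:
    <${PRI}>${VERSION} ${TIMESTAMP} ${HOST_IP} ${APP_NAME} ${PROD_ID} ${MSG_ID} ${SD-ELEMENT-instance} ${MESSAGE}
    """
    return (f"<{pri(severity)}>{version} {timestamp} {host_ip} "
            f"{app_name} {prod_id} {msg_id} {sd} {message}")
```

With the default Notice severity, `pri()` returns `8 * 1 + 5 = 13`, matching the "most common severity level" called out in the diff.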

ssi-creds-tiledev.html.md.erb (+10 −10)

@@ -4,13 +4,13 @@ owner: Services
 ---
 
 
-This topic describes how to develop your <%= vars.platform_name %> service tile to support secure service instance (SSI) credentials using Runtime CredHub. For more information about Runtime CredHub, see [Runtime CredHub](https://docs.pivotal.io/pivotalcf/credhub/#runtime) in the _CredHub_ topic.
+This topic describes how to develop your <%= vars.platform_name %> service tile to support secure service instance (SSI) credentials using runtime CredHub. For more information about runtime CredHub, see [Runtime CredHub](https://docs.pivotal.io/pivotalcf/credhub/#runtime).
 
 ## <a id="background"></a> Background
 
 When developers bind an app to a service instance, the binding typically includes _binding credentials_ required to access the service.
 
-In <%= vars.platform_name %> (formerly PCF) and later, service brokers can store binding credentials as SSI credentials in runtime CredHub and apps can retrieve these credentials from CredHub. This secures service instance credential management by avoiding the following:
+In <%= vars.app_runtime_full %>, service brokers can store binding credentials as SSI credentials in runtime CredHub and apps can retrieve these credentials from CredHub. This secures service instance credential management by avoiding the following:
 
 * Leaking environment variables to logs, which increases risk of disclosure.
 * Sending credentials between components, which increases risk of disclosure.
@@ -26,23 +26,23 @@ To store binding credentials in runtime CredHub, your service tile needs to supp
 
 SSI credentials, which let apps access services through service instances, are distinct from the credentials that service tiles store in [BOSH CredHub](./credhub.html) for their own internal use.
 
-When a service uses SSI credentials, its service broker stores the binding credentials in runtime CredHub. Then, when PAS binds an app to an instance of the service, the broker retrieves the credentials from runtime CredHub and delivers them to the Cloud Controller (CC) to enable the app to access the service.
+When a service uses SSI credentials, its service broker stores the binding credentials in runtime CredHub. Then, when <%= vars.app_runtime_abbr %> binds an app to an instance of the service, the broker retrieves the credentials from runtime CredHub and delivers them to the Cloud Controller (CC) to enable the app to access the service.
 
-These SSI credentials are different from credentials that the tile uses internally, for example, to give the service broker access to an internal database. PAS generates the internal tile credentials for a service when the service is first installed and stores them in BOSH CredHub, not runtime CredHub.
+These SSI credentials are different from credentials that the tile uses internally, for example, to give the service broker access to an internal database. <%= vars.app_runtime_abbr %> generates the internal tile credentials for a service when the service is first installed and stores them in BOSH CredHub, not runtime CredHub.
 
 For more information on the CredHub credential management component, see the [CredHub documentation](https://docs.cloudfoundry.org/credhub/index.html) topic.
 
 The sections below describe an example implementation of how to add SSI credentials functionality to a service tile.
 
 ## <a id="step1"></a> Step 1: Modify Your BOSH Release
 
-To use runtime CredHub, your service tile needs to retrieve the location of the CredHub server, which is published in the Pivotal Application Service (PAS) tile, through a BOSH link.
+To use runtime CredHub, your service tile needs to retrieve the location of the CredHub server, which is published in the <%= vars.app_runtime_abbr %> tile, through a BOSH link.
 
 <p class="note"><strong>Note</strong>: BOSH Links let multiple jobs share deployment-time configuration properties. This helps to avoid redundant configurations in BOSH releases and deployment manifests. For more information about BOSH Links, see <a href="http://docs.pivotal.io/tiledev/1-11/release-notes.html#bosh-links">BOSH Links</a>.</p>
 
 ### <a id="spec"></a> Update Spec File and Templates
 
-The location of runtime CredHub is stored in the `credhub.internal_url` and `credhub.port` properties of the PAS tile. To enable your service tile to retrieve these CredHub-provided properties, add a `consumes:` section with the BOSH link from the PAS tile to the spec file of the BOSH job that will use them and edit the job's templates to access the values in the link:
+The location of runtime CredHub is stored in the `credhub.internal_url` and `credhub.port` properties of the <%= vars.app_runtime_abbr %> tile. To enable your service tile to retrieve these CredHub-provided properties, add a `consumes:` section with the BOSH link from the <%= vars.app_runtime_abbr %> tile to the spec file of the BOSH job that will use them and edit the job's templates to access the values in the link:
 
 ```
 consumes:
@@ -54,9 +54,9 @@ For information about using BOSH Links in the spec file and templates of a job a
 
 ### <a id="errand"></a> Save the Runtime CredHub Location
 
-To use the runtime CredHub location retrieved from the PAS tile, you must write a `post_deploy` [tile errand](tile-structure.html#errands) that saves the value out in some way and enables the service broker to access it.
+To use the runtime CredHub location retrieved from the <%= vars.app_runtime_abbr %> tile, you must write a `post_deploy` [tile errand](tile-structure.html#errands) that saves the value out in some way and enables the service broker to access it.
 
-Depending on how your tile deploys the service broker app, the service instance errand can save the CredHub location in different ways. If the tile pushes the broker as a Cloud Foundry app, the errand can store the location in an environment variable such as `CREDHUB_URL` for the service broker to call. If BOSH deploys the service broker outside of of PAS, the errand could write the CredHub location out to a templated configuration file that the service broker reads.
+Depending on how your tile deploys the service broker app, the service instance errand can save the CredHub location in different ways. If the tile pushes the broker as a Cloud Foundry app, the errand can store the location in an environment variable such as `CREDHUB_URL` for the service broker to call. If BOSH deploys the service broker outside of <%= vars.app_runtime_abbr %>, the errand could write the CredHub location out to a templated configuration file that the service broker reads.
 
 ### <a id="manifest"></a> Update Deployment Manifest
 
@@ -97,7 +97,7 @@ properties:
 - '*.credhub.(( ..cf.credhub.network )).(( ..cf.deployment_name )).bosh'
 ```
 
-In the example, the runtime CredHub instance can be accessed at `credhub.service.cf.internal`. If your broker runs as an app, you can resolve this address with BOSH DNS. If your broker runs on a VM with a Consul agent, you can resolve the address with Consul. Alternatively, from a VM, you can resolve the address with `dig credhub.service.cf.internal @169.254.0.2`. This command uses the PAS BOSH DNS server to do lookup.
+In the example, the runtime CredHub instance can be accessed at `credhub.service.cf.internal`. If your broker runs as an app, you can resolve this address with BOSH DNS. If your broker runs on a VM with a Consul agent, you can resolve the address with Consul. Alternatively, from a VM, you can resolve the address with `dig credhub.service.cf.internal @169.254.0.2`. This command uses the <%= vars.app_runtime_abbr %> BOSH DNS server to do lookup.
 
 ## <a id="step3"></a> Step 3: Provide Operators with the Choice to Use CredHub
 
@@ -111,7 +111,7 @@ form_types:
 property_inputs:
 - reference: .JOB-NAME.secure_credentials
 label: Secure service instance credentials
-description: "When checked, service instance credentials are stored in CredHub. Enable only when installing with <%= vars.platform_name %> (formerly PCF) or later and this feature is also enabled in the PAS tile."
+description: "When checked, service instance credentials are stored in CredHub. Enable only when installing with Ops Manager and this feature is also enabled in the <%= vars.app_runtime_abbr %> tile."
 
 property_blueprints:
 - name: hidden_credhub_selector
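The "Save the Runtime CredHub Location" step this file touches has to turn the BOSH-link properties `credhub.internal_url` and `credhub.port` into a single `CREDHUB_URL` value for the broker. A minimal sketch of that join, assuming the link exposes a bare hostname and noting that the `https` scheme and the conventional CredHub port `8844` used in the example are assumptions, not values from the diff:

```python
def credhub_url(internal_url: str, port: int) -> str:
    """Join the link-provided host and port into a CREDHUB_URL value.

    Assumes `internal_url` is a bare hostname or an http(s) URL; CredHub
    is served over TLS, so https is prepended when no scheme is present
    (an assumption, not something the diff specifies).
    """
    host = internal_url.rstrip("/")
    if not host.startswith("http"):
        host = "https://" + host
    return f"{host}:{port}"
```

A `post_deploy` errand could then hand the result to a pushed broker app with something like `cf set-env BROKER-APP CREDHUB_URL "$url"`, or template it into a config file for a BOSH-deployed broker.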
