get-credhub-vars.html.md.erb (+8 -10)
@@ -3,9 +3,7 @@ title: Fetching Variable Names and Values
 owner: CredHub
 ---

-<strong><%= modified_date %></strong>
-
-## Overview
+## <a id="overview"></a> Overview

 CredHub has two API endpoints to identify and re-use variables. Operators who want to see all the credentials associated with their product, or support engineers who want to troubleshoot issues specific to one virtual machine (VM), can use these APIs for those purposes.

@@ -14,11 +12,11 @@ The API endpoints perform these functions:
 * Identifying and printing the name of a variable
 * Using the name of the variable to identify and print the value of the variable

-### Using the API Endpoints
+### <a id="use-api-endpoints"></a> Using the API Endpoints

 Use these endpoints to view variables for any product in Ops Manager, except the BOSH Director. These endpoints are read-only. You cannot use them to add, remove, or rotate variables.

-## Fetching Variables
+## <a id="fetch-variables"></a> Fetching Variables

 This endpoint returns the list of variables associated with a product that are stored in CredHub. Not all variables are stored in CredHub. If you call a variable that is not stored in CredHub, the call returns an empty value.

 This endpoint returns the value of a variable stored in CredHub. Not all variables are stored in CredHub, so if you call a variable that isn't in CredHub, the call will return an empty value.
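The two endpoints above can be sketched as shell calls. This is a minimal sketch that assumes the Ops Manager v0 API paths for deployed-product variables; the host, product GUID, variable name, and token are all placeholders:

```shell
# Compose the two endpoint URLs (paths assumed; host and GUID are placeholders).
OPSMAN="https://OPS-MANAGER-FQDN"
GUID="product-guid"

names_url="${OPSMAN}/api/v0/deployed/products/${GUID}/variables"
value_url="${OPSMAN}/api/v0/deployed/products/${GUID}/variables?name=VARIABLE-NAME"
echo "$names_url"
echo "$value_url"

# A live call adds a UAA bearer token, for example:
#   curl "$names_url" -H "Authorization: Bearer UAA-ACCESS-TOKEN"
#   curl "$value_url" -H "Authorization: Bearer UAA-ACCESS-TOKEN"
```

As described above, a request for a variable that is not stored in CredHub returns an empty value rather than an error.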
nozzle.html.md.erb (+16 -16)
@@ -4,13 +4,13 @@ owner: Services
 ---

-This topic explains how to integrate <%= vars.platform_name %> services with Cloud Foundry's logging system, _Loggregator_, by writing to and reading from its _Firehose_ endpoint.
+This topic explains how to integrate services with Cloud Foundry's logging system, _Loggregator_, by writing to and reading from its _Firehose_ endpoint.

 ## <a id="overview"></a> Overview

-Cloud Foundry's Loggregator logging system collects logs and metrics from <%= vars.platform_name %> apps and platform components and streams them to a single endpoint, Firehose. Your tile can integrate its service with Loggregator in two ways:
+Cloud Foundry's Loggregator logging system collects logs and metrics from apps and platform components and streams them to a single endpoint, Firehose. Your tile can integrate its service with Loggregator in two ways:

-* By sending your service component logs and metrics to the Firehose, to be streamed along with <%= vars.platform_name %> core platform component logs and metrics
+* By sending your service component logs and metrics to the Firehose, to be streamed along with core platform component logs and metrics

 * By installing a _nozzle_ on Firehose that directs Firehose data to be consumed by external services or apps -- a built-in nozzle can enable a service to:
   - Drain metrics to an external dashboard product for system operators
@@ -22,9 +22,9 @@ For a real world production example of a nozzle see [Firehose-to-syslog](https:/
 ## <a id="firehose"></a> Firehose Communication

-<%= vars.platform_name %> components publish logs and metrics to the Firehose through Loggregator agent processes that run locally on the component VMs. Loggregator agents input the data to the Loggregator system through a co-located Loggregator agent. To see how logs and metrics travel from <%= vars.platform_name %> system components to the Firehose, see the [Cloud Foundry documentation](https://docs.cloudfoundry.org/loggregator/architecture.html).
+<%= vars.app_runtime_full %> components publish logs and metrics to the Firehose through Loggregator agent processes that run locally on the component VMs. Loggregator agents input the data to the Loggregator system through a co-located Loggregator agent. To see how logs and metrics travel from <%= vars.app_runtime_abbr %> system components to the Firehose, see the [Cloud Foundry documentation](https://docs.cloudfoundry.org/loggregator/architecture.html).

-Component VMs running <%= vars.platform_name %> services can publish logs and metrics the same way, by including the same component, Loggregator Agent. Historically, components used Metron for this communication.
+Component VMs running <%= vars.app_runtime_abbr %> services can publish logs and metrics the same way, by including the same component, Loggregator Agent. Historically, components used Metron for this communication.

 ### <a id="https"></a> HTTPS Protocol
@@ -79,11 +79,11 @@ Do so with the following properties:
 ## <a id="nozzle"></a> Nozzles

 A nozzle is a component dedicated to reading and processing data that streams from Firehose.
-A service tile can install a nozzle as either a managed service, with package type `bosh-release`, or as an app pushed to Pivotal Application Service (PAS), with the package type `app`.
+A service tile can install a nozzle as either a managed service, with package type `bosh-release`, or as an app pushed to <%= vars.app_runtime_abbr %>, with the package type `app`.

 ### <a id="develop"></a> Develop a Nozzle

-Pivotal recommends developing a nozzle in Go to leverage the
+<%= vars.company_name %> recommends developing a nozzle in Go to leverage the

-has source code based on `firehose-to-syslog` and is packaged to run as an app on <%= vars.platform_name %>.
+has source code based on `firehose-to-syslog` and is packaged to run as an app on <%= vars.app_runtime_abbr %>.
 * [datadog-firehose-nozzle](https://github.com/DataDog/datadog-firehose-nozzle) is
 another real world implementation.

-## <a id="syslog-format-pcf"></a> Log Format for <%= vars.platform_name %> Components
+## <a id="syslog-format-pcf"></a> Log Format for <%= vars.app_runtime_abbr %> Components

-Pivotal's standard log format adheres to the [RFC-5424 syslog protocol](https://tools.ietf.org/html/rfc5424), with log messages formatted as follows:
+The standard log format for <%= vars.app_runtime_abbr %> adheres to the [RFC-5424 syslog protocol](https://tools.ietf.org/html/rfc5424), with log messages formatted as follows:

 The [Syslog Message Elements table](#syslog-elements) immediately below describes each element of the log, and the [Structured Instance Data Format](#sd-element) table describes the contents of the structured data element that carries Cloud Foundry VM instance information.
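As a concrete illustration, the following sketch prints one hypothetical RFC-5424 line assembled from the elements the tables below describe. Every field value here (the IP address, app name, SD-ID, and instance identifiers) is invented for illustration and is not taken from a real deployment:

```shell
# Hypothetical RFC-5424 message; all field values below are invented.
# Layout: <PRI>VERSION TIMESTAMP HOST APP-NAME PROC-ID MSG-ID [SD-ELEMENT] MESSAGE
msg='<13>1 2024-01-15T10:00:00.000000+00:00 10.0.1.2 my-component - - [instance@12345 deployment="service-deployment" group="service-group" az="us-west" id="abc-123"] {"event":"started"}'
echo "$msg"
```

Note the `13` priority (Notice-level, as recommended below when in doubt) and the hyphens for the process ID and message ID fields, indicating unknown values.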
 ### <a id="syslog-elements"></a> Syslog Message Elements

-This table describes each element of a standard <%= vars.platform_name %> syslog message.
+This table describes each element of a standard <%= vars.app_runtime_abbr %> syslog message.
@@ -216,7 +216,7 @@ This table describes each element of a standard <%= vars.platform_name %> syslog
 </tr><tr>
 <td><code>${PRI}</code></td>
 <td><p><a href="https://tools.ietf.org/html/rfc5424#section-6.2.1">Priority value (PRI)</a>, calculated as <code>8 × Facility Code + Severity Code</code></p>
-<p>Pivotal uses a Facility Code value of <code>1</code>, indicating a user-level facility. This adds <code>8</code> to the RFC-5424 Severity Codes, resulting in the numbers listed in the <a href="#severity-codes">table below</a>.</p>
+<p><%= vars.app_runtime_abbr %> uses a Facility Code value of <code>1</code>, indicating a user-level facility. This adds <code>8</code> to the RFC-5424 Severity Codes, resulting in the numbers listed in the <a href="#severity-codes">table below</a>.</p>
 <p>If in doubt, default to <code>13</code>, to indicate Notice-level severity.</p>
 </td>
 </tr><tr>
@@ -244,7 +244,7 @@ This table describes each element of a standard <%= vars.platform_name %> syslog
 <td>The <a href="https://tools.ietf.org/html/rfc5424#section-6.2.7">type</a> of log message. If this is not easily available, default to <code>-</code> (hyphen) to indicate unknown.</td>
 </tr><tr>
 <td><code>${SD-ELEMENT-instance}</code></td>
-<td>Structured data (SD) relevant to <%= vars.platform_name %> about the <a href="https://tools.ietf.org/html/rfc5424#section-6.3.1">source instance (VM)</a> that originates the log message. See the <a href="#sd-element">Structured Instance Data Format table</a> below for content and format.</td>
+<td>Structured data (SD) relevant to <%= vars.app_runtime_abbr %> about the <a href="https://tools.ietf.org/html/rfc5424#section-6.3.1">source instance (VM)</a> that originates the log message. See the <a href="#sd-element">Structured Instance Data Format table</a> below for content and format.</td>
 </tr><tr>
 <td><code>${MESSAGE}</code></td>
 <td>The log message itself, ideally in JSON</td>
@@ -253,7 +253,7 @@ This table describes each element of a standard <%= vars.platform_name %> syslog
 ### <a id="severity"></a> RFC-5424 Severity Codes

-<%= vars.platform_name %> components generate log messages with the following severity levels. The most common severity level is `13`.
+<%= vars.app_runtime_abbr %> components generate log messages with the following severity levels. The most common severity level is `13`.
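The PRI arithmetic behind these numbers can be checked directly. Using the user-level Facility Code of `1` described above and the RFC-5424 Notice Severity Code of `5`:

```shell
# PRI = 8 × Facility Code + Severity Code
facility=1   # user-level facility, as described above
severity=5   # RFC-5424 Notice
pri=$(( 8 * facility + severity ))
echo "$pri"  # prints 13, the most common value cited above
```

This is why the table's values are each the plain RFC-5424 severity code plus `8`.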
@@ -288,7 +288,7 @@ This table describes each element of a standard <%= vars.platform_name %> syslog
 ### <a id="sd-element"></a> Structured Instance Data Format

-The RFC-5424 syslog protocol includes a [structured data element](https://tools.ietf.org/html/rfc5424#section-6.3.1) that people can use as they see fit. Pivotal uses this element to carry VM instance information as follows:
+The RFC-5424 syslog protocol includes a [structured data element](https://tools.ietf.org/html/rfc5424#section-6.3.1) that people can use as they see fit. <%= vars.app_runtime_abbr %> uses this element to carry VM instance information as follows:
ssi-creds-tiledev.html.md.erb (+10 -10)
@@ -4,13 +4,13 @@ owner: Services
 ---

-This topic describes how to develop your <%= vars.platform_name %> service tile to support secure service instance (SSI) credentials using Runtime CredHub. For more information about Runtime CredHub, see [Runtime CredHub](https://docs.pivotal.io/pivotalcf/credhub/#runtime) in the _CredHub_ topic.
+This topic describes how to develop your <%= vars.platform_name %> service tile to support secure service instance (SSI) credentials using runtime CredHub. For more information about runtime CredHub, see [Runtime CredHub](https://docs.pivotal.io/pivotalcf/credhub/#runtime).

 ## <a id="background"></a> Background

 When developers bind an app to a service instance, the binding typically includes _binding credentials_ required to access the service.

-In <%= vars.platform_name %> (formerly PCF) and later, service brokers can store binding credentials as SSI credentials in runtime CredHub and apps can retrieve these credentials from CredHub. This secures service instance credential management by avoiding the following:
+In <%= vars.app_runtime_full %>, service brokers can store binding credentials as SSI credentials in runtime CredHub and apps can retrieve these credentials from CredHub. This secures service instance credential management by avoiding the following:

 * Leaking environment variables to logs, which increases risk of disclosure.
 * Sending credentials between components, which increases risk of disclosure.
@@ -26,23 +26,23 @@ To store binding credentials in runtime CredHub, your service tile needs to supp
 SSI credentials, which let apps access services through service instances, are distinct from the credentials that service tiles store in [BOSH CredHub](./credhub.html) for their own internal use.

-When a service uses SSI credentials, its service broker stores the binding credentials in runtime CredHub. Then, when PAS binds an app to an instance of the service, the broker retrieves the credentials from runtime CredHub and delivers them to the Cloud Controller (CC) to enable the app to access the service.
+When a service uses SSI credentials, its service broker stores the binding credentials in runtime CredHub. Then, when <%= vars.app_runtime_abbr %> binds an app to an instance of the service, the broker retrieves the credentials from runtime CredHub and delivers them to the Cloud Controller (CC) to enable the app to access the service.

-These SSI credentials are different from credentials that the tile uses internally, for example, to give the service broker access to an internal database. PAS generates the internal tile credentials for a service when the service is first installed and stores them in BOSH CredHub, not runtime CredHub.
+These SSI credentials are different from credentials that the tile uses internally, for example, to give the service broker access to an internal database. <%= vars.app_runtime_abbr %> generates the internal tile credentials for a service when the service is first installed and stores them in BOSH CredHub, not runtime CredHub.

 For more information on the CredHub credential management component, see the [CredHub documentation](https://docs.cloudfoundry.org/credhub/index.html).

 The sections below describe an example implementation of how to add SSI credentials functionality to a service tile.
 ## <a id="step1"></a> Step 1: Modify Your BOSH Release

-To use runtime CredHub, your service tile needs to retrieve the location of the CredHub server, which is published in the Pivotal Application Service (PAS) tile, through a BOSH link.
+To use runtime CredHub, your service tile needs to retrieve the location of the CredHub server, which is published in the <%= vars.app_runtime_abbr %> tile, through a BOSH link.

 <p class="note"><strong>Note</strong>: BOSH links let multiple jobs share deployment-time configuration properties. This helps to avoid redundant configurations in BOSH releases and deployment manifests. For more information, see <a href="http://docs.pivotal.io/tiledev/1-11/release-notes.html#bosh-links">BOSH Links</a>.</p>

 ### <a id="spec"></a> Update Spec File and Templates

-The location of runtime CredHub is stored in the `credhub.internal_url` and `credhub.port` properties of the PAS tile. To enable your service tile to retrieve these CredHub-provided properties, add a `consumes:` section with the BOSH link from the PAS tile to the spec file of the BOSH job that will use them and edit the job's templates to access the values in the link:
+The location of runtime CredHub is stored in the `credhub.internal_url` and `credhub.port` properties of the <%= vars.app_runtime_abbr %> tile. To enable your service tile to retrieve these CredHub-provided properties, add a `consumes:` section with the BOSH link from the <%= vars.app_runtime_abbr %> tile to the spec file of the BOSH job that will use them and edit the job's templates to access the values in the link:

 ```
 consumes:
@@ -54,9 +54,9 @@ For information about using BOSH Links in the spec file and templates of a job a
 ### <a id="errand"></a> Save the Runtime CredHub Location

-To use the runtime CredHub location retrieved from the PAS tile, you must write a `post_deploy` [tile errand](tile-structure.html#errands) that saves the value in some way and enables the service broker to access it.
+To use the runtime CredHub location retrieved from the <%= vars.app_runtime_abbr %> tile, you must write a `post_deploy` [tile errand](tile-structure.html#errands) that saves the value in some way and enables the service broker to access it.

-Depending on how your tile deploys the service broker app, the service instance errand can save the CredHub location in different ways. If the tile pushes the broker as a Cloud Foundry app, the errand can store the location in an environment variable such as `CREDHUB_URL` for the service broker to call. If BOSH deploys the service broker outside of PAS, the errand could write the CredHub location out to a templated configuration file that the service broker reads.
+Depending on how your tile deploys the service broker app, the service instance errand can save the CredHub location in different ways. If the tile pushes the broker as a Cloud Foundry app, the errand can store the location in an environment variable such as `CREDHUB_URL` for the service broker to call. If BOSH deploys the service broker outside of <%= vars.app_runtime_abbr %>, the errand could write the CredHub location out to a templated configuration file that the service broker reads.

-In the example, the runtime CredHub instance can be accessed at `credhub.service.cf.internal`. If your broker runs as an app, you can resolve this address with BOSH DNS. If your broker runs on a VM with a Consul agent, you can resolve the address with Consul. Alternatively, from a VM, you can resolve the address with `dig credhub.service.cf.internal @169.254.0.2`. This command uses the PAS BOSH DNS server to do the lookup.
+In the example, the runtime CredHub instance can be accessed at `credhub.service.cf.internal`. If your broker runs as an app, you can resolve this address with BOSH DNS. If your broker runs on a VM with a Consul agent, you can resolve the address with Consul. Alternatively, from a VM, you can resolve the address with `dig credhub.service.cf.internal @169.254.0.2`. This command uses the <%= vars.app_runtime_abbr %> BOSH DNS server to do the lookup.
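A minimal sketch of such a `post_deploy` errand, assuming the broker is pushed as a Cloud Foundry app. The host comes from the example above; the port value and broker app name are placeholders, not values taken from a real tile:

```shell
# Compose the runtime CredHub URL from the link-provided host and port.
# The port value and app name here are placeholders.
credhub_host="credhub.service.cf.internal"
credhub_port="8844"
CREDHUB_URL="https://${credhub_host}:${credhub_port}"
echo "$CREDHUB_URL"

# The errand might then hand the value to the pushed broker app, for example:
#   cf set-env BROKER-APP-NAME CREDHUB_URL "$CREDHUB_URL"
#   cf restage BROKER-APP-NAME
```

For a BOSH-deployed broker, the same composed value would instead be written into the templated configuration file that the broker reads, as described above.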

 ## <a id="step3"></a> Step 3: Provide Operators with the Choice to Use CredHub

@@ -111,7 +111,7 @@ form_types:
 property_inputs:
 - reference: .JOB-NAME.secure_credentials
   label: Secure service instance credentials
-  description: "When checked, service instance credentials are stored in CredHub. Enable only when installing with <%= vars.platform_name %> (formerly PCF) or later and this feature is also enabled in the PAS tile."
+  description: "When checked, service instance credentials are stored in CredHub. Enable only when installing with Ops Manager and this feature is also enabled in the <%= vars.app_runtime_abbr %> tile."