
Commit 09e0850

Binary data converter (#90)
1 parent: 4825d7a

25 files changed (+734, -1963 lines)

.github/workflows/python.yaml

Lines changed: 3 additions & 0 deletions
@@ -22,6 +22,9 @@ jobs:
         run: 'echo head_ref: ${{ github.head_ref }}, ref: ${{ github.ref }}, python repo ref: ${{ inputs.python-repo-ref }}'
         working-directory: '.'

+      - name: Install Protoc
+        uses: arduino/setup-protoc@v1
+
       - name: Checkout SDK features repo
         uses: actions/checkout@v2
         with:

.gitignore

Lines changed: 8 additions & 1 deletion
@@ -14,4 +14,11 @@ node_modules
 /tslib/

 # Python stuff
-__pycache__
+__pycache__
+
+# Build Go stuff
+bin
+
+# VS Code config
+.vscode
+
README.md

Lines changed: 42 additions & 37 deletions
@@ -5,11 +5,11 @@ contains a runner and language-specific harnesses to confirm feature behavior ac

 These SDK features serve several purposes:

-* Ensure parity across SDKs by having same-feature snippets adjacent to one another
-* Confirm feature behavior across SDK versions
-* Confirm history across SDK versions
-* Document features in different SDKs
-* Easy-to-use environment for writing quick workflows in all languages/versions
+- Ensure parity across SDKs by having same-feature snippets adjacent to one another
+- Confirm feature behavior across SDK versions
+- Confirm history across SDK versions
+- Document features in different SDKs
+- Easy-to-use environment for writing quick workflows in all languages/versions

 ## Building

@@ -87,38 +87,42 @@ the `features/` directory.
 In addition to code for the feature, there are configuration settings that can be set in `.config.json`. The possible
 settings are:

-* `go`
-* `minVersion` - Minimum version in Go this feature should be run in. The feature will be skipped in older versions.
+- `go`
+- `minVersion` - Minimum version in Go this feature should be run in. The feature will be skipped in older versions.

 There are also files in the `history/` subdirectory which contain history files used during run. See the
 "History Checking" and "Generating History" sections for more info.

 ### Best Practices

-* Try to only demonstrate/test one feature per feature directory.
-* Code should be kept as short and clear as possible.
-* No need to over-assert on a bunch of values, just confirm that the feature does what is expected via its output.
-* A Go feature should be in `feature.go`.
-* For incompatible versions, different files like `feature_pre1.11.0.go` can be present using build tags
-* A Java feature should be in `feature.java`.
-* A TypeScript feature should be in `feature.ts` for all non-workflow code and `feature.workflow.ts` for all workflow
-  code.
-* A Python feature should be in `feature.py`.
-* Add a README.md to each feature directory.
-* README should have a title summarizing the feature (only first letter needs to be in title case), then a short
+- Try to only demonstrate/test one feature per feature directory.
+- Code should be kept as short and clear as possible.
+- No need to over-assert on a bunch of values, just confirm that the feature does what is expected via its output.
+- A Go feature should be in `feature.go`.
+- For incompatible versions, different files like `feature_pre1.11.0.go` can be present using build tags
+- A Java feature should be in `feature.java`.
+- A TypeScript feature should be in `feature.ts`.
+
+  **NOTE**: TypeScript features include workflow and non-workflow code in the same file. Those are run in different
+  environments so they may not share variables, and the feature author should keep the workflow runtime limitations in
+  mind when writing features.
+
+- A Python feature should be in `feature.py`.
+- Add a README.md to each feature directory.
+- README should have a title summarizing the feature (only first letter needs to be in title case), then a short
 paragraph explaining the feature and its purpose, and then optionally another paragraph explaining details of the
 specific code steps.
-* Other sections can also explain decisions made in language specific ways or other details about versions/approaches.
-* Feel free to add links and more text as necessary.
-* Verification/regression feature directories for bugs should be under `features/bugs/<lang>`.
-* Ideally the checking of the result has a version condition that shows in earlier versions it should fail and in
+- Other sections can also explain decisions made in language specific ways or other details about versions/approaches.
+- Feel free to add links and more text as necessary.
+- Verification/regression feature directories for bugs should be under `features/bugs/<lang>`.
+- Ideally the checking of the result has a version condition that shows in earlier versions it should fail and in
 newer versions it should succeed.
-* The more languages per non-bug feature, the better. Try not to create non-bug features that use specific language
+- The more languages per non-bug feature, the better. Try not to create non-bug features that use specific language
 constructs unless that's the purpose of the feature.
-* Refactor liberally to create shortcuts and better harnesses for making features easy to read and write.
-* This is not a library used by anyone, there are no backwards compatibility concerns. If one feature uses something
+- Refactor liberally to create shortcuts and better harnesses for making features easy to read and write.
+- This is not a library used by anyone, there are no backwards compatibility concerns. If one feature uses something
 that has value to another, extract and put in helper/harness and have both use it.
-* History should be generated for each feature on the earliest version the feature is expected to work at without
+- History should be generated for each feature on the earliest version the feature is expected to work at without
 history changes.

 #### Generating History
@@ -132,15 +136,16 @@ incompatibility. Otherwise, history files should remain checked in and not regen

 ## TODO

-* Add support for replaying testing of all versions _inside_ each SDKs harness as part of the run
-* Add TypeScript support
-* The main support is present, but there are outstanding questions on what constitutes a "version" since really
+- Add support for replaying testing of all versions _inside_ each SDKs harness as part of the run
+- Add TypeScript support
+- The main support is present, but there are outstanding questions on what constitutes a "version" since really
 TypeScript has many versions
-* Add many more feature workflows
-* Document how to use this framework to easily write and test features even when not committing
-* Log swallowing and concurrent execution
-* Investigate support for changing runtime versions (i.e. Go, Java, and Node versions)
-* Investigate support for changing server versions
-* CI support
-* Support using a commit hash and alternative git location for an SDK to run against
-* Decide whether the matrix of different SDK versions and such is really part of this repo or part of CI tooling
+- Add many more feature workflows
+- Document how to use this framework to easily write and test features even when not committing
+- Log swallowing and concurrent execution
+- Investigate support for changing runtime versions (i.e. Go, Java, and Node versions)
+- Investigate support for changing server versions
+- CI support
+- Support using a commit hash and alternative git location for an SDK to run against
+- Decide whether the matrix of different SDK versions and such is really part of this repo or part of CI tooling
+
features/activity/cancel_try_cancel/feature.ts

Lines changed: 11 additions & 21 deletions
@@ -1,10 +1,11 @@
 import { Context } from '@temporalio/activity';
-import { WorkflowClient } from '@temporalio/client';
-import { CancelledFailure, ActivityFailure, ApplicationFailure } from '@temporalio/common';
-import { Feature } from '@temporalio/harness';
+import { CancelledFailure } from '@temporalio/common';
+import { Feature, getWorkflowClient } from '@temporalio/harness';
+import { ActivityFailure, ApplicationFailure } from '@temporalio/common';
 import * as wf from '@temporalio/workflow';

-const { cancellableActivity } = wf.proxyActivities<typeof activitiesImpl>({
+// Allow 4 retries with no backoff
+const activities = wf.proxyActivities<typeof activitiesImpl>({
   startToCloseTimeout: '1 minute',
   heartbeatTimeout: '5 seconds',
   // Disable retry
@@ -18,7 +19,7 @@ export async function workflow(): Promise<void> {
   try {
     await wf.CancellationScope.cancellable(async () => {
       // Start activity
-      const actPromise = cancellableActivity();
+      const actPromise = activities.cancellableActivity();

       // Sleep for smallest amount of time (force task turnover)
       await wf.sleep(1);
@@ -43,14 +44,9 @@ export async function workflow(): Promise<void> {
   }
 }

-let client: WorkflowClient | undefined;
-
 const activitiesImpl = {
   async cancellableActivity() {
-    // Expect client to be set
-    if (!client) {
-      throw new Error('Missing client');
-    }
+    const client = getWorkflowClient();

     // Heartbeat every second for a minute
     let result = 'timeout';
@@ -76,13 +72,7 @@ const activitiesImpl = {
   },
 };

-export const feature =
-  !wf.inWorkflowContext() &&
-  new Feature({
-    workflow,
-    activities: activitiesImpl,
-    execute: async (runner) => {
-      client = runner.client;
-      return await runner.executeSingleParameterlessWorkflow();
-    },
-  });
+export const feature = new Feature({
+  workflow,
+  activities: activitiesImpl,
+});

features/activity/retry_on_error/feature.ts

Lines changed: 14 additions & 16 deletions
@@ -28,19 +28,17 @@ const activitiesImpl = {
   },
 };

-export const feature =
-  !wf.inWorkflowContext() &&
-  new Feature({
-    workflow,
-    activities: activitiesImpl,
-    checkResult: async (runner, handle) => {
-      await assert.rejects(runner.waitForRunResult(handle), (err) => {
-        assert.ok(
-          err instanceof WorkflowFailedError,
-          `expected WorkflowFailedError, got ${typeof err}, message: ${(err as any).message}`
-        );
-        assert.equal(err.cause?.cause?.message, 'activity attempt 5 failed');
-        return true;
-      });
-    },
-  });
+export const feature = new Feature({
+  workflow,
+  activities: activitiesImpl,
+  checkResult: async (runner, handle) => {
+    await assert.rejects(runner.waitForRunResult(handle), (err) => {
+      assert.ok(
+        err instanceof WorkflowFailedError,
+        `expected WorkflowFailedError, got ${typeof err}, message: ${(err as any).message}`
+      );
+      assert.equal(err.cause?.cause?.message, 'activity attempt 5 failed');
+      return true;
+    });
+  },
+});

features/data_converter/binary/README.md

Lines changed: 6 additions & 5 deletions
@@ -2,12 +2,13 @@

 Binary values can be converted to and from `binary/plain` Payloads.

-This feature:
+Steps:

-- runs the binary value `101` (5) through the default Payload Converter, writes it to `payloads/binary.[lang]`, and
-  verifies it matches the other files in `payloads/`
-- decodes all files in `payloads/` with the default Payload Converter and verifies the binary value is `101`
+- run a workflow that returns binary value `0xdeadbeef`
+- verify client result is binary `0xdeadbeef`
+- get result payload of WorkflowExecutionCompleted event from workflow history
+- load JSON payload from `./payload.json` and compare it to result payload

 # Detailed spec

-`metadata.encoding = toBinary("binary/plain")`
+`metadata.encoding = toBinary("binary/plain")`
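For context on the spec line above, here is a minimal Go sketch (an illustration, not part of this commit) of how the default data converter produces a `binary/plain` payload for a `[]byte` value, assuming the `go.temporal.io/sdk/converter` package and the same gogo `jsonpb` marshaling the feature code below relies on. The base64 strings in the trailing comment are derived from the values, not copied from the commit.

```go
package main

import (
    "fmt"

    "github.com/gogo/protobuf/jsonpb"
    "go.temporal.io/sdk/converter"
)

func main() {
    // The default data converter encodes []byte values as a "binary/plain" payload
    // (metadata.encoding = "binary/plain", data = the raw bytes).
    payload, err := converter.GetDefaultDataConverter().ToPayload([]byte{0xde, 0xad, 0xbe, 0xef})
    if err != nil {
        panic(err)
    }

    // Print the proto-JSON form of the payload; bytes fields are base64-encoded,
    // which is the shape a checked-in payload.json would hold.
    out, err := (&jsonpb.Marshaler{Indent: "  "}).MarshalToString(payload)
    if err != nil {
        panic(err)
    }
    fmt.Println(out)
    // Expected output (roughly):
    // {
    //   "metadata": {"encoding": "YmluYXJ5L3BsYWlu"},  // base64("binary/plain")
    //   "data": "3q2+7w=="                             // base64(0xde 0xad 0xbe 0xef)
    // }
}
```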
features/data_converter/binary/feature.go

Lines changed: 74 additions & 0 deletions

@@ -0,0 +1,74 @@
+package binary
+
+import (
+    "bytes"
+    "context"
+    "errors"
+    "fmt"
+    "os"
+    "path"
+
+    "github.com/gogo/protobuf/jsonpb"
+    common "go.temporal.io/api/common/v1"
+    historyProto "go.temporal.io/api/history/v1"
+    "go.temporal.io/sdk-features/harness/go/harness"
+    "go.temporal.io/sdk/client"
+    "go.temporal.io/sdk/workflow"
+)
+
+var EXPECTED_RESULT = []byte{0xde, 0xad, 0xbe, 0xef}
+
+var Feature = harness.Feature{
+    Workflows:   Workflow,
+    CheckResult: CheckResult,
+}
+
+// run a workflow that returns binary value `0xdeadbeef`
+func Workflow(ctx workflow.Context) ([]byte, error) {
+    return EXPECTED_RESULT, nil
+}
+
+func CheckResult(ctx context.Context, runner *harness.Runner, run client.WorkflowRun) error {
+    // verify client result is binary `0xdeadbeef`
+    result := make([]byte, 4)
+    if err := run.Get(ctx, &result); err != nil {
+        return err
+    }
+    if !bytes.Equal(result, EXPECTED_RESULT) {
+        return fmt.Errorf("invalid result: %v", result)
+    }
+    history := runner.Client.GetWorkflowHistory(ctx, run.GetID(), "", false, 0)
+
+    var attrs *historyProto.WorkflowExecutionCompletedEventAttributes
+
+    for history.HasNext() {
+        ev, err := history.Next()
+        if err != nil {
+            return err
+        }
+        // get result payload of WorkflowExecutionCompleted event from workflow history
+        attrs = ev.GetWorkflowExecutionCompletedEventAttributes()
+        if attrs != nil {
+            break
+        }
+    }
+    if attrs == nil {
+        return errors.New("could not locate WorkflowExecutionCompleted event")
+    }
+    payload := attrs.GetResult().GetPayloads()[0]
+
+    // load JSON payload from `./payload.json` and compare it to result payload
+    file, err := os.Open(path.Join(runner.Feature.AbsDir, "../../../features/data_converter/binary/payload.json"))
+    if err != nil {
+        return err
+    }
+
+    expectedPayload := &common.Payload{}
+    unmarshaler := jsonpb.Unmarshaler{}
+    err = unmarshaler.Unmarshal(file, expectedPayload)
+    if err != nil {
+        return err
+    }
+    runner.Require.Equal(expectedPayload, payload)
+    return nil
+}
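The `CheckResult` above scans the full history iterator for the completion event. As a hypothetical alternative (a sketch, not from this commit), the client can be asked for only the close event by passing the filter type from `go.temporal.io/api/enums/v1`, reusing the same harness types as the feature code:

```go
package binary

import (
    "context"
    "errors"

    enumspb "go.temporal.io/api/enums/v1"
    historypb "go.temporal.io/api/history/v1"
    "go.temporal.io/sdk-features/harness/go/harness"
    "go.temporal.io/sdk/client"
)

// completedAttrs fetches only the workflow's close event and returns its
// WorkflowExecutionCompleted attributes, instead of scanning the full history.
func completedAttrs(ctx context.Context, runner *harness.Runner, run client.WorkflowRun) (*historypb.WorkflowExecutionCompletedEventAttributes, error) {
    iter := runner.Client.GetWorkflowHistory(ctx, run.GetID(), "", false,
        enumspb.HISTORY_EVENT_FILTER_TYPE_CLOSE_EVENT)
    if !iter.HasNext() {
        return nil, errors.New("no close event found")
    }
    ev, err := iter.Next()
    if err != nil {
        return nil, err
    }
    attrs := ev.GetWorkflowExecutionCompletedEventAttributes()
    if attrs == nil {
        return nil, errors.New("workflow did not complete successfully")
    }
    return attrs, nil
}
```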
features/data_converter/binary/feature.java

Lines changed: 53 additions & 0 deletions

@@ -0,0 +1,53 @@
+package data_converter.binary;
+
+import io.temporal.api.common.v1.Payload;
+import io.temporal.sdkfeatures.Feature;
+import io.temporal.sdkfeatures.Run;
+import io.temporal.sdkfeatures.Runner;
+import io.temporal.workflow.WorkflowInterface;
+import io.temporal.workflow.WorkflowMethod;
+
+import static org.junit.jupiter.api.Assertions.assertArrayEquals;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+
+import java.nio.file.Files;
+import java.nio.file.Paths;
+
+import com.google.protobuf.util.JsonFormat;
+
+@WorkflowInterface
+public interface feature extends Feature {
+    static byte[] deadbeef = new byte[]{(byte)0xde, (byte)0xad, (byte)0xbe, (byte)0xef};
+
+    @WorkflowMethod
+    public byte[] workflow();
+
+    class Impl implements feature {
+        /**
+         * run a workflow that returns binary value `0xdeadbeef`
+         */
+        @Override
+        public byte[] workflow() {
+            return deadbeef;
+        }
+
+        @Override
+        public void checkResult(Runner runner, Run run) throws Exception {
+            // verify client result is binary `0xdeadbeef`
+            var result = runner.waitForRunResult(run, byte[].class);
+            assertArrayEquals(deadbeef, result);
+
+            // get result payload of WorkflowExecutionCompleted event from workflow history
+            var history = runner.getWorkflowHistory(run);
+            var event = history.getEventsList().stream().filter(e -> e.hasWorkflowExecutionCompletedEventAttributes()).findFirst();
+            var payload = event.get().getWorkflowExecutionCompletedEventAttributes().getResult().getPayloads(0);
+
+            // load JSON payload from `./payload.json` and compare it to JSON representation of result payload
+            var content = Files.readAllBytes(Paths.get(System.getProperty("user.dir"), "..", "features", runner.featureInfo.dir, "payload.json"));
+            var builder = Payload.newBuilder();
+            JsonFormat.parser().merge(new String(content), builder);
+            var expected = builder.build();
+            assertEquals(expected, payload);
+        }
+    }
+}
