This repository was archived by the owner on Jan 8, 2025. It is now read-only.

Commit 0adfdbc, authored by erichendrickson
Backended (#14)
* Updated to include first backend articles
* Added reference
* Update credentials-setup.md
* Update credentials-setup.md
* Changed titles of articles
* Updated some info in document
* Added basic Lambda tutorial
* Updated chapters in README
* Updated intro to AWS cloud
* Update introduction-to-lambda.md
* Added API Gateway chapter
* 3. Advanced Topics ii. How to Pass a File Through API Gateway take 2 (#4)
* added How to Pass a File Through API Gateway in markdown format
* Update pass-file-through-API-gateway.md
* Update pass-file-through-API-gateway.md
* Lambda transfer buckets (#3)
* set up directory for guide and images, link to guide in readme
* add info about creating buckets
* add completed doc for using Lambda to transfer files between buckets
* remove 'testtesttest' from top of readme.md (#5)
* fix typo (#6)
* added How to Pass a File Through API Gateway in markdown format
* changes per review
* Finished IAM article
* image fixes
* image fixes 2
* Reorganized folders
* changes per review 2
* Started mfa article
* Delete image12.PNG
* Delete image7.PNG
* Reformatted pass-file-through-API-gateway.md
* More formatting
* Finished formatting api gateway advanced doc
* Update pass-file-through-API-gateway.md
* Update pass-file-through-API-gateway.md
* Update pass-file-through-API-gateway.md
* Making formatting changes to my files
* Reformatting introduction-to-lambda.md
* Updated my api gateway doc
* Added mocking article
* Update aws-sdk-mock.md
* Fixed formatting, unlinked one article
* Added environmental variables article
* Reorganized chapter hierarchy
* Delete .DS_Store
* Update readme.md
* Started the writing to DynamoDB tutorial
* Added SES tutorial
* Delete image2.png
* Delete image3.png
* Update ses-lambda.md
* Added new article title
* Finished MFA article
* Reverted numbering of Fundamental Technologies back to original numbering
* Updated image references in credentials-setup.md
* Updated another image reference
* Readded images
* Update credentials-setup.md
* Added reference to readme
* Update mfa.md
* Update mfa.md
* Added .DS_Store to .gitignore
* Cropped unnecessary bits from mobile screenshots
* Removed old versions of screenshots
* Returned new versions of screenshots
* Removed old versions of screenshots (attempt 2)
* Re-added new versions of screenshots (attempt 2)
* Fixed typo
* Added S3 DynamoDB chapter
* Renamed some files, updated readme
* Added more information on the IAM article
* Update readme.md
* Edited image
* Using SNS and Lambda to Send a Random 6 Digit Number Via Text Message (#10)
* added How to Pass a File Through API Gateway in markdown format
* minor changes
* started documentation on SMS being sent by SNS
* minor changes
* sms function working, writing documentation
* SNS 6 digit verification code SMS complete, ready for review
* changes made to SMS text verification documentation per review
* Update send-sms-code-with-sns.md
* Lambda write to dynamodb (#12)
* complete documentation of initial lambda function, add beginning of test tutorial
* complete documentation about writing to dynamodb with lambda function, delete unused image files
* crop images/1.png
* proof-reading
* fix inline code snippets, remove explanation of iam and instead link to appropriate doc, remove unused images
* implement requested changes from review
* Update lambda-dynamodb.md
* change examples of index.js to move const docClient declaration inside handler function. Move explanation of docClient accordingly to make sense in context
* style all instances of 'index.js' as code snippets
* added promisified docClient.put
* update screenshot showing error for invalid params
* Update javascript.md
* Added Promises chapter
* Update lambda-transfer-buckets.md
* Update lambda-dynamodb.md
* Update env-variables.md
* Update introduction-to-lambda.md
* Added Promise chapter and removed Authorship info
1 parent 6c3f98b commit 0adfdbc

130 files changed (+1926, -0 lines)
.gitignore (1 line added)

```
.DS_Store
```
# Mock Testing the AWS SDK

Testing is an important part of developing a project: it makes the end product reliable. Sometimes the functions you are testing have a few dependencies, usually on an outside source, that can make the unit testing phase tricky. In certain cases, running a unit test against live dependencies is impractical or impossible. For these cases, there is a technique we can use to overcome the obstacle: "mocking".

Mocking simulates complex dependencies for use in testing. This avoids triggering actual live dependencies or having to meet specific criteria in order to execute your function. Mocking libraries are usually assigned to variables at the top of the testing file, with some exceptions we will go over later. In this example, we will use the mocking library for the Amazon Web Services software development kit (i.e. `aws-sdk`).
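Before reaching for `aws-sdk-mock`, it helps to see the idea in miniature. The sketch below (all names are hypothetical) replaces a real uploader with a fake that resolves immediately, so the code under test runs without touching AWS:

```javascript
// A function under test that depends on an injected "uploader" (hypothetical example).
function saveResume(uploader, filename, file) {
  return uploader({ Key: filename, Body: file }).then(function (result) {
    return result.Location;
  });
}

// In production this would be a real S3 client call; in a test we mock it.
const mockUpload = function (params) {
  return Promise.resolve({ Location: 'https://example.com/' + params.Key });
};

// The test can now run with no credentials and no network access.
saveResume(mockUpload, 'resume.pdf', 'file-bytes').then(function (url) {
  console.log(url); // https://example.com/resume.pdf
});
```

`aws-sdk-mock` automates exactly this substitution for you, intercepting the SDK's own classes instead of requiring you to inject the dependency manually.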
Below is a barebones S3 upload Lambda function that returns a URL once the upload completes. Notice that this function relies on the AWS SDK required on the first line, and that it creates a new S3 client inside the function.
```javascript
const AWS = require('aws-sdk');

module.exports = function(filename, file, data) {
  const s3 = new AWS.S3();
  let params = {
    Bucket: 'TestBucket',
    Key: filename,
    Body: file
  };
  return s3.upload(params).promise()
    .then(function (url) {
      return url.Location;
    });
}
```
To use the AWS mocking library, we first install it from the command line: `pnpm install --save-dev aws-sdk-mock`. After the library finishes installing, we can start creating our test file.

At the top of the test file, we "require" the mocking library and assign it to a const variable:

```javascript
const AWS = require('aws-sdk-mock');
```

This grants access to the mocking library we just installed. We then set up the skeleton of the Mocha test file, including the `describe()`/`it()` functions, the Chai `expect` assertion, and the remaining const variables.
```javascript
const AWS = require('aws-sdk-mock');
const sampleTestData = require('./sampleTestData.js');
// (Any dummy info you want to pass in goes here)

const uploadS3 = require('../lib/uploadS3.js');
// The upload file from above

const buffer = 'TestBuffer';
const expect = require('chai').expect;

describe('aws-sdk-mock testing', function() {
  it('should give a successful output of an S3 upload', function() {
    let goodApple = sampleTestData;
    return uploadS3('resume.pdf', buffer, goodApple).then(function (url) {
      expect(url).to.be.a('string');
    });
  });
});
```
The problem with this test file is that it needs the AWS SDK S3 class's `upload()` function to complete, but because it is a test, we don't actually want to call this function: we aren't really uploading anything. So how do we make that happen? Now that we have the AWS SDK mocking library, we can use an `AWS.mock()` call within our `describe()` function to mock the actions of a real S3 upload.

Within the `describe()` function, but outside of the `it()` function, we need an `AWS.mock()` call that runs every time we run an `it()` function, so we can simulate the S3 event. In this example we only run one `it()` function, but in the future we may want to run several. To run the mocking function every time, we can nest the `AWS.mock()` inside a `beforeEach()` function. The `beforeEach()` function and its equivalents (`afterEach()`, `before()`, and `after()`) are called "hooks". `beforeEach()` executes a set of commands before each `it()` function runs. As the names imply, `afterEach()` runs a series of commands after every `it()` function, while `before()` and `after()` run only once, before and after all of the `it()` functions, respectively.

Below is a typical framework for hooks:
```javascript
describe('description', function() {
  beforeEach(function () {
    //...
  });

  afterEach(function () {
    //...
  });

  it('blah blah blah', function() {
    //...
  });
});
```
For more information on hooks, please see [here](https://medium.com/@kanyang/hooks-in-mocha-87cb43baa91c).

In this example, one of the commands the hooks run will be an `AWS.mock()`. This function takes several parameters: which AWS platform to mock (e.g. DynamoDB, S3, SNS, etc.), which action to take in that platform (e.g. `upload`, `putItem`, `publish`, etc.), and a function that defines additional details for use by the mock. A typical `AWS.mock()` call designed for an S3 upload looks like this:

```javascript
AWS.mock('S3', 'upload', function (params, callback) {
  // additional details here
});
```

Each `AWS.mock()` can only run once before it needs to be reset. To reset it, we use an `afterEach()` hook that restores the mock library after every `it()` function executes, before the next one runs. For this we use `AWS.restore()`. There are two ways to restore a mock with this function: target the platform and action we just used, like `AWS.restore('S3', 'upload');`, or restore everything with `AWS.restore();`. The restore-all option restores every `AWS.mock()` you have set up so far, so be careful when using it.

For more information on setting up and using `aws-sdk-mock`, look [here](https://github.com/dwyl/aws-sdk-mock).

If you put all of this together, it could look something like this:
```javascript
const AWS = require('aws-sdk-mock');
const sampleTestData = require('./sampleTestData.js');
const uploadS3 = require('../lib/uploadS3.js');
const expect = require('chai').expect;
const buffer = 'TestBuffer';

describe('aws-sdk mock testing', function() {
  beforeEach(function () {
    AWS.mock('S3', 'upload', function (params, callback) {
      expect(params).to.be.an('Object');
      expect(params).to.have.property('Bucket', 'TestBucket');
      expect(params).to.have.property('Key');
      expect(params).to.have.property('Body', 'TestBuffer');

      callback(null, {
        ETag: 'SomeETag',
        Location: 'PublicWebsiteLink',
        Key: 'RandomKey',
        Bucket: 'TestBucket'
      });
    });
  });

  afterEach(function () {
    AWS.restore('S3', 'upload');
  });

  it('should give a successful output of an S3 upload', function() {
    let goodApple = sampleTestData;
    return uploadS3('resume.pdf', buffer, goodApple).then(function (url) {
      expect(url).to.be.a('string');
    });
  });
});
```
All that is left to do is run our test. For this example, we can use a simple `mocha test` in the command line, but with other project setups you might need to run `pnpm run test`.
It cannot be said too often that mock testing differs from regular unit testing. In regular unit testing, we account for everything that can go wrong in a process. In mock testing, we should always assume the input we receive is correct. Mock testing should be reserved for testing the SDK's expected behavior and nothing else; any pitfalls that occur before the mock runs should not be part of the mock's test. This matters because, in some cases, wrong information can be passed into a mock test and still result in a success. Write any possible vanilla unit tests first to avoid incorrect data slipping through a mock test.
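As an illustration of that division of labor, a hypothetical `validateUploadParams()` helper could be covered by vanilla unit tests first, so bad input is caught before it ever reaches a mock (the function and its rules below are made up for the sketch):

```javascript
// Hypothetical input validator, covered by plain unit tests before any mocking.
function validateUploadParams(params) {
  if (!params || typeof params !== 'object') return false;
  if (typeof params.Bucket !== 'string' || params.Bucket.length === 0) return false;
  if (typeof params.Key !== 'string' || params.Key.length === 0) return false;
  return Boolean(params.Body);
}

// Vanilla unit tests catch bad input here, so the mock test can safely
// assume the params it receives are correct.
console.log(validateUploadParams({ Bucket: 'TestBucket', Key: 'resume.pdf', Body: 'bytes' })); // true
console.log(validateUploadParams({ Key: 'resume.pdf' })); // false
```

With validation tested separately, the `aws-sdk-mock` test only has to exercise the SDK interaction itself.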
# How to Pass a File Type through API Gateway

API Gateway now allows the passing of binary media types (i.e. files). In this document, we will show you how to pass through a PDF file.

First, create an API in API Gateway by clicking the "Create API" button in the top left of the screen and following the instructions to name and create the API.
![alt text](images/image1.png)

After that, under the API name on the left side of the screen, click on "Settings".

![alt text](images/image2.png)

From there, add all files to the accepted binary media types by adding `*/*` (we are allowing the API to accept all file types; validation can happen elsewhere, such as in the Lambda function itself).

![alt text](images/image3.png)

Next, we need to link our Lambda function to an API Gateway resource. Create a resource by going to the Resources section under the API name on the left side of the screen, clicking the "Actions" drop-down menu, and selecting "Create Resource."

![alt text](images/image4.png)

Name your new resource and click "Create Resource". Next, select the newly created resource, click the "Actions" drop-down menu again, and click "Create Method."

![alt text](images/image5.png)

A drop-down field will appear below the resource. Click on the field and select "POST". A "POST" section will appear below the resource.

![alt text](images/image6.png)

Clicking on the "POST" section brings up the setup page on the right. Under "Integration type", select "Lambda Function" and click the checkbox for "Use Lambda Proxy integration". Under "Lambda Region", select your AWS region (for MK it will be us-west-2). Once that is entered, another field will appear asking for the Lambda function you will connect to the resource. If you have already made a function, enter its name and click "Save".

![alt text](images/image7.png)

Now, let's take a look at the Lambda function we are linking to our API method.
```javascript
module.exports.handler = function(event, context, callback) {

  const response = {
    statusCode: 200,
    body: ''
  };

  callback(null, response);
};
```
This is a simple function for binary events. All binary event functions should have at least these two keys in the response:

```javascript
{
  statusCode: ###,
  body: ''
}
```
There are two other keys that may be required in the response for more complex requests:

```javascript
headers: {},
isBase64Encoded: true/false
```
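As a hedged sketch of how those extra keys come together (the content type and sample bytes are made up, and this assumes the `*/*` binary media type configured earlier), a handler returning a binary PDF body might build its response like this:

```javascript
// Sketch: returning binary data through a Lambda proxy integration.
// When isBase64Encoded is true and the media type is registered as binary,
// API Gateway decodes the body back into raw bytes for the client.
function buildPdfResponse(pdfBuffer) {
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/pdf' },
    body: pdfBuffer.toString('base64'),
    isBase64Encoded: true
  };
}

const response = buildPdfResponse(Buffer.from('%PDF-1.4 example'));
console.log(response.isBase64Encoded); // true
```

The same shape works in reverse on the way in: the `event.body` your handler receives for a binary POST is base64-encoded, as we will see with Busboy below.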
To check that this works with a real PDF, download and open Postman. While you are doing that, grab the URL of the method in API Gateway. First, deploy the API by going to the Resources section, clicking the "Actions" drop-down button, and selecting "Deploy API."

![alt text](images/image8.png)

Select the deployment stage. If there is none, create a new deployment stage and name it "prod."

After the API has been deployed, go to the Stages section under the API name on the left side of the screen, click on your new deployment, click on your resource, and click on the POST method.

![alt text](images/image9.png)

From there, grab the URL at the top of the new window.

![alt text](images/image10.png)

Back in Postman, change the request to POST and input the URL you just grabbed from API Gateway. Go to the Body tab, select the form-data input, change the key to a file type, and browse to choose your PDF test file. If you are not sending the PDF through a form, you can simply use the binary input option.

![alt text](images/image11.png)

Press the "Send" button in the top right. The response field should be empty; any errors will show up here. Some common errors found while testing were 403 (Missing Authentication Token), which usually means your URL is incorrect, and 502 (Internal Server Error), which usually means your Lambda function is incorrect.
To check whether the PDF is actually being sent through, we can check the CloudWatch logs. In the CloudWatch service, click on the Logs section on the left side of the screen and choose the Lambda function you have been using.

![alt text](images/image12.png)

After that, click on the latest log stream, which will be the first log at the top of the list.

![alt text](images/image13.png)

In this log you will see all of the events that occurred while running this Lambda function. In one of the events you can see the PDF file being passed through.

![alt text](images/image14.png)

If you have not made an S3 bucket yet, make one now. You will have to change the permissions to make the bucket public. To do this, click on your bucket in the S3 service, click on the Permissions tab, click on the Bucket Policy section, and copy the following code into the editor:
(Please refer to the document "[AWS - Using Lambda to transfer file between buckets](https://docs.google.com/document/d/1awH_pgaMtY9g7kJUtBUNPY3ZCCmZJcdO--Z7Fr5lZ-o/edit)" for a more detailed explanation of how to set up buckets, if needed.)

![alt text](images/image15.png)
```json
{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::<bucket name goes here>/*"
    }
  ]
}
```
Now that we know the PDF is successfully being passed through API Gateway, and we have our S3 bucket set up, we need to parse the file data back into a usable form so we can send it to S3 from the Lambda function. For this we will use a Node.js package called Busboy. We need to update our Lambda function as follows:
```javascript
const AWS = require('aws-sdk');
const s3 = new AWS.S3();
const busboy = require('busboy');

exports.handler = function(event, context, callback) {

  const response = {
    statusCode: 200,
    body: ''
  };

  // Parse the multipart form data using the request headers.
  let bb = new busboy({ headers: event.headers });
  console.log(bb);

  bb.on('file', function (fieldname, file, filename, encoding, mimetype) {
    console.log('File [%s]: filename=%j; encoding=%j; mimetype=%j', fieldname, filename, encoding, mimetype);

    // Busboy gives us the file as a stream, which s3.upload accepts directly.
    s3.upload({
      Bucket: '<bucket name goes here>',
      Key: filename,
      Body: file
    }).promise().then(function() {
      callback(null, response);
    });
  })
  .on('field', (fieldname, val) => {
    console.log('Field [%s]: value: %j', fieldname, val);
  })
  .on('finish', () => {
    console.log('Done parsing form');
    callback(null, response);
  })
  .on('error', err => {
    console.log('failed', err);
    callback(err);
  });

  // API Gateway delivers the binary body base64-encoded.
  bb.end(Buffer.from(event.body, 'base64'));
  console.log('event.body', event.body);
};
```
This code uses Busboy to parse the incoming file. `.on()` registers an event listener, which recognizes when a file is present and executes certain commands. Busboy outputs a stream that we can upload to S3 using the `s3.upload()` call seen above. By the time Busboy finishes, the file will be in the specified S3 bucket.
If the Lambda function is not set up properly, you may see a PDF saved into the S3 bucket that is blank when opened. With a little refactoring, the PDF will show.
Now we must grab the S3 URL of the newly imported file so we can reference it later in a database. We can build the URL from the basic AWS S3 web link and the information we already have. It will look something like this:

```javascript
let url = 'https://s3-us-west-2.amazonaws.com/' + s3Bucket + '/' + s3Key;
```
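One caveat with plain concatenation: it breaks if the S3 key contains spaces or other characters that are not URL-safe. A small sketch (the region, bucket, and filename are examples) that encodes the key:

```javascript
// Building the public S3 URL; encoding the key guards against spaces
// or other characters that are not URL-safe in the object name.
function buildS3Url(region, bucket, key) {
  return 'https://s3-' + region + '.amazonaws.com/' + bucket + '/' +
    encodeURIComponent(key);
}

console.log(buildS3Url('us-west-2', 'my-bucket', 'john doe resume.pdf'));
// https://s3-us-west-2.amazonaws.com/my-bucket/john%20doe%20resume.pdf
```

Note that `encodeURIComponent` also encodes `/`, so this sketch assumes a flat key with no folder prefix.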
We have to update the Lambda function slightly to build this URL: add in the few missing variables used in the URL above.

If you are submitting the information through a form and have other non-binary data that needs to be parsed out, we can do this by updating the "field" portion of the Busboy function. One way is to define `jsonObject` and `jsonComplete` as empty objects outside of the Busboy function, then change the "field" portion like so:
```javascript
// ...
.on('field', function (fieldname, val) {
  jsonObject[fieldname] = val;
  let url = 'https://s3-us-west-2.amazonaws.com/' + s3Bucket + '/' + s3Key;
  jsonObject.resume = url;
  jsonComplete = {
    statusCode: 200,
    body: JSON.stringify(jsonObject)
  };
  console.log('JSON', jsonComplete);
})
// ...
```
This handler runs every time Busboy emits a field event. The `jsonObject[fieldname] = val;` line creates a new key on the JSON object for each field, so every key gets the correct name and, therefore, the correct value.

The next two lines build the S3 URL of the uploaded PDF and add that reference to the JSON object, so you can store it alongside the rest of the information.

The `jsonComplete` object gives API Gateway an OK response and stringifies the newly completed JSON. After this, add a line in the `.on('finish')` section to `callback(null, jsonComplete)` to get the desired output.
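The accumulation logic above can be exercised without Busboy by simulating the field events. `assembleResponse` below is a hypothetical helper that mirrors the handler's per-field assignment (the field names and URL are examples):

```javascript
// Simulating the 'field' handler: each (fieldname, val) pair becomes a key
// on jsonObject, then the S3 URL and the final response are attached.
function assembleResponse(fields, url) {
  const jsonObject = {};
  for (const [fieldname, val] of fields) {
    jsonObject[fieldname] = val; // same assignment the handler performs per field
  }
  jsonObject.resume = url;
  return { statusCode: 200, body: JSON.stringify(jsonObject) };
}

const jsonComplete = assembleResponse(
  [['name', 'Jane'], ['email', 'jane@example.com']],
  'https://s3-us-west-2.amazonaws.com/my-bucket/resume.pdf'
);
console.log(jsonComplete.statusCode); // 200
```

The stringified body is what API Gateway hands back to the caller, so the form fields and the S3 reference travel together in one response.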
