diff --git a/ColorChange.png b/ColorChange.png new file mode 100644 index 0000000..f746ec2 Binary files /dev/null and b/ColorChange.png differ diff --git a/Noise.png b/Noise.png new file mode 100644 index 0000000..ad15d55 Binary files /dev/null and b/Noise.png differ diff --git a/NoiseTransform.png b/NoiseTransform.png new file mode 100644 index 0000000..7ad851a Binary files /dev/null and b/NoiseTransform.png differ diff --git a/README.md b/README.md index c636328..08758b6 100644 --- a/README.md +++ b/README.md @@ -1,77 +1,22 @@ # HW 0: Noisy Planet Part 1 (Intro to Javascript and WebGL) -

- -

-

(source: Ken Perlin)

+## Live Demo: https://eddieh80.github.io/hw00-webgl-intro/ -## Objective -- Check that the tools and build configuration we will be using for the class works. -- Start learning Typescript and WebGL2 -- Practice implementing noise - -## Forking the Code -Rather than cloning the homework repository, please __fork__ the code into your own repository using the `Fork` button in the upper-right hand corner of the Github UI. This will enable you to have your own personal repository copy of the code, and let you make a live demo (described later in this document). - -## Running the Code - -1. [Install Node.js](https://nodejs.org/en/download/). Node.js is a JavaScript runtime. It basically allows you to run JavaScript when not in a browser. For our purposes, this is not necessary. The important part is that with it comes `npm`, the Node Package Manager. This allows us to easily declare and install external dependencies such as [dat.GUI](https://workshop.chromeexperiments.com/examples/gui/#1--Basic-Usage), and [glMatrix](http://glmatrix.net/). - -2. Using a command terminal, run `npm install` in the root directory of your project. This will download all of those dependencies. - -3. Do either of the following (but we highly recommend the first one for reasons we will explain later). - - a. Run `npm start` and then go to `localhost:5660` in your web browser - - b. Run `npm run build` and then go open `dist/index.html` in your web browser +## Project Description -## Module Bundling -One of the most important dependencies of our projects is [Webpack](https://webpack.js.org/concepts/). Webpack is a module bundler which allows us to write code in separate files and use `import`s and `export`s to load classes and functions for other files. It also allows us to preprocess code before compiling to a single file. We will be using [Typescript](https://www.typescriptlang.org/docs/home.html) for this course which is Javascript augmented with type annotations. 
Webpack will convert Typescript files to Javascript files on compilation and in doing so will also check for proper type-safety and usage. Read more about Javascript modules in the resources section below. +Initially the scene shows only a cube with lambert vertex and fragment shading. The code to render a sphere or square is currently commented out; commenting/uncommenting elements in the geometry array passed to the render function changes what geometry is rendered on screen. Different shader combinations can be selected with the Shader slider, where 1 = lambert vertex + lambert fragment, 2 = deform vertex + lambert fragment, 3 = lambert vertex + noise fragment, and 4 = deform vertex + noise fragment. A different color for the lambert shading can be picked using the color wheel. For the custom noise fragment shader, I plugged Perlin noise into an orange-blue cosine color palette, using the fragment position as the input to the noise function. For the deformation vertex shader, I used a uniform time variable that is incremented every time tick() is called, and used it to interpolate between the default position of the geometry and an expanded version of it, modifying each dimension with either sin or cos and varying the timing with sin and cos as well. -## Developing Your Code -All of the JavaScript code is living inside the `src` directory. The main file that gets executed when you load the page as you may have guessed is `main.ts`. Here, you can make any changes you want, import functions from other files, etc. The reason that we highly suggest you build your project with `npm start` is that doing so will start a process that watches for any changes you make to your code. If it detects anything, it'll automagically rebuild your project and then refresh your browser window for you. Wow. That's cool.
If you do it the other way, you'll need to run `npm build` and then refresh your page every time you want to test something. +![](Regular.png) -We would suggest editing your project with Visual Studio Code https://code.visualstudio.com/. Microsoft develops it and Microsoft also develops Typescript so all of the features work nicely together. Sublime Text and installing the Typescript plugins should probably work as well. +![](ColorChange.png) -## Assignment Details -1. Take some time to go through the existing codebase so you can get an understanding of syntax and how the code is architected. Much of the code is designed to mirror the class structures used in CIS 460's OpenGL assignments, so it should hopefully be somewhat familiar. -2. Take a look at the resources linked in the section below. Definitely read about Javascript modules and Typescript. The other links provide documentation for classes used in the code. -3. Add a `Cube` class that inherits from `Drawable` and at the very least implement a constructor and its `create` function. Then, add a `Cube` instance to the scene to be rendered. -4. Read the documentation for dat.GUI below. Update the existing GUI in `main.ts` with a parameter to alter the color passed to `u_Color` in the Lambert shader. -5. Write a custom fragment shader that implements FBM, Worley Noise, or Perlin Noise based on 3D inputs (as opposed to the 2D inputs in the slides). This noise must be used to modify your fragment color. If your custom shader is particularly interesting, you'll earn some bonus points. -6. Write a custom vertex shader that uses a trigonometric function (e.g. `sin`, `tan`) to non-uniformly modify your cube's vertex positions over time. This will necessitate instantiating an incrementing variable in your Typescript code that you pass to your shader every tick. Refer to the base code's methods of passing variables to shaders if you are unsure how to do so. -7. 
Feel free to update any of the files when writing your code. The implementation of the `OpenGLRenderer` is currently very simple. +![](Transform.png) -## Making a Live Demo -When you push changes to the `master` branch of your repository on Github, a Github workflow will run automatically which builds your code and pushes the build to a new branch `gh-pages`. The configuration file which handles this is located at `.github/workflows/build-and-deploy.yml`. If you want to modify this, you can read more about workflows [here](https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions). +![](Noise.png) -Once your built code is pushed to `gh-pages`, Github can automatically publish a live site. Configure that by: +![](NoiseTransform.png) - 1. Open the Settings tab of your repository in Github. - - 2. Scroll down to the Pages tab of the Settings (in the table on the left) and choose which branch to make the source for the deployed project. This should be the `gh-pages` branch which is automatically created after the first successful build of the `master` branch. - - 3. Done! Now, any new commits on the `master` branch will be built and pushed to `gh-pages`. The project should be visible at http://username.github.io/repo-name. -  - -To check if everything is on the right track: - -1. Make sure the `gh-pages` branch of your repo has a files called `index.html`, `bundle.js`, and `bundle.js.map` - -2. In the settings tab of the repo, under Pages, make sure it says your site is published at some url. - -## Submission -1. Create a pull request to this repository with your completed code. -2. Update README.md to contain a solid description of your project with a screenshot of some visuals, and a link to your live demo. -3. Submit the link to your pull request on Canvas, and add a comment to your submission with a hyperlink to your live demo. -4. Include a link to your live site. 
- -## Resources -- Javascript modules https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/import -- Typescript https://www.typescriptlang.org/docs/home.html -- dat.gui https://workshop.chromeexperiments.com/examples/gui/ -- glMatrix http://glmatrix.net/docs/ -- WebGL - - Interfaces https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API - - Types https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/Types - - Constants https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/Constants +## Objective +- Check that the tools and build configuration we will be using for the class works. +- Start learning Typescript and WebGL2 +- Practice implementing noise diff --git a/Regular.png b/Regular.png new file mode 100644 index 0000000..98278df Binary files /dev/null and b/Regular.png differ diff --git a/Transform.png b/Transform.png new file mode 100644 index 0000000..7d93e74 Binary files /dev/null and b/Transform.png differ diff --git a/src/.DS_Store b/src/.DS_Store new file mode 100644 index 0000000..94249de Binary files /dev/null and b/src/.DS_Store differ diff --git a/src/geometry/Cube.ts b/src/geometry/Cube.ts new file mode 100644 index 0000000..5016d95 --- /dev/null +++ b/src/geometry/Cube.ts @@ -0,0 +1,101 @@ +import {vec3, vec4} from 'gl-matrix'; +import Drawable from '../rendering/gl/Drawable'; +import {gl} from '../globals'; + +class Cube extends Drawable { + buffer: ArrayBuffer; + indices: Uint32Array; + positions: Float32Array; + normals: Float32Array; + center: vec4; + + constructor(center: vec3) { + super(); // Call the constructor of the super class. This is required. 
+ this.center = vec4.fromValues(center[0], center[1], center[2], 1); + } + + create() { + + this.indices = new Uint32Array([ + 2, 5, 11, + 2, 8, 11, + 0, 6, 18, + 0, 12, 18, + 1, 13, 16, + 1, 4, 16, + 3, 15, 21, + 3, 9, 21, + 7, 19, 22, + 7, 10, 22, + 14, 17, 23, + 14, 20, 23]); + this.normals = new Float32Array([ + -1, 0, 0, 0, + 0, 1, 0, 0, + 0, 0, 1, 0, + 1, 0, 0, 0, + 0, 1, 0, 0, + 0, 0, 1, 0, + -1, 0, 0, 0, + 0, -1, 0, 0, + 0, 0, 1, 0, + 1, 0, 0, 0, + 0, -1, 0, 0, + 0, 0, 1, 0, + -1, 0, 0, 0, + 0, 1, 0, 0, + 0, 0, -1, 0, + 1, 0, 0, 0, + 0, 1, 0, 0, + 0, 0, -1, 0, + -1, 0, 0, 0, + 0, -1, 0, 0, + 0, 0, -1, 0, + 1, 0, 0, 0, + 0, -1, 0, 0, + 0, 0, -1, 0]); + this.positions = new Float32Array([ + -1, 1, 1, 1, + -1, 1, 1, 1, + -1, 1, 1, 1, + 1, 1, 1, 1, + 1, 1, 1, 1, + 1, 1, 1, 1, + -1, -1, 1, 1, + -1, -1, 1, 1, + -1, -1, 1, 1, + 1, -1, 1, 1, + 1, -1, 1, 1, + 1, -1, 1, 1, + -1, 1, -1, 1, + -1, 1, -1, 1, + -1, 1, -1, 1, + 1, 1, -1, 1, + 1, 1, -1, 1, + 1, 1, -1, 1, + -1, -1, -1, 1, + -1, -1, -1, 1, + -1, -1, -1, 1, + 1, -1, -1, 1, + 1, -1, -1, 1, + 1, -1, -1, 1]); + + this.generateIdx(); + this.generatePos(); + this.generateNor(); + + this.count = this.indices.length; + gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.bufIdx); + gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, this.indices, gl.STATIC_DRAW); + + gl.bindBuffer(gl.ARRAY_BUFFER, this.bufNor); + gl.bufferData(gl.ARRAY_BUFFER, this.normals, gl.STATIC_DRAW); + + gl.bindBuffer(gl.ARRAY_BUFFER, this.bufPos); + gl.bufferData(gl.ARRAY_BUFFER, this.positions, gl.STATIC_DRAW); + + console.log(`Created cube`); + } +}; + +export default Cube; \ No newline at end of file diff --git a/src/main.ts b/src/main.ts index 65a9461..392f0ea 100644 --- a/src/main.ts +++ b/src/main.ts @@ -1,8 +1,9 @@ -import {vec3} from 'gl-matrix'; +import {vec3, vec4} from 'gl-matrix'; const Stats = require('stats-js'); import * as DAT from 'dat.gui'; import Icosphere from './geometry/Icosphere'; import Square from './geometry/Square'; +import Cube from 
'./geometry/Cube'; import OpenGLRenderer from './rendering/gl/OpenGLRenderer'; import Camera from './Camera'; import {setGL} from './globals'; @@ -13,17 +14,25 @@ import ShaderProgram, {Shader} from './rendering/gl/ShaderProgram'; const controls = { tesselations: 5, 'Load Scene': loadScene, // A function pointer, essentially + Color: [255, 0, 0], + Shader: 1, }; let icosphere: Icosphere; let square: Square; +let cube: Cube; let prevTesselations: number = 5; +let time: number = 0; +let prevShader: number = 1; +let currShader: ShaderProgram; function loadScene() { icosphere = new Icosphere(vec3.fromValues(0, 0, 0), 1, controls.tesselations); icosphere.create(); square = new Square(vec3.fromValues(0, 0, 0)); square.create(); + cube = new Cube(vec3.fromValues(0, 0, 0)); + cube.create(); } function main() { @@ -39,6 +48,8 @@ function main() { const gui = new DAT.GUI(); gui.add(controls, 'tesselations', 0, 8).step(1); gui.add(controls, 'Load Scene'); + gui.addColor(controls, 'Color').onChange(updateColor); + gui.add(controls, 'Shader', 0, 4).step(1); // get canvas and webgl context const canvas = document.getElementById('canvas'); @@ -57,13 +68,38 @@ function main() { const renderer = new OpenGLRenderer(canvas); renderer.setClearColor(0.2, 0.2, 0.2, 1); - gl.enable(gl.DEPTH_TEST); + gl.enable(gl.DEPTH_TEST); const lambert = new ShaderProgram([ new Shader(gl.VERTEX_SHADER, require('./shaders/lambert-vert.glsl')), new Shader(gl.FRAGMENT_SHADER, require('./shaders/lambert-frag.glsl')), ]); + const lambertDeform = new ShaderProgram([ + new Shader(gl.VERTEX_SHADER, require('./shaders/deform-vert.glsl')), + new Shader(gl.FRAGMENT_SHADER, require('./shaders/lambert-frag.glsl')), + ]); + + const noise = new ShaderProgram([ + new Shader(gl.VERTEX_SHADER, require('./shaders/lambert-vert.glsl')), + new Shader(gl.FRAGMENT_SHADER, require('./shaders/noise-frag.glsl')), + ]); + + const noiseDeform = new ShaderProgram([ + new Shader(gl.VERTEX_SHADER, 
require('./shaders/deform-vert.glsl')), + new Shader(gl.FRAGMENT_SHADER, require('./shaders/noise-frag.glsl')), + ]); + + currShader = lambert; + + function updateColor() { + let col = vec4.fromValues(controls.Color[0] / 255, + controls.Color[1] / 255, + controls.Color[2] / 255, 1); + renderer.render(camera, lambert, [cube], col, time); + } + + // This function will be called every frame function tick() { camera.update(); @@ -75,12 +111,35 @@ function main() { prevTesselations = controls.tesselations; icosphere = new Icosphere(vec3.fromValues(0, 0, 0), 1, prevTesselations); icosphere.create(); + square = new Square(vec3.fromValues(0, 0, 0)); + square.create(); + cube = new Cube(vec3.fromValues(0, 0, 0)); + cube.create(); } - renderer.render(camera, lambert, [ - icosphere, - // square, - ]); - stats.end(); + + if (controls.Shader != prevShader) { + prevShader = controls.Shader; + if (controls.Shader == 1) { + currShader = lambert; + } else if (controls.Shader == 2) { + currShader = lambertDeform; + } else if (controls.Shader == 3) { + currShader = noise; + } else if (controls.Shader == 4) { + currShader = noiseDeform; + } + } + + let col = vec4.fromValues(controls.Color[0] / 255, + controls.Color[1] / 255, + controls.Color[2] / 255, 1); + renderer.render(camera, currShader, [ + //icosphere, + //square, + cube + ], col, time); + stats.end(); + time++; // Tell the browser to call `tick` again whenever it renders a new frame requestAnimationFrame(tick); diff --git a/src/rendering/gl/OpenGLRenderer.ts b/src/rendering/gl/OpenGLRenderer.ts index 7e527c2..a6559e8 100644 --- a/src/rendering/gl/OpenGLRenderer.ts +++ b/src/rendering/gl/OpenGLRenderer.ts @@ -22,16 +22,16 @@ class OpenGLRenderer { gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT); } - render(camera: Camera, prog: ShaderProgram, drawables: Array) { + render(camera: Camera, prog: ShaderProgram, drawables: Array, color: vec4, t: number) { let model = mat4.create(); let viewProj = mat4.create(); - let color = 
vec4.fromValues(1, 0, 0, 1); mat4.identity(model); mat4.multiply(viewProj, camera.projectionMatrix, camera.viewMatrix); prog.setModelMatrix(model); prog.setViewProjMatrix(viewProj); prog.setGeometryColor(color); + prog.setTime(t); for (let drawable of drawables) { prog.draw(drawable); diff --git a/src/rendering/gl/ShaderProgram.ts b/src/rendering/gl/ShaderProgram.ts index 67fef40..1ad8a0c 100644 --- a/src/rendering/gl/ShaderProgram.ts +++ b/src/rendering/gl/ShaderProgram.ts @@ -29,6 +29,7 @@ class ShaderProgram { unifModelInvTr: WebGLUniformLocation; unifViewProj: WebGLUniformLocation; unifColor: WebGLUniformLocation; + unifTime: WebGLUniformLocation; constructor(shaders: Array) { this.prog = gl.createProgram(); @@ -47,7 +48,8 @@ class ShaderProgram { this.unifModel = gl.getUniformLocation(this.prog, "u_Model"); this.unifModelInvTr = gl.getUniformLocation(this.prog, "u_ModelInvTr"); this.unifViewProj = gl.getUniformLocation(this.prog, "u_ViewProj"); - this.unifColor = gl.getUniformLocation(this.prog, "u_Color"); + this.unifColor = gl.getUniformLocation(this.prog, "u_Color"); + this.unifTime = gl.getUniformLocation(this.prog, "u_Time"); } use() { @@ -78,6 +80,13 @@ class ShaderProgram { } } + setTime(t: number) { + this.use(); + if (this.unifTime !== -1) { + gl.uniform1i(this.unifTime, t); + } + } + setGeometryColor(color: vec4) { this.use(); if (this.unifColor !== -1) { diff --git a/src/shaders/deform-vert.glsl b/src/shaders/deform-vert.glsl new file mode 100644 index 0000000..93065cf --- /dev/null +++ b/src/shaders/deform-vert.glsl @@ -0,0 +1,69 @@ +#version 300 es + +//This is a vertex shader. While it is called a "shader" due to outdated conventions, this file +//is used to apply matrix transformations to the arrays of vertex data passed to it. +//Since this code is run on your GPU, each vertex is transformed simultaneously. +//If it were run on your CPU, each vertex would have to be processed in a FOR loop, one at a time. 
+//This simultaneous transformation allows your program to run much faster, especially when rendering +//geometry with millions of vertices. + +uniform mat4 u_Model; // The matrix that defines the transformation of the + // object we're rendering. In this assignment, + // this will be the result of traversing your scene graph. + +uniform mat4 u_ModelInvTr; // The inverse transpose of the model matrix. + // This allows us to transform the object's normals properly + // if the object has been non-uniformly scaled. + +uniform mat4 u_ViewProj; // The matrix that defines the camera's transformation. + // We've written a static matrix for you to use for HW2, + // but in HW3 you'll have to generate one yourself + +uniform highp int u_Time; + +in vec4 vs_Pos; // The array of vertex positions passed to the shader + +in vec4 vs_Nor; // The array of vertex normals passed to the shader + +in vec4 vs_Col; // The array of vertex colors passed to the shader. + +out vec4 fs_Nor; // The array of normals that has been transformed by u_ModelInvTr. This is implicitly passed to the fragment shader. +out vec4 fs_LightVec; // The direction in which our virtual light lies, relative to each vertex. This is implicitly passed to the fragment shader. +out vec4 fs_Col; // The color of each vertex. This is implicitly passed to the fragment shader. +out vec4 fs_Pos; + +const vec4 lightPos = vec4(5, 5, 3, 1); //The position of our virtual light, which is used to compute the shading of + //the geometry in the fragment shader. + +void main() +{ + fs_Col = vs_Col; // Pass the vertex colors to the fragment shader for interpolation + + mat3 invTranspose = mat3(u_ModelInvTr); + fs_Nor = vec4(invTranspose * vec3(vs_Nor), 0); // Pass the vertex normals to the fragment shader for interpolation. + // Transform the geometry's normals by the inverse transpose of the + // model matrix. 
This is necessary to ensure the normals remain + // perpendicular to the surface after the surface is transformed by + // the model matrix. + + vec4 modelposition = u_Model * vs_Pos; // Temporarily store the transformed vertex positions for use below + + fs_LightVec = lightPos - modelposition; // Compute the direction in which the light source lies + + // Calculate default and outward positions + vec3 defaultPos = vec3(modelposition); + vec3 outPos = normalize(defaultPos) * 3.f * sin(float(u_Time) / 150.f); + + // Time-changing function + float t = abs(cos(float(u_Time) / 80.f)); + + // Linear interpolation between default position and out position based on time + float x = mix(defaultPos[0] * cos(float(u_Time) / 10.f), outPos[0], t); + float y = mix(defaultPos[1] * sin(float(u_Time) / 50.f) / 20.f, outPos[1], t); + float z = mix(defaultPos[2] * sin(float(u_Time) / 30.f), outPos[2], t); + vec4 newPos = vec4(x, y, z, 1); + + gl_Position = u_ViewProj * newPos;// gl_Position is a built-in variable of OpenGL which is + // used to render the final positions of the geometry's vertices + fs_Pos = gl_Position; +} diff --git a/src/shaders/lambert-frag.glsl b/src/shaders/lambert-frag.glsl index 2b8e11b..73d95ba 100644 --- a/src/shaders/lambert-frag.glsl +++ b/src/shaders/lambert-frag.glsl @@ -30,7 +30,7 @@ void main() // Calculate the diffuse term for Lambert shading float diffuseTerm = dot(normalize(fs_Nor), normalize(fs_LightVec)); // Avoid negative lighting values - // diffuseTerm = clamp(diffuseTerm, 0, 1); + diffuseTerm = clamp(diffuseTerm, 0.f, 1.f); float ambientTerm = 0.2; @@ -40,4 +40,4 @@ void main() // Compute final shaded color out_Col = vec4(diffuseColor.rgb * lightIntensity, diffuseColor.a); -} +} \ No newline at end of file diff --git a/src/shaders/lambert-vert.glsl b/src/shaders/lambert-vert.glsl index 7f95a37..bc173d3 100644 --- a/src/shaders/lambert-vert.glsl +++ b/src/shaders/lambert-vert.glsl @@ -28,6 +28,7 @@ in vec4 vs_Col; // The array of vertex colors 
passed to the shader. out vec4 fs_Nor; // The array of normals that has been transformed by u_ModelInvTr. This is implicitly passed to the fragment shader. out vec4 fs_LightVec; // The direction in which our virtual light lies, relative to each vertex. This is implicitly passed to the fragment shader. out vec4 fs_Col; // The color of each vertex. This is implicitly passed to the fragment shader. +out vec4 fs_Pos; const vec4 lightPos = vec4(5, 5, 3, 1); //The position of our virtual light, which is used to compute the shading of //the geometry in the fragment shader. @@ -50,4 +51,5 @@ void main() gl_Position = u_ViewProj * modelposition;// gl_Position is a built-in variable of OpenGL which is // used to render the final positions of the geometry's vertices + fs_Pos = gl_Position; } diff --git a/src/shaders/noise-frag.glsl b/src/shaders/noise-frag.glsl new file mode 100644 index 0000000..368b072 --- /dev/null +++ b/src/shaders/noise-frag.glsl @@ -0,0 +1,92 @@ +#version 300 es + +// This is a fragment shader. If you've opened this file first, please +// open and read lambert.vert.glsl before reading on. +// Unlike the vertex shader, the fragment shader actually does compute +// the shading of geometry. For every pixel in your program's output +// screen, the fragment shader is run for every bit of geometry that +// particular pixel overlaps. By implicitly interpolating the position +// data passed into the fragment shader by the vertex shader, the fragment shader +// can compute what color to apply to its pixel based on things like vertex +// position, light position, and vertex color. +precision highp float; + +uniform vec4 u_Color; // The color with which to render this instance of geometry. 
+ +uniform highp int u_Time; + +// These are the interpolated values out of the rasterizer, so you can't know +// their specific values without knowing the vertices that contributed to them +in vec4 fs_Nor; +in vec4 fs_LightVec; +in vec4 fs_Col; +in vec4 fs_Pos; + +out vec4 out_Col; // This is the final output color that you will see on your + // screen for the pixel that is currently being processed. + +// Cosine palette variables +const vec3 a = vec3(0.5, 0.5, 0.5); +const vec3 b = vec3(0.5, 0.5, 0.5); +const vec3 c = vec3(1.0, 1.0, 1.0); +const vec3 d = vec3(0.0, 0.1, 0.2); + +vec3 cosinePalette(float t) { + return a + b * cos(6.2831 * (c * t + d)); +} + +vec3 random3( vec3 p ) { + return fract(sin(vec3(dot(p,vec3(127.1, 311.7, 147.6)), + dot(p,vec3(269.5, 183.3, 221.7)), + dot(p, vec3(420.6, 631.2, 344.2)) + )) * 43758.5453); +} + +float surflet(vec3 p, vec3 gridPoint) { + // Compute the distance between p and the grid point along each axis, and warp it with a + // quintic function so we can smooth our cells + vec3 t2 = abs(p - gridPoint); + vec3 t = vec3(1.f) - 6.f * pow(t2, vec3(5.f)) + 15.f * pow(t2, vec3(4.f)) - 10.f * pow(t2, vec3(3.f)); + // Get the random gradient vector for the grid point (random3 + // returns a vec3 in the range [0, 1]) + vec3 gradient = random3(gridPoint) * 2. - vec3(1., 1., 1.); + // Get the vector from the grid point to P + vec3 diff = p - gridPoint; + // Get the value of our height field by dotting grid->P with our gradient + float height = dot(diff, gradient); + // Scale our height field (i.e.
reduce it) by our polynomial falloff function + return height * t.x * t.y * t.z; +} + +float perlinNoise3D(vec3 p) { + float surfletSum = 0.f; + // Iterate over the eight integer corners surrounding p + for(int dx = 0; dx <= 1; ++dx) { + for(int dy = 0; dy <= 1; ++dy) { + for(int dz = 0; dz <= 1; ++dz) { + surfletSum += surflet(p, floor(p) + vec3(dx, dy, dz)); + } + } + } + return surfletSum; +} + +void main() +{ + // Material base color (before shading) + vec4 diffuseColor = u_Color; + + // Calculate the diffuse term for Lambert shading + float diffuseTerm = dot(normalize(fs_Nor), normalize(fs_LightVec)); + // Avoid negative lighting values + diffuseTerm = clamp(diffuseTerm, 0.f, 1.f); + + float ambientTerm = 0.2; + + float lightIntensity = diffuseTerm + ambientTerm; //Add a small float value to the color multiplier + //to simulate ambient lighting. This ensures that faces that are not + //lit by our point light are not completely black. + // Compute final shaded color + float perlin = perlinNoise3D(fs_Pos.xyz * 2.5f); + out_Col = vec4(cosinePalette(perlin), diffuseColor.a); +}
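For reference, the cosine palette in `noise-frag.glsl` follows the `a + b * cos(2π(c*t + d))` formulation. A minimal TypeScript sketch (not part of the patch — just a CPU-side way to preview palette colors, using the same `a`, `b`, `c`, `d` constants as the shader) might look like:

```typescript
// CPU-side sketch of the cosine palette from noise-frag.glsl,
// useful for previewing colors without running the shader.
// Constants mirror the GLSL `a`, `b`, `c`, `d` values; here we use
// an exact 2*PI where the shader hard-codes 6.2831.
const a = [0.5, 0.5, 0.5];
const b = [0.5, 0.5, 0.5];
const c = [1.0, 1.0, 1.0];
const d = [0.0, 0.1, 0.2];

function cosinePalette(t: number): [number, number, number] {
  // a + b * cos(2*pi * (c*t + d)), applied per channel
  return [0, 1, 2].map(
    (i) => a[i] + b[i] * Math.cos(2 * Math.PI * (c[i] * t + d[i]))
  ) as [number, number, number];
}

// At t = 0 the palette yields a warm, red-heavy color;
// as t varies, the phase offsets in `d` shift it toward blue.
const rgb = cosinePalette(0);
console.log(rgb.map((v) => v.toFixed(3)).join(', '));
```

Because `a = b = 0.5`, every channel stays within [0, 1], so the palette output can be used directly as a fragment color.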