diff --git a/1.gif b/1.gif
new file mode 100644
index 0000000..d7a8236
Binary files /dev/null and b/1.gif differ
diff --git a/2.gif b/2.gif
new file mode 100644
index 0000000..444554d
Binary files /dev/null and b/2.gif differ
diff --git a/README.md b/README.md
index c636328..e85292c 100644
--- a/README.md
+++ b/README.md
@@ -1,77 +1,15 @@
# HW 0: Noisy Planet Part 1 (Intro to Javascript and WebGL)
-
-
-
-(source: Ken Perlin)
-
-## Objective
-- Check that the tools and build configuration we will be using for the class works.
-- Start learning Typescript and WebGL2
-- Practice implementing noise
-
-## Forking the Code
-Rather than cloning the homework repository, please __fork__ the code into your own repository using the `Fork` button in the upper-right hand corner of the Github UI. This will enable you to have your own personal repository copy of the code, and let you make a live demo (described later in this document).
-
-## Running the Code
-
-1. [Install Node.js](https://nodejs.org/en/download/). Node.js is a JavaScript runtime. It basically allows you to run JavaScript when not in a browser. For our purposes, this is not necessary. The important part is that with it comes `npm`, the Node Package Manager. This allows us to easily declare and install external dependencies such as [dat.GUI](https://workshop.chromeexperiments.com/examples/gui/#1--Basic-Usage), and [glMatrix](http://glmatrix.net/).
-
-2. Using a command terminal, run `npm install` in the root directory of your project. This will download all of those dependencies.
-
-3. Do either of the following (but we highly recommend the first one for reasons we will explain later).
-
- a. Run `npm start` and then go to `localhost:5660` in your web browser
-
- b. Run `npm run build` and then go open `dist/index.html` in your web browser
-
-## Module Bundling
-One of the most important dependencies of our projects is [Webpack](https://webpack.js.org/concepts/). Webpack is a module bundler which allows us to write code in separate files and use `import`s and `export`s to load classes and functions for other files. It also allows us to preprocess code before compiling to a single file. We will be using [Typescript](https://www.typescriptlang.org/docs/home.html) for this course which is Javascript augmented with type annotations. Webpack will convert Typescript files to Javascript files on compilation and in doing so will also check for proper type-safety and usage. Read more about Javascript modules in the resources section below.
-
-## Developing Your Code
-All of the JavaScript code is living inside the `src` directory. The main file that gets executed when you load the page as you may have guessed is `main.ts`. Here, you can make any changes you want, import functions from other files, etc. The reason that we highly suggest you build your project with `npm start` is that doing so will start a process that watches for any changes you make to your code. If it detects anything, it'll automagically rebuild your project and then refresh your browser window for you. Wow. That's cool. If you do it the other way, you'll need to run `npm build` and then refresh your page every time you want to test something.
+## Description
-We would suggest editing your project with Visual Studio Code https://code.visualstudio.com/. Microsoft develops it and Microsoft also develops Typescript so all of the features work nicely together. Sublime Text and installing the Typescript plugins should probably work as well.
+- Implement a custom fragment shader driven by FBM/simplex noise (see the sketch below)
+- Implement a custom vertex shader that deforms the vertices into animated terrain
+- Add a `Cube` class and an option in the controls
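+
+Both shaders layer several octaves of 3D simplex noise (FBM). The sketch below is only illustrative: `fbm` is a hypothetical helper name, `snoise` refers to the 3D simplex noise function already included in the shader source, and the real shaders expose octaves, persistence, lacunarity, scale, and amplitude as uniforms.
+
+```glsl
+// Minimal FBM sketch: sum octaves of 3D simplex noise,
+// shrinking the amplitude and growing the frequency each octave.
+float fbm(vec3 p) {
+  float total = 0.0;
+  float amplitude = 0.5;
+  float frequency = 1.0;
+  for (int i = 0; i < 5; i++) {
+    total += snoise(p * frequency) * amplitude;
+    amplitude *= 0.5;
+    frequency *= 2.0;
+  }
+  return total;
+}
+```
+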
+## Live Demo
-## Assignment Details
-1. Take some time to go through the existing codebase so you can get an understanding of syntax and how the code is architected. Much of the code is designed to mirror the class structures used in CIS 460's OpenGL assignments, so it should hopefully be somewhat familiar.
-2. Take a look at the resources linked in the section below. Definitely read about Javascript modules and Typescript. The other links provide documentation for classes used in the code.
-3. Add a `Cube` class that inherits from `Drawable` and at the very least implement a constructor and its `create` function. Then, add a `Cube` instance to the scene to be rendered.
-4. Read the documentation for dat.GUI below. Update the existing GUI in `main.ts` with a parameter to alter the color passed to `u_Color` in the Lambert shader.
-5. Write a custom fragment shader that implements FBM, Worley Noise, or Perlin Noise based on 3D inputs (as opposed to the 2D inputs in the slides). This noise must be used to modify your fragment color. If your custom shader is particularly interesting, you'll earn some bonus points.
-6. Write a custom vertex shader that uses a trigonometric function (e.g. `sin`, `tan`) to non-uniformly modify your cube's vertex positions over time. This will necessitate instantiating an incrementing variable in your Typescript code that you pass to your shader every tick. Refer to the base code's methods of passing variables to shaders if you are unsure how to do so.
-7. Feel free to update any of the files when writing your code. The implementation of the `OpenGLRenderer` is currently very simple.
+https://seiseiko.github.io/hw00-webgl-intro/
-## Making a Live Demo
-When you push changes to the `master` branch of your repository on Github, a Github workflow will run automatically which builds your code and pushes the build to a new branch `gh-pages`. The configuration file which handles this is located at `.github/workflows/build-and-deploy.yml`. If you want to modify this, you can read more about workflows [here](https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions).
-
-Once your built code is pushed to `gh-pages`, Github can automatically publish a live site. Configure that by:
-
- 1. Open the Settings tab of your repository in Github.
-
- 2. Scroll down to the Pages tab of the Settings (in the table on the left) and choose which branch to make the source for the deployed project. This should be the `gh-pages` branch which is automatically created after the first successful build of the `master` branch.
-
- 3. Done! Now, any new commits on the `master` branch will be built and pushed to `gh-pages`. The project should be visible at http://username.github.io/repo-name.
-
-
-To check if everything is on the right track:
-
-1. Make sure the `gh-pages` branch of your repo has a files called `index.html`, `bundle.js`, and `bundle.js.map`
-
-2. In the settings tab of the repo, under Pages, make sure it says your site is published at some url.
-
-## Submission
-1. Create a pull request to this repository with your completed code.
-2. Update README.md to contain a solid description of your project with a screenshot of some visuals, and a link to your live demo.
-3. Submit the link to your pull request on Canvas, and add a comment to your submission with a hyperlink to your live demo.
-4. Include a link to your live site.
-
-## Resources
-- Javascript modules https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/import
-- Typescript https://www.typescriptlang.org/docs/home.html
-- dat.gui https://workshop.chromeexperiments.com/examples/gui/
-- glMatrix http://glmatrix.net/docs/
-- WebGL
- - Interfaces https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API
- - Types https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/Types
- - Constants https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/Constants
+## Screenshots
+
+![](screenshot.png)
+
+![](1.gif)
+
+![](2.gif)
diff --git a/package-lock.json b/package-lock.json
index e1a47a3..ee43328 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -1237,6 +1237,11 @@
"slash": "^3.0.0"
}
},
+ "glsl-specular-gaussian": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/glsl-specular-gaussian/-/glsl-specular-gaussian-1.0.0.tgz",
+ "integrity": "sha1-6z8oug332zBuurr8GU4osJDKwTg="
+ },
"graceful-fs": {
"version": "4.2.8",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.8.tgz",
diff --git a/package.json b/package.json
index 1627791..3cbf937 100644
--- a/package.json
+++ b/package.json
@@ -17,6 +17,7 @@
"3d-view-controls": "^2.2.2",
"dat.gui": "^0.7.7",
"gl-matrix": "^3.3.0",
+ "glsl-specular-gaussian": "^1.0.0",
"stats-js": "^1.0.1"
}
}
diff --git a/screenshot.png b/screenshot.png
new file mode 100644
index 0000000..01d6281
Binary files /dev/null and b/screenshot.png differ
diff --git a/src/Camera.ts b/src/Camera.ts
index 77a7610..b7a7909 100644
--- a/src/Camera.ts
+++ b/src/Camera.ts
@@ -34,6 +34,7 @@ class Camera {
update() {
this.controls.tick();
vec3.add(this.target, this.position, this.direction);
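+    // Keep position in sync with the interactive controls so the current eye point can be passed to shaders (u_CamPos)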
+ this.position = this.controls.eye;
mat4.lookAt(this.viewMatrix, this.controls.eye, this.controls.center, this.controls.up);
}
};
diff --git a/src/geometry/Cube.ts b/src/geometry/Cube.ts
new file mode 100644
index 0000000..e33570f
--- /dev/null
+++ b/src/geometry/Cube.ts
@@ -0,0 +1,117 @@
+import {vec3, vec4} from 'gl-matrix';
+import Drawable from '../rendering/gl/Drawable';
+import {gl} from '../globals';
+
+class Cube extends Drawable {
+ indices: Uint32Array;
+ positions: Float32Array;
+ normals: Float32Array;
+
+ color: vec4;
+ colors: Float32Array;
+ center: vec4;
+
+ constructor(center: vec3) {
+ super(); // Call the constructor of the super class. This is required.
+ this.center = vec4.fromValues(center[0], center[1], center[2], 1);
+ }
+
+  create() {
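+    // Per-face normals, four vertices per face, in the order -Z, +Z, -X, +X, -Y, +Y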
+ this.normals = new Float32Array([0, 0, -1, 0,
+ 0, 0, -1, 0,
+ 0, 0, -1, 0,
+ 0, 0, -1, 0,
+
+ 0, 0, 1, 0,
+ 0, 0, 1, 0,
+ 0, 0, 1, 0,
+ 0, 0, 1, 0,
+
+ -1, 0, 0, 0,
+ -1, 0, 0, 0,
+ -1, 0, 0, 0,
+ -1, 0, 0, 0,
+
+ 1, 0, 0, 0,
+ 1, 0, 0, 0,
+ 1, 0, 0, 0,
+ 1, 0, 0, 0,
+
+ 0, -1, 0, 0,
+ 0, -1, 0, 0,
+ 0, -1, 0, 0,
+ 0, -1, 0, 0,
+
+ 0, 1, 0, 0,
+ 0, 1, 0, 0,
+ 0, 1, 0, 0,
+ 0, 1, 0, 0,
+
+ ]);
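+    // Corresponding vertex positions for each face; x and y span [-0.5, 0.5], z spans [0, 1] (this.center is not applied here)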
+
+ this.positions = new Float32Array([-0.5, -0.5, 0, 1,
+ 0.5, -0.5, 0, 1,
+ 0.5, 0.5, 0, 1,
+ -0.5, 0.5, 0, 1,
+
+ -0.5, -0.5, 1, 1,
+ 0.5, -0.5, 1, 1,
+ 0.5, 0.5, 1, 1,
+ -0.5, 0.5, 1, 1,
+
+ -0.5, -0.5, 0, 1,
+ -0.5, -0.5, 1, 1,
+ -0.5, 0.5, 1, 1,
+ -0.5, 0.5, 0, 1,
+
+ 0.5, -0.5, 0, 1,
+ 0.5, -0.5, 1, 1,
+ 0.5, 0.5, 1, 1,
+ 0.5, 0.5, 0, 1,
+
+ -0.5, -0.5, 0, 1,
+ -0.5, -0.5, 1, 1,
+ 0.5, -0.5, 1, 1,
+ 0.5, -0.5, 0, 1,
+
+ -0.5, 0.5, 0, 1,
+ -0.5, 0.5, 1, 1,
+ 0.5, 0.5, 1, 1,
+ 0.5, 0.5, 0, 1,
+ ]);
+
+    this.indices = new Uint32Array([0, 1, 2,
+ 0, 2, 3,
+ 4, 5, 6,
+ 4, 6, 7,
+ 8, 9, 10,
+ 8, 10, 11,
+ 12, 13, 14,
+ 12, 14, 15,
+ 16, 17, 18,
+ 16, 18, 19,
+ 20, 21, 22,
+ 20, 22, 23
+
+ ]);
+ this.generateIdx();
+ this.generatePos();
+ this.generateNor();
+
+ this.count = this.indices.length;
+ gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.bufIdx);
+ gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, this.indices, gl.STATIC_DRAW);
+
+ gl.bindBuffer(gl.ARRAY_BUFFER, this.bufNor);
+ gl.bufferData(gl.ARRAY_BUFFER, this.normals, gl.STATIC_DRAW);
+
+ gl.bindBuffer(gl.ARRAY_BUFFER, this.bufPos);
+ gl.bufferData(gl.ARRAY_BUFFER, this.positions, gl.STATIC_DRAW);
+
+ console.log(`Created cube`);
+ }
+};
+
+export default Cube;
diff --git a/src/geometry/Icosphere.ts b/src/geometry/Icosphere.ts
index 412995b..e5e96ed 100644
--- a/src/geometry/Icosphere.ts
+++ b/src/geometry/Icosphere.ts
@@ -13,7 +13,38 @@ class Icosphere extends Drawable {
super(); // Call the constructor of the super class. This is required.
this.center = vec4.fromValues(center[0], center[1], center[2], 1);
}
-
+ loadTexture(url: string) {
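+    // Upload a 1x1 opaque-blue placeholder immediately; the real image replaces it asynchronously once it loads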
+ const texture = this.generateTex();
+ gl.bindTexture(gl.TEXTURE_2D, texture);
+
+ const level = 0;
+ const internalFormat = gl.RGBA;
+ const width = 1;
+ const height = 1;
+ const border = 0;
+ const srcFormat = gl.RGBA;
+ const srcType = gl.UNSIGNED_BYTE;
+ const pixel = new Uint8Array([0, 0, 255, 255]); // opaque blue
+ gl.texImage2D(gl.TEXTURE_2D, level, internalFormat,
+ width, height, border, srcFormat, srcType,
+ pixel);
+
+ const image = new Image();
+ image.crossOrigin = "anonymous";
+ image.onload = function() {
+ gl.bindTexture(gl.TEXTURE_2D, texture);
+ gl.texImage2D(gl.TEXTURE_2D, level, internalFormat,
+ srcFormat, srcType, image);
+ gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
+ gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
+ gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
+
+ };
+ image.src = url;
+
+ this.texture = texture;
+ console.log(this.texture);
+ }
create() {
const X = 0.525731112119133606;
const Z = 0.850650808352039932;
diff --git a/src/main.ts b/src/main.ts
index 65a9461..fe58c89 100644
--- a/src/main.ts
+++ b/src/main.ts
@@ -1,8 +1,9 @@
-import {vec3} from 'gl-matrix';
+import {vec3,vec4} from 'gl-matrix';
const Stats = require('stats-js');
import * as DAT from 'dat.gui';
import Icosphere from './geometry/Icosphere';
import Square from './geometry/Square';
+import Cube from './geometry/Cube';
import OpenGLRenderer from './rendering/gl/OpenGLRenderer';
import Camera from './Camera';
import {setGL} from './globals';
@@ -13,19 +14,26 @@ import ShaderProgram, {Shader} from './rendering/gl/ShaderProgram';
const controls = {
tesselations: 5,
'Load Scene': loadScene, // A function pointer, essentially
+ color:[122,0,255],
+ 'Set Text': Settexture,
};
let icosphere: Icosphere;
let square: Square;
+let cube: Cube;
+let time : number = 0;
let prevTesselations: number = 5;
-
+let set_text: number = -1;
function loadScene() {
icosphere = new Icosphere(vec3.fromValues(0, 0, 0), 1, controls.tesselations);
icosphere.create();
- square = new Square(vec3.fromValues(0, 0, 0));
- square.create();
+ if(set_text==1){
+ icosphere.loadTexture('https://images-wixmp-ed30a86b8c4ca887773594c2.wixmp.com/f/ab371d58-f694-4953-a2e5-c79acedd9f56/dcuxgeq-1005a082-f321-4d7c-80d7-5cb2e4ffda89.png?token=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJ1cm46YXBwOjdlMGQxODg5ODIyNjQzNzNhNWYwZDQxNWVhMGQyNmUwIiwiaXNzIjoidXJuOmFwcDo3ZTBkMTg4OTgyMjY0MzczYTVmMGQ0MTVlYTBkMjZlMCIsIm9iaiI6W1t7InBhdGgiOiJcL2ZcL2FiMzcxZDU4LWY2OTQtNDk1My1hMmU1LWM3OWFjZWRkOWY1NlwvZGN1eGdlcS0xMDA1YTA4Mi1mMzIxLTRkN2MtODBkNy01Y2IyZTRmZmRhODkucG5nIn1dXSwiYXVkIjpbInVybjpzZXJ2aWNlOmZpbGUuZG93bmxvYWQiXX0.wAvefXk3lTLluz8RfuvjXvRWBMik2psG6kWva8Fbe2I');
+ }
+}
+function Settexture(){
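+  // Toggle between -1 (texture off) and 1 (texture on); the texture itself is loaded on the next 'Load Scene'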
+ set_text = -set_text;
}
-
function main() {
// Initial display for framerate
const stats = Stats();
@@ -37,8 +45,93 @@ function main() {
// Add controls to the gui
const gui = new DAT.GUI();
- gui.add(controls, 'tesselations', 0, 8).step(1);
+ gui.add(controls, 'tesselations', 0, 9).step(1);
gui.add(controls, 'Load Scene');
+ gui.add(controls, 'Set Text');
+
+
+ // set up GUI
+ var c = gui.addFolder('Terrain general setting');
+ const continent_obj = {
+ oceanDepthMultiplier:10.0,
+ oceanFloorDepth:1.0,
+ oceanFloorSmoothing:0.5,
+ mountainBlend:0.2
+ };
+ c.add(continent_obj,'oceanDepthMultiplier',0,100);
+ c.add(continent_obj,'oceanFloorDepth',0,10);
+ c.add(continent_obj,'oceanFloorSmoothing',0,10);
+ c.add(continent_obj,'mountainBlend',0,10);
+ c.open();
+
+ var d = gui.addFolder('Light Position');
+ const lightpos_obj = {
+ x:5,
+ y:5,
+ z:3,
+ };
+ d.add(lightpos_obj,'x',-10,10);
+ d.add(lightpos_obj,'y',-10,10);
+ d.add(lightpos_obj,'z',-10,10);
+ var f = gui.addFolder('Noise Control - Continent');
+ const noise_con_obj = {
+ octaves: 5,
+ persistance: 0.5,
+ lacunarity: 0.5,
+ scale: 0.8,
+ multiplier: 4.0,
+ vertical_shift: 0.1,
+ amplitude: 2.0
+ };
+ f.add(noise_con_obj, 'octaves', 1, 9).step(1);
+ f.add(noise_con_obj, 'persistance',0.00,100.0);
+ f.add(noise_con_obj, 'lacunarity',0.00,100.0);
+ f.add(noise_con_obj, 'scale',0.00,100.00);
+ f.add(noise_con_obj, 'multiplier',0.00,100.0);
+ f.add(noise_con_obj, 'vertical_shift',0.00,100.0);
+ f.add(noise_con_obj, 'amplitude',0.00,100.0);
+
+ var f_2 = gui.addFolder('Noise Control - Ridge');
+ const noise_rid_obj = {
+ octaves: 5,
+ persistance: 0.5,
+ lacunarity: 0.5,
+ scale: 1.5,
+ multiplier: 11.0,
+ power:3.0,
+ gain:0.8,
+ vertical_shift: 0.0,
+ amplitude: 2.0
+ };
+ f_2.add(noise_rid_obj, 'octaves', 1, 9).step(1);
+ f_2.add(noise_rid_obj, 'persistance',0.00,100.0);
+ f_2.add(noise_rid_obj, 'lacunarity',0.00,100.0);
+ f_2.add(noise_rid_obj, 'scale',0.00,100.0);
+ f_2.add(noise_rid_obj, 'multiplier',0.00,100.0);
+ f_2.add(noise_rid_obj, 'vertical_shift',0.00,100.0);
+ f_2.add(noise_rid_obj, 'power',0.00,100.0);
+ f_2.add(noise_rid_obj, 'gain',0.00,100.0);
+ f_2.add(noise_rid_obj, 'amplitude',0.00,100.0);
+
+ var f_3 = gui.addFolder('Noise Control - Mountain Mask');
+ const noise_mask_obj = {
+ octaves: 5,
+ persistance: 0.5,
+ lacunarity: 0.5,
+ scale: 0.8,
+ multiplier: 0.2,
+ vertical_shift: 0.1,
+ amplitude: 2.0
+ };
+ f_3.add(noise_mask_obj, 'octaves', 1, 9).step(1);
+ f_3.add(noise_mask_obj, 'persistance',0.0,1.0);
+ f_3.add(noise_mask_obj, 'lacunarity',0.1,0.9);
+ f_3.add(noise_mask_obj, 'scale',0.01,1.0);
+ f_3.add(noise_mask_obj, 'multiplier',0.01,10.0);
+ f_3.add(noise_mask_obj, 'vertical_shift',0.01,10.0);
+ f_3.add(noise_mask_obj, 'amplitude',0.01,10.0);
+
+
// get canvas and webgl context
const canvas = document.getElementById('canvas');
@@ -76,9 +169,38 @@ function main() {
icosphere = new Icosphere(vec3.fromValues(0, 0, 0), 1, prevTesselations);
icosphere.create();
}
+
+    // Advance and upload the per-frame counter used as u_Time in the shaders
+ lambert.setTime(time++);
+
+ // setup shader controllable stuff
+ lambert.setGeometryColor(vec4.fromValues(controls.color[0]/255.0,
+ controls.color[1]/255.0,controls.color[2]/255.0, 1));
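+    // Pack the GUI noise parameters into flat arrays matching the float[] uniforms declared in lambert-vert.glsl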
+ let noise_parameter_continent_array: Float32Array = new Float32Array([noise_con_obj.octaves,
+ noise_con_obj.persistance,noise_con_obj.lacunarity,
+ noise_con_obj.scale,noise_con_obj.multiplier,
+ noise_con_obj.vertical_shift,noise_con_obj.amplitude]);
+ let noise_parameter_ridge_array: Float32Array = new Float32Array([noise_rid_obj.octaves,
+ noise_rid_obj.persistance,noise_rid_obj.lacunarity,noise_rid_obj.scale,
+ noise_rid_obj.multiplier,noise_rid_obj.power,noise_rid_obj.gain,
+ noise_rid_obj.vertical_shift,noise_rid_obj.amplitude]);
+ let noise_parameter_mask_array: Float32Array = new Float32Array([noise_mask_obj.octaves,
+ noise_mask_obj.persistance,noise_mask_obj.lacunarity,noise_mask_obj.scale,
+ noise_mask_obj.multiplier,noise_mask_obj.vertical_shift,noise_mask_obj.amplitude]);
+
+ lambert.setNoise_Con(noise_parameter_continent_array);
+ lambert.setNoise_Ridge(noise_parameter_ridge_array);
+ lambert.setNoise_Mask(noise_parameter_mask_array);
+ lambert.setoceanDepthMultiplier(continent_obj.oceanDepthMultiplier);
+ lambert.setoceanFloorDepth(continent_obj.oceanFloorDepth);
+ lambert.setoceanFloorSmoothing(continent_obj.oceanFloorSmoothing);
+ lambert.setmountainBlend(continent_obj.mountainBlend);
+
+ lambert.setTextBool(set_text);
+ lambert.setCam(vec4.fromValues(camera.position[0],camera.position[1],camera.position[2],1.0));
+ lambert.setLight(vec4.fromValues(lightpos_obj.x,lightpos_obj.y,lightpos_obj.z,1.0));
renderer.render(camera, lambert, [
- icosphere,
- // square,
+ icosphere
]);
stats.end();
diff --git a/src/rendering/gl/Drawable.ts b/src/rendering/gl/Drawable.ts
index 3006b5c..84fc52d 100644
--- a/src/rendering/gl/Drawable.ts
+++ b/src/rendering/gl/Drawable.ts
@@ -6,17 +6,20 @@ abstract class Drawable {
bufIdx: WebGLBuffer;
bufPos: WebGLBuffer;
bufNor: WebGLBuffer;
+
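+  // Optional texture attached to this drawable (set by Icosphere.loadTexture)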
+ texture: WebGLTexture;
+ texBound: boolean = false;
idxBound: boolean = false;
posBound: boolean = false;
norBound: boolean = false;
-
abstract create() : void;
destory() {
gl.deleteBuffer(this.bufIdx);
gl.deleteBuffer(this.bufPos);
gl.deleteBuffer(this.bufNor);
+ gl.deleteTexture(this.texture);
}
generateIdx() {
@@ -33,6 +36,12 @@ abstract class Drawable {
this.norBound = true;
this.bufNor = gl.createBuffer();
}
+
+ generateTex(): WebGLTexture{
+ this.texBound = true;
+ this.texture = gl.createTexture();
+ return this.texture;
+ }
bindIdx(): boolean {
if (this.idxBound) {
@@ -54,6 +63,16 @@ abstract class Drawable {
}
return this.norBound;
}
+ bindTex(): boolean {
+ if(this.texBound) {
+ gl.bindTexture(gl.TEXTURE_2D, this.texture);
+ }
+ return this.texBound;
+ }
+
+ getTex(): WebGLTexture{
+ return this.texture;
+ }
elemCount(): number {
return this.count;
diff --git a/src/rendering/gl/OpenGLRenderer.ts b/src/rendering/gl/OpenGLRenderer.ts
index 7e527c2..508994b 100644
--- a/src/rendering/gl/OpenGLRenderer.ts
+++ b/src/rendering/gl/OpenGLRenderer.ts
@@ -25,17 +25,23 @@ class OpenGLRenderer {
render(camera: Camera, prog: ShaderProgram, drawables: Array) {
let model = mat4.create();
let viewProj = mat4.create();
- let color = vec4.fromValues(1, 0, 0, 1);
mat4.identity(model);
mat4.multiply(viewProj, camera.projectionMatrix, camera.viewMatrix);
prog.setModelMatrix(model);
prog.setViewProjMatrix(viewProj);
- prog.setGeometryColor(color);
-
-    for (let drawable of drawables) {
-      prog.draw(drawable);
-    }
+    for (let drawable of drawables) {
+      if (drawable) {
+        // Bind this drawable's texture (if it has one) before issuing its draw call
+        if (drawable.bindTex()) {
+          prog.setText(drawable.getTex());
+        }
+        prog.draw(drawable);
+      }
+    }
}
};
diff --git a/src/rendering/gl/ShaderProgram.ts b/src/rendering/gl/ShaderProgram.ts
index 67fef40..2499f2e 100644
--- a/src/rendering/gl/ShaderProgram.ts
+++ b/src/rendering/gl/ShaderProgram.ts
@@ -1,4 +1,4 @@
-import {vec4, mat4} from 'gl-matrix';
+import {vec3,vec4, mat4} from 'gl-matrix';
import Drawable from './Drawable';
import {gl} from '../../globals';
@@ -25,11 +25,22 @@ class ShaderProgram {
attrNor: number;
attrCol: number;
+ unifTime: WebGLUniformLocation;
+ unifNoise_Con: WebGLUniformLocation;
+ unifNoise_Ridge: WebGLUniformLocation;
+ unifNoise_Mask: WebGLUniformLocation;
unifModel: WebGLUniformLocation;
unifModelInvTr: WebGLUniformLocation;
unifViewProj: WebGLUniformLocation;
unifColor: WebGLUniformLocation;
-
+  unifoceanDepthMultiplier: WebGLUniformLocation;
+  unifoceanFloorDepth: WebGLUniformLocation;
+  unifoceanFloorSmoothing: WebGLUniformLocation;
+  unifmountainBlend: WebGLUniformLocation;
+  unifCameraPos: WebGLUniformLocation;
+  unifLightPos: WebGLUniformLocation;
+  unifText: WebGLUniformLocation;     // sampler uniform for the planet texture ("Text" = texture)
+  unifTextBool: WebGLUniformLocation; // float flag toggling the texture lookup
constructor(shaders: Array) {
this.prog = gl.createProgram();
@@ -40,14 +51,30 @@ class ShaderProgram {
if (!gl.getProgramParameter(this.prog, gl.LINK_STATUS)) {
throw gl.getProgramInfoLog(this.prog);
}
-
+ // Vertex
this.attrPos = gl.getAttribLocation(this.prog, "vs_Pos");
this.attrNor = gl.getAttribLocation(this.prog, "vs_Nor");
this.attrCol = gl.getAttribLocation(this.prog, "vs_Col");
+
+ // Camera Stuff
+ this.unifCameraPos = gl.getUniformLocation(this.prog,"u_CamPos");
this.unifModel = gl.getUniformLocation(this.prog, "u_Model");
this.unifModelInvTr = gl.getUniformLocation(this.prog, "u_ModelInvTr");
this.unifViewProj = gl.getUniformLocation(this.prog, "u_ViewProj");
+ this.unifLightPos = gl.getUniformLocation(this.prog,"u_Light_pos");
+
+ // Customized
this.unifColor = gl.getUniformLocation(this.prog, "u_Color");
+ this.unifTime = gl.getUniformLocation(this.prog, "u_Time");
+ this.unifoceanDepthMultiplier = gl.getUniformLocation(this.prog, "oceanDepthMultiplier");
+ this.unifoceanFloorDepth = gl.getUniformLocation(this.prog, "oceanFloorDepth");
+ this.unifoceanFloorSmoothing = gl.getUniformLocation(this.prog, "oceanFloorSmoothing");
+ this.unifmountainBlend = gl.getUniformLocation(this.prog, "mountainBlend");
+ this.unifNoise_Con = gl.getUniformLocation(this.prog, "noise_params_continent");
+ this.unifNoise_Ridge = gl.getUniformLocation(this.prog, "noise_params_ridge");
+ this.unifNoise_Mask = gl.getUniformLocation(this.prog, "noise_params_mask");
+ this.unifText = gl.getUniformLocation(this.prog, "u_Text");
+ this.unifTextBool = gl.getUniformLocation(this.prog, "u_TextBool");
}
use() {
@@ -56,7 +83,27 @@ class ShaderProgram {
activeProgram = this.prog;
}
}
-
+  setCam(pos: vec4) {
+    this.use();
+    if (this.unifCameraPos !== -1) {
+      gl.uniform4fv(this.unifCameraPos, pos);
+    }
+  }
+  setLight(pos: vec4) {
+    this.use();
+    if (this.unifLightPos !== -1) {
+      gl.uniform4fv(this.unifLightPos, pos);
+    }
+  }
+
+ setText(text: WebGLTexture){
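+    // Bind the given texture to texture unit 0 and point the u_Text sampler at it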
+ this.use();
+ if (this.unifText !== -1) {
+ gl.activeTexture(gl.TEXTURE0);
+ gl.bindTexture(gl.TEXTURE_2D, text);
+ gl.uniform1i(this.unifText, 0);
+ }
+ }
setModelMatrix(model: mat4) {
this.use();
if (this.unifModel !== -1) {
@@ -85,6 +132,65 @@ class ShaderProgram {
}
}
+ // time
+
+ setTime(time: number) {
+ this.use();
+ if (this.unifTime !== -1) {
+ gl.uniform1f(this.unifTime, time);
+ }
+ }
+ setoceanDepthMultiplier(oceanDepthMultiplier: number) {
+ this.use();
+ if (this.unifoceanDepthMultiplier !== -1) {
+ gl.uniform1f(this.unifoceanDepthMultiplier, oceanDepthMultiplier);
+ }
+ }
+
+ setoceanFloorDepth(oceanFloorDepth: number) {
+ this.use();
+ if (this.unifoceanFloorDepth !== -1) {
+ gl.uniform1f(this.unifoceanFloorDepth, oceanFloorDepth);
+ }
+ }
+ setoceanFloorSmoothing(oceanFloorSmoothing: number) {
+ this.use();
+ if (this.unifoceanFloorSmoothing !== -1) {
+ gl.uniform1f(this.unifoceanFloorSmoothing,oceanFloorSmoothing);
+ }
+ }
+ setmountainBlend(mountainBlend: number) {
+ this.use();
+ if (this.unifmountainBlend !== -1) {
+ gl.uniform1f(this.unifmountainBlend, mountainBlend);
+ }
+ }
+ setTextBool(n: number) {
+ this.use();
+ if (this.unifTextBool !== -1) {
+ gl.uniform1f(this.unifTextBool, n);
+ }
+ }
+ setNoise_Con(noise_par: Float32Array) {
+ this.use();
+ if (this.unifNoise_Con !== -1) {
+ gl.uniform1fv(this.unifNoise_Con, noise_par);
+ }
+ }
+ setNoise_Ridge(noise_par: Float32Array) {
+ this.use();
+ if (this.unifNoise_Ridge !== -1) {
+ gl.uniform1fv(this.unifNoise_Ridge, noise_par);
+ }
+ }
+ setNoise_Mask(noise_par: Float32Array) {
+ this.use();
+ if (this.unifNoise_Mask!== -1) {
+ gl.uniform1fv(this.unifNoise_Mask, noise_par);
+ }
+ }
+
+
draw(d: Drawable) {
this.use();
diff --git a/src/shaders/lambert-frag.glsl b/src/shaders/lambert-frag.glsl
index 2b8e11b..2e57434 100644
--- a/src/shaders/lambert-frag.glsl
+++ b/src/shaders/lambert-frag.glsl
@@ -1,5 +1,4 @@
#version 300 es
-
// This is a fragment shader. If you've opened this file first, please
// open and read lambert.vert.glsl before reading on.
// Unlike the vertex shader, the fragment shader actually does compute
@@ -10,34 +9,273 @@
// can compute what color to apply to its pixel based on things like vertex
// position, light position, and vertex color.
precision highp float;
+// ************************** 3D Simplex Noise*****************************
+ // Usage: float snoise(vec3 v) returning (-1,1)
+ // Description : Array and textureless GLSL 2D/3D/4D simplex
+ // noise functions.
+ // Author : Ian McEwan, Ashima Arts.
+ // Maintainer : stegu
+ // Lastmod : 20201014 (stegu)
+ // License : Copyright (C) 2011 Ashima Arts. All rights reserved.
+ // Distributed under the MIT License. See LICENSE file.
+ // https://github.com/ashima/webgl-noise
+ // https://github.com/stegu/webgl-noise
+//
-uniform vec4 u_Color; // The color with which to render this instance of geometry.
+ vec3 mod289(vec3 x) {
+ return x - floor(x * (1.0 / 289.0)) * 289.0;
+ }
+
+ vec4 mod289(vec4 x) {
+ return x - floor(x * (1.0 / 289.0)) * 289.0;
+ }
+
+ vec4 permute(vec4 x) {
+ return mod289(((x*34.0)+10.0)*x);
+ }
+
+ vec4 taylorInvSqrt(vec4 r)
+ {
+ return 1.79284291400159 - 0.85373472095314 * r;
+ }
+
+ float snoise(vec3 v)
+ {
+ const vec2 C = vec2(1.0/6.0, 1.0/3.0) ;
+ const vec4 D = vec4(0.0, 0.5, 1.0, 2.0);
+
+ // First corner
+ vec3 i = floor(v + dot(v, C.yyy) );
+ vec3 x0 = v - i + dot(i, C.xxx) ;
+
+ // Other corners
+ vec3 g = step(x0.yzx, x0.xyz);
+ vec3 l = 1.0 - g;
+ vec3 i1 = min( g.xyz, l.zxy );
+ vec3 i2 = max( g.xyz, l.zxy );
+
+ // x0 = x0 - 0.0 + 0.0 * C.xxx;
+ // x1 = x0 - i1 + 1.0 * C.xxx;
+ // x2 = x0 - i2 + 2.0 * C.xxx;
+ // x3 = x0 - 1.0 + 3.0 * C.xxx;
+ vec3 x1 = x0 - i1 + C.xxx;
+ vec3 x2 = x0 - i2 + C.yyy; // 2.0*C.x = 1/3 = C.y
+ vec3 x3 = x0 - D.yyy; // -1.0+3.0*C.x = -0.5 = -D.y
+
+ // Permutations
+ i = mod289(i);
+ vec4 p = permute( permute( permute(
+ i.z + vec4(0.0, i1.z, i2.z, 1.0 ))
+ + i.y + vec4(0.0, i1.y, i2.y, 1.0 ))
+ + i.x + vec4(0.0, i1.x, i2.x, 1.0 ));
+
+ // Gradients: 7x7 points over a square, mapped onto an octahedron.
+ // The ring size 17*17 = 289 is close to a multiple of 49 (49*6 = 294)
+ float n_ = 0.142857142857; // 1.0/7.0
+ vec3 ns = n_ * D.wyz - D.xzx;
+
+ vec4 j = p - 49.0 * floor(p * ns.z * ns.z); // mod(p,7*7)
+
+ vec4 x_ = floor(j * ns.z);
+ vec4 y_ = floor(j - 7.0 * x_ ); // mod(j,N)
+
+ vec4 x = x_ *ns.x + ns.yyyy;
+ vec4 y = y_ *ns.x + ns.yyyy;
+ vec4 h = 1.0 - abs(x) - abs(y);
+
+ vec4 b0 = vec4( x.xy, y.xy );
+ vec4 b1 = vec4( x.zw, y.zw );
+
+ //vec4 s0 = vec4(lessThan(b0,0.0))*2.0 - 1.0;
+ //vec4 s1 = vec4(lessThan(b1,0.0))*2.0 - 1.0;
+ vec4 s0 = floor(b0)*2.0 + 1.0;
+ vec4 s1 = floor(b1)*2.0 + 1.0;
+ vec4 sh = -step(h, vec4(0.0));
+
+ vec4 a0 = b0.xzyw + s0.xzyw*sh.xxyy ;
+ vec4 a1 = b1.xzyw + s1.xzyw*sh.zzww ;
+
+ vec3 p0 = vec3(a0.xy,h.x);
+ vec3 p1 = vec3(a0.zw,h.y);
+ vec3 p2 = vec3(a1.xy,h.z);
+ vec3 p3 = vec3(a1.zw,h.w);
+
+ //Normalise gradients
+ vec4 norm = taylorInvSqrt(vec4(dot(p0,p0), dot(p1,p1), dot(p2, p2), dot(p3,p3)));
+ p0 *= norm.x;
+ p1 *= norm.y;
+ p2 *= norm.z;
+ p3 *= norm.w;
+ // Mix final noise value
+ vec4 m = max(0.5 - vec4(dot(x0,x0), dot(x1,x1), dot(x2,x2), dot(x3,x3)), 0.0);
+ m = m * m;
+ return 105.0 * dot( m*m, vec4( dot(p0,x0), dot(p1,x1),
+ dot(p2,x2), dot(p3,x3) ) );
+ }
+
+// ******************************** noise end **********************************
+float remap01(float v, float minOld, float maxOld) {
+ return clamp((v-minOld) / (maxOld-minOld),0.0,1.0);
+}
+// **************************
+// Uniforms
+
+uniform float u_TextBool;
+uniform vec4 u_CamPos;
+uniform sampler2D u_Text;
+uniform vec4 u_Color; // The color with which to render this instance of geometry.
+uniform float u_Time;
+uniform mat4 u_Model;
// These are the interpolated values out of the rasterizer, so you can't know
// their specific values without knowing the vertices that contributed to them
in vec4 fs_Nor;
in vec4 fs_LightVec;
in vec4 fs_Col;
-
+in vec4 fs_Pos;
+in float fs_elevation;
out vec4 out_Col; // This is the final output color that you will see on your
// screen for the pixel that is currently being processed.
+// Bias curve: remaps x in [0,1]; larger b pushes values toward 0 (b = 0 is the identity)
+float bias(float x, float b) {
+  float k = pow(1.0 - b, 3.0);
+  return (x * k) / (x * k - x + 1.0);
+}
+// ***************Raycast sphere *******
+ // Returns dstToSphere, dstThroughSphere
+ // If inside sphere, dstToSphere will be 0
+ // If ray misses sphere, returns (-1.0, 0.0)
+ // Given rayDir must be normalized
+ vec2 raySphere(vec3 centre, float radius, vec3 rayOrigin, vec3 rayDir) {
+ vec3 offset = rayOrigin - centre;
+ const float a = 1.0; // set to dot(rayDir, rayDir) instead if rayDir may not be normalized
+ float b = 2.0 * dot(offset, rayDir);
+ float c = dot (offset, offset) - radius * radius;
+
+ float discriminant = b*b-4.0*a*c;
+ // No intersections: discriminant < 0
+ // 1 intersection: discriminant == 0
+ // 2 intersections: discriminant > 0
+ if (discriminant > 0.0) {
+ float s = sqrt(discriminant);
+ float dstToSphereNear = max(0.0, (-b - s) / (2.0 * a));
+ float dstToSphereFar = (-b + s) / (2.0 * a);
+
+ if (dstToSphereFar >= 0.0) {
+ return vec2(dstToSphereNear, dstToSphereFar - dstToSphereNear);
+ }
+ }
+ // Ray did not intersect sphere
+ return vec2(-1.0, 0.0);
+ }
+
+  // Approximate how much ocean water lies between the camera and this fragment.
+  // Returns a positive blend factor when the fragment lies behind the unit ocean
+  // sphere's surface as seen from the camera, and -1.0 when no water is in front of it.
+  float ocean_cal(vec3 p){
+    vec3 view = normalize(fs_Pos.xyz - u_CamPos.xyz);
+    vec2 r = raySphere(vec3(0.0), 1.0, vec3(u_CamPos), view);
+    float l = length(fs_Pos.xyz - u_CamPos.xyz);
+    if (r.x > 0.0 && l > r.x) {
+      return (l - r.x) * 2.0;
+    }
+    return -1.0;
+  }
+
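+// Cosine-based color palette: smoothly cycles colors with t (used to animate the snow and specular tints)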
+vec3 palette(float t, vec3 a, vec3 b, vec3 c, vec3 d )
+{
+ return a + b*cos( 6.28318*(c*t+d) );
+}
+
void main()
{
- // Material base color (before shading)
- vec4 diffuseColor = u_Color;
+ vec4 mountain_color = vec4(0.2549, 0.251, 0.251, 1.0);
+ vec4 grass_color = vec4(0.1804, 0.1922, 0.1804, 1.0);
+ vec3 a = vec3(0.5, 0.5, 0.5);
+ vec3 b = vec3(0.5, 0.5, 0.5);
+ vec3 c = vec3(1.0, 1.0, 0.5);
+ vec3 d = vec3(0.80, 0.90, 0.30);
+ vec4 snow_color = vec4(palette(sin(u_Time*0.002),a,b,c,d),1.0);
+
+ vec4 shore_color = vec4(0.3922, 0.3451, 0.2745, 1.0);
+ vec4 ocean_color = vec4(0.5451, 0.8549, 1.0, 1.0);
+ a=vec3(0.5, 0.5, 0.5);b=vec3(0.5, 0.5, 0.5);c=vec3(2.0, 1.0, 0.0);d=vec3(0.50, 0.20, 0.25);
+ vec3 SpecularColor = palette(sin(u_Time*0.001),a,b,c,d)*0.5;
+ vec4 diffuseColor = grass_color*0.5;
+ vec3 specularTerm = vec3(0.0);
+ float shininess = 0.2;
+ // ************* blend color ***************** //
+
+ // calculate steepness
+ vec3 localNormal = normalize(fs_Pos.xyz);
+ float steepness = 1.0 - dot(localNormal, vec3(fs_Nor));
+ steepness = remap01(steepness, 0.0, 0.05);
+
+ // blend some ocean color based on ray-tracing result
+ float steepThreshold=0.7;
+ float elevationThreshold=0.3;
+ // steep weight calculation & weight
+ vec3 steepCol = vec3(mountain_color);
+ float noise = snoise(vec3(fs_Pos))*0.05;
+ // flat col
+ vec3 flatCol = vec3(grass_color);
+ float flatStrength = 1.0 - bias(steepness,0.8)*0.5;
+
+ vec3 compositeCol = mix(steepCol, flatCol, flatStrength);
+ // shore
+ if(fs_elevation<=(0.1+noise+sin(u_Time*0.01)*0.01)){
+ float shoreStrength = 1.0 - bias(fs_elevation*10.0,0.3);
+ compositeCol = mix(compositeCol,vec3(shore_color),shoreStrength);
+ }
+
+ // threshold
+ if(fs_elevation>=(0.18+noise*sin(u_Time)*0.01)){
+ float le = 0.18+noise*sin(u_Time)*0.01;
+ float snowStrength = bias((fs_elevation-le)/(0.3-le),0.2);
+ snowStrength = clamp(snowStrength,0.0,1.0);
+ compositeCol = mix(compositeCol,vec3(snow_color),snowStrength);
+ shininess += clamp(snowStrength,0.0,0.5);
+ }
+
+ diffuseColor += vec4(compositeCol,1.0);
+
+ vec3 view = -normalize(fs_Pos.xyz - u_CamPos.xyz);
+ vec3 light = normalize(fs_LightVec.xyz);
+ vec3 halfVec = view.xyz + light.xyz;
+ halfVec = normalize(halfVec);
+ float NoH = clamp(dot( fs_Nor.xyz, halfVec ), 0.0, 1.0);
+ specularTerm = vec3(pow(clamp(NoH, 0.0, 1.0), pow(200.0, shininess))) * SpecularColor * shininess;
+
+ if(u_TextBool==1.0){
+ vec3 n = normalize(fs_Pos.xyz - vec3(0.0));
+ float u = atan(n.x, n.z) / (2.0*3.14159) + 0.5;
+ float v = n.y*0.5+0.5;
+ vec4 t = texture(u_Text,vec2(u,v));
+ if(length(t.xyz)<=0.2){
+ float textStrength = 1.0 - bias(length(t.xyz)/1.8,0.5);
+ diffuseColor = mix(diffuseColor,t,textStrength);
+ }
+ }
+ float ocean_blend= ocean_cal(vec3(fs_Pos));
+ if(ocean_blend>0.0){
+ diffuseColor = vec4(ocean_blend * ocean_color.xyz + vec3(noise), ocean_color.w);
+ }
// Calculate the diffuse term for Lambert shading
float diffuseTerm = dot(normalize(fs_Nor), normalize(fs_LightVec));
// Avoid negative lighting values
- // diffuseTerm = clamp(diffuseTerm, 0, 1);
+ diffuseTerm = clamp(diffuseTerm, 0.0, 1.0);
- float ambientTerm = 0.2;
+ float ambientTerm = 0.8;
float lightIntensity = diffuseTerm + ambientTerm; //Add a small float value to the color multiplier
//to simulate ambient lighting. This ensures that faces that are not
//lit by our point light are not completely black.
- // Compute final shaded color
- out_Col = vec4(diffuseColor.rgb * lightIntensity, diffuseColor.a);
+
+ out_Col = vec4((diffuseColor.rgb+ specularTerm) * lightIntensity, diffuseColor.a);
}
diff --git a/src/shaders/lambert-vert.glsl b/src/shaders/lambert-vert.glsl
index 7f95a37..33b9612 100644
--- a/src/shaders/lambert-vert.glsl
+++ b/src/shaders/lambert-vert.glsl
@@ -1,53 +1,359 @@
#version 300 es
+// ************************** 3D Simplex Noise*****************************
+// Usage: float snoise(vec3 v) returning (-1,1)
+// Description : Array and textureless GLSL 2D/3D/4D simplex
+// noise functions.
+// Author : Ian McEwan, Ashima Arts.
+// Maintainer : stegu
+// Lastmod : 20201014 (stegu)
+// License : Copyright (C) 2011 Ashima Arts. All rights reserved.
+// Distributed under the MIT License. See LICENSE file.
+// https://github.com/ashima/webgl-noise
+// https://github.com/stegu/webgl-noise
+//
-//This is a vertex shader. While it is called a "shader" due to outdated conventions, this file
-//is used to apply matrix transformations to the arrays of vertex data passed to it.
-//Since this code is run on your GPU, each vertex is transformed simultaneously.
-//If it were run on your CPU, each vertex would have to be processed in a FOR loop, one at a time.
-//This simultaneous transformation allows your program to run much faster, especially when rendering
-//geometry with millions of vertices.
+ vec3 mod289(vec3 x) {
+ return x - floor(x * (1.0 / 289.0)) * 289.0;
+ }
-uniform mat4 u_Model; // The matrix that defines the transformation of the
- // object we're rendering. In this assignment,
- // this will be the result of traversing your scene graph.
+ vec4 mod289(vec4 x) {
+ return x - floor(x * (1.0 / 289.0)) * 289.0;
+ }
-uniform mat4 u_ModelInvTr; // The inverse transpose of the model matrix.
- // This allows us to transform the object's normals properly
- // if the object has been non-uniformly scaled.
+ vec4 permute(vec4 x) {
+ return mod289(((x*34.0)+10.0)*x);
+ }
-uniform mat4 u_ViewProj; // The matrix that defines the camera's transformation.
- // We've written a static matrix for you to use for HW2,
- // but in HW3 you'll have to generate one yourself
+ vec4 taylorInvSqrt(vec4 r)
+ {
+ return 1.79284291400159 - 0.85373472095314 * r;
+ }
-in vec4 vs_Pos; // The array of vertex positions passed to the shader
+ float snoise(vec3 v)
+ {
+ const vec2 C = vec2(1.0/6.0, 1.0/3.0) ;
+ const vec4 D = vec4(0.0, 0.5, 1.0, 2.0);
-in vec4 vs_Nor; // The array of vertex normals passed to the shader
+ // First corner
+ vec3 i = floor(v + dot(v, C.yyy) );
+ vec3 x0 = v - i + dot(i, C.xxx) ;
-in vec4 vs_Col; // The array of vertex colors passed to the shader.
+ // Other corners
+ vec3 g = step(x0.yzx, x0.xyz);
+ vec3 l = 1.0 - g;
+ vec3 i1 = min( g.xyz, l.zxy );
+ vec3 i2 = max( g.xyz, l.zxy );
-out vec4 fs_Nor; // The array of normals that has been transformed by u_ModelInvTr. This is implicitly passed to the fragment shader.
-out vec4 fs_LightVec; // The direction in which our virtual light lies, relative to each vertex. This is implicitly passed to the fragment shader.
-out vec4 fs_Col; // The color of each vertex. This is implicitly passed to the fragment shader.
+ // x0 = x0 - 0.0 + 0.0 * C.xxx;
+ // x1 = x0 - i1 + 1.0 * C.xxx;
+ // x2 = x0 - i2 + 2.0 * C.xxx;
+ // x3 = x0 - 1.0 + 3.0 * C.xxx;
+ vec3 x1 = x0 - i1 + C.xxx;
+ vec3 x2 = x0 - i2 + C.yyy; // 2.0*C.x = 1/3 = C.y
+ vec3 x3 = x0 - D.yyy; // -1.0+3.0*C.x = -0.5 = -D.y
+
+ // Permutations
+ i = mod289(i);
+ vec4 p = permute( permute( permute(
+ i.z + vec4(0.0, i1.z, i2.z, 1.0 ))
+ + i.y + vec4(0.0, i1.y, i2.y, 1.0 ))
+ + i.x + vec4(0.0, i1.x, i2.x, 1.0 ));
+
+ // Gradients: 7x7 points over a square, mapped onto an octahedron.
+ // The ring size 17*17 = 289 is close to a multiple of 49 (49*6 = 294)
+ float n_ = 0.142857142857; // 1.0/7.0
+ vec3 ns = n_ * D.wyz - D.xzx;
+
+ vec4 j = p - 49.0 * floor(p * ns.z * ns.z); // mod(p,7*7)
+
+ vec4 x_ = floor(j * ns.z);
+ vec4 y_ = floor(j - 7.0 * x_ ); // mod(j,N)
+
+ vec4 x = x_ *ns.x + ns.yyyy;
+ vec4 y = y_ *ns.x + ns.yyyy;
+ vec4 h = 1.0 - abs(x) - abs(y);
+
+ vec4 b0 = vec4( x.xy, y.xy );
+ vec4 b1 = vec4( x.zw, y.zw );
+
+ //vec4 s0 = vec4(lessThan(b0,0.0))*2.0 - 1.0;
+ //vec4 s1 = vec4(lessThan(b1,0.0))*2.0 - 1.0;
+ vec4 s0 = floor(b0)*2.0 + 1.0;
+ vec4 s1 = floor(b1)*2.0 + 1.0;
+ vec4 sh = -step(h, vec4(0.0));
+
+ vec4 a0 = b0.xzyw + s0.xzyw*sh.xxyy ;
+ vec4 a1 = b1.xzyw + s1.xzyw*sh.zzww ;
+
+ vec3 p0 = vec3(a0.xy,h.x);
+ vec3 p1 = vec3(a0.zw,h.y);
+ vec3 p2 = vec3(a1.xy,h.z);
+ vec3 p3 = vec3(a1.zw,h.w);
+
+ //Normalise gradients
+ vec4 norm = taylorInvSqrt(vec4(dot(p0,p0), dot(p1,p1), dot(p2, p2), dot(p3,p3)));
+ p0 *= norm.x;
+ p1 *= norm.y;
+ p2 *= norm.z;
+ p3 *= norm.w;
+
+ // Mix final noise value
+ vec4 m = max(0.5 - vec4(dot(x0,x0), dot(x1,x1), dot(x2,x2), dot(x3,x3)), 0.0);
+ m = m * m;
+ return 105.0 * dot( m*m, vec4( dot(p0,x0), dot(p1,x1),
+ dot(p2,x2), dot(p3,x3) ) );
+ }
+
+// ******************************** noise end **********************************
+
+// ******************************** Math **********************************
+ // Smooth minimum of two values, controlled by smoothing factor k
+ // When k = 0, this behaves identically to min(a, b)
+ float smoothMin(float a, float b, float k) {
+ k = max(0.0, k);
+ // https://www.iquilezles.org/www/articles/smin/smin.htm
+ float h = max(0.0, min(1.0, (b - a + k) / (2.0 * k)));
+ return a * h + b * (1.0 - h) - k * h * (1.0 - h);
+ }
+
+ // Smooth maximum of two values, controlled by smoothing factor k
+ // When k = 0, this behaves identically to max(a, b)
+ float smoothMax(float a, float b, float k) {
+ k = min(0.0, -k);
+ float h = max(0.0, min(1.0, (b - a + k) / (2.0 * k)));
+ return a * h + b * (1.0 - h) - k * h * (1.0 - h);
+ }
+
+ float Blend(float startHeight, float blendDst, float height) {
+ return smoothstep(startHeight - blendDst / 2.0, startHeight + blendDst / 2.0, height);
+ }
+// ******************************** Math **********************************
+
+// *************** Uniform & input output
+ /** [0] octaves [1] persistance [2] lacunarity
+ [3] scale [4] multiplier [5] vertical shift [6] amplitude **/
+ uniform float noise_params_continent[7];
+ uniform float noise_params_ridge[9];
+ uniform float noise_params_mask[7];
+
+
+ // ocean
+ uniform float oceanDepthMultiplier;
+ uniform float oceanFloorDepth;
+ uniform float oceanFloorSmoothing;
+ uniform float mountainBlend;
+
+ // crater deformation stuff
+ #define MAX_CRATERS 10
+ uniform int crater_amount;
+ uniform vec3 crater[MAX_CRATERS]; // crater value
+
+
+ // custom uniform value
+ uniform float u_Time; // Current time
+ uniform vec4 u_Light_pos;
+
+ // shader stuff
+ uniform mat4 u_Model; // The matrix that defines the transformation of the
+ // object we're rendering. In this assignment,
+ // this will be the result of traversing your scene graph.
+
+ uniform mat4 u_ModelInvTr; // The inverse transpose of the model matrix.
+ // This allows us to transform the object's normals properly
+ // if the object has been non-uniformly scaled.
+
+ uniform mat4 u_ViewProj; // The matrix that defines the camera's transformation.
+ // We've written a static matrix for you to use for HW2,
+ // but in HW3 you'll have to generate one yourself
+
+ in vec4 vs_Pos; // The array of vertex positions passed to the shader
+
+ in vec4 vs_Nor; // The array of vertex normals passed to the shader
+
+ in vec4 vs_Col; // The array of vertex colors passed to the shader.
+
+ out vec4 fs_Nor; // The array of normals that has been transformed by u_ModelInvTr. This is implicitly passed to the fragment shader.
+ out vec4 fs_LightVec; // The direction in which our virtual light lies, relative to each vertex. This is implicitly passed to the fragment shader.
+ out vec4 fs_Col; // The color of each vertex. This is implicitly passed to the fragment shader.
+ out vec4 fs_Pos;
+ out float fs_elevation;
+ const vec4 lightPos = vec4(5, 5, 3, 1); //The position of our virtual light, which is used to compute the shading of
+ //the geometry in the fragment shader.
+
+
+
+/****************** terrain stuff *************/
+  /** ridge noise_params[9]
+  [0] octaves [1] persistance [2] lacunarity
+  [3] scale [4] multiplier [5] power [6] gain [7] vertical shift [8] amplitude **/
+ float ridgidNoise(vec3 pos, float noise_params[9]) {
+ // Extract parameters for readability
+
+ vec3 offset = vec3(cos(u_Time*0.001));
+ float octaves = noise_params[0];
+ float persistence = noise_params[1];
+ float lacunarity = noise_params[2];
+ float scale = noise_params[3];
+ float multiplier = noise_params[4];
+ float power = noise_params[5];
+ float gain = noise_params[6];
+ float verticalShift = noise_params[7];
+ float amplitude = noise_params[8];
+
+ // Sum up noise layers
+ float noiseSum = 0.0;
+ float frequency = scale;
+ float ridgeWeight = 1.0;
+
+ for (float i = 0.0; i < octaves; i ++) {
+ float noiseVal = 1.0 - abs(snoise(pos * frequency + offset));
+ noiseVal = pow(abs(noiseVal), power);
+ noiseVal *= ridgeWeight;
+ ridgeWeight = clamp(noiseVal * gain,0.0,1.0);
+ noiseSum += noiseVal * amplitude;
+ amplitude *= persistence;
+ frequency *= lacunarity;
+ }
+ return noiseSum * multiplier + verticalShift;
+ }
+
+  /** ridge noise_params[9]
+  [0] octaves [1] persistance [2] lacunarity
+  [3] scale [4] multiplier [5] power [6] gain [7] vertical shift [8] amplitude **/
+ float smoothedRidgidNoise(vec3 pos, float noise_params[9]) {
+ vec3 sphereNormal = normalize(pos);
+ vec3 axisA = cross(sphereNormal, vec3(0.0,1.0,0.0));
+ vec3 axisB = cross(sphereNormal, axisA);
+
+ float offsetDst = 8.0*0.05;
+ float sample0 = ridgidNoise(pos, noise_params);
+ float sample1 = ridgidNoise(pos - axisA * offsetDst, noise_params);
+ float sample2 = ridgidNoise(pos + axisA * offsetDst, noise_params);
+ float sample3 = ridgidNoise(pos - axisB * offsetDst, noise_params);
+ float sample4 = ridgidNoise(pos + axisB * offsetDst, noise_params);
+ return (sample0 + sample1 + sample2 + sample3 + sample4) / 5.0;
+ }
+
+ /** fbm use noise_params[7]
+ [0] octaves [1] persistance [2] lacunarity
+ [3] scale [4] multiplier [5] vertical shift [6] amplitude **/
+ float simplexNoise_FBM(vec3 p, float noise_params[7]){
+
+ //vec3 offset = vec3(sin(u_Time*0.001));
+ vec3 offset = vec3(sin(u_Time*0.001));
+ float octaves = floor(noise_params[0]);
+ float persistence = noise_params[1];
+ float lacunarity = noise_params[2];
+ float scale = noise_params[3];
+ float multiplier = noise_params[4];
+ float verticalShift = noise_params[5];
+ float amplitude = noise_params[6];
+
+ float noise = 0.0;
+ float frequency = scale;
+ for (float i = 0.0; i < octaves; i ++) {
+ noise += snoise(p * frequency + offset) * amplitude;
+ amplitude *= persistence;
+ frequency *= lacunarity;
+ }
+ return noise*multiplier + verticalShift;
+ }
+
+ float terrain_generate(vec3 p){
+
+ float continent_noise = simplexNoise_FBM(p,noise_params_continent);
+ continent_noise = smoothMax(continent_noise,-oceanFloorDepth,oceanFloorSmoothing);
+ if(continent_noise < 0.0){ // when height is less than 0, the terrain is ocean type
+ continent_noise *= oceanDepthMultiplier;
+ }
+ float mask = Blend(0.0,mountainBlend,simplexNoise_FBM(p,noise_params_mask));
+ float ridgeNoise = smoothedRidgidNoise(p,noise_params_ridge);
+ return continent_noise * 0.01 + ridgeNoise * 0.01 * mask;
+ }
+/****************** terrain stuff end*************/
+/****************** re-cal normal*************/
+ vec3 to_polar(vec4 p) {
+ return vec3(sqrt(p.x * p.x + p.y * p.y + p.z * p.z),
+ atan(p.y / p.x),
+ acos(p.z / sqrt(p.x * p.x + p.y * p.y + p.z * p.z)));
+ }
+
+ vec4 toWorld(vec4 nor) {
+ vec3 normal = normalize(vec3(vs_Nor));
+ vec3 tangent = normalize(cross(vec3(0.0, 1.0, 0.0), normal));
+ vec3 bitangent = normalize(cross(normal, tangent));
+ mat4 transform;
+ transform[0] = vec4(tangent, 0.0);
+ transform[1] = vec4(bitangent, 0.0);
+ transform[2] = vec4(normal, 0.0);
+ transform[3] = vec4(0.0, 0.0, 0.0, 1.0);
+ return vec4(normalize(vec3(transform * nor)), 0.0);
+ }
+
+ vec4 to_cart(float r, float theta, float phi) {
+ return vec4(r * sin(phi) * cos(theta),
+ r * sin(phi) * sin(theta),
+ r * cos(phi), 1.);
+ }
+
+ vec4 cal_normal(vec4 p) {
+ vec3 pp = to_polar(p);
+ float alpha = .0001;
+ float n1 = terrain_generate(vec3(to_cart(pp.x, pp.y + alpha, pp.z)));
+ float n2 = terrain_generate(vec3(to_cart(pp.x, pp.y - alpha, pp.z)));
+ float n3 = terrain_generate(vec3(to_cart(pp.x, pp.y, pp.z + alpha)));
+    float n4 = terrain_generate(vec3(to_cart(pp.x, pp.y, pp.z - alpha)));
+ float multiplier = 800.0;
+ float xDiff = multiplier * (n1 - n2) ;
+ float yDiff = multiplier * (n3 - n4) ;
+ p.z = sqrt(1. - xDiff * xDiff - yDiff * yDiff);
+
+ return toWorld(normalize(vec4(vec3(xDiff, yDiff, p.z), 0.0)));
+ }
+
+
+
+
+// calculate crater influence
+float crater_height(vec3 p){
+
+ float height = 0.0;
+ for(int i = 0;i < crater_amount;i++){
+    // TODO: accumulate the influence of crater[i] here; crater deformation is not implemented yet
+ }
+ return height;
+}
-const vec4 lightPos = vec4(5, 5, 3, 1); //The position of our virtual light, which is used to compute the shading of
- //the geometry in the fragment shader.
void main()
{
fs_Col = vs_Col; // Pass the vertex colors to the fragment shader for interpolation
mat3 invTranspose = mat3(u_ModelInvTr);
- fs_Nor = vec4(invTranspose * vec3(vs_Nor), 0); // Pass the vertex normals to the fragment shader for interpolation.
+ fs_Nor = vec4(invTranspose * vec3(vs_Nor), 0.0); // Pass the vertex normals to the fragment shader for interpolation.
// Transform the geometry's normals by the inverse transpose of the
// model matrix. This is necessary to ensure the normals remain
// perpendicular to the surface after the surface is transformed by
// the model matrix.
+    // Displace each vertex along its normal by the generated terrain height
+ vec3 noise_input = vec3(vs_Pos);
+ vec4 pos = vs_Pos;
+ // generate terrain
+ float noise = terrain_generate(noise_input);
+ fs_elevation = noise;
+ pos = pos + noise * fs_Nor;
- vec4 modelposition = u_Model * vs_Pos; // Temporarily store the transformed vertex positions for use below
+
+ vec4 modelposition = u_Model * pos; // Temporarily store the transformed vertex positions for use below
- fs_LightVec = lightPos - modelposition; // Compute the direction in which the light source lies
+
+ fs_LightVec = u_Light_pos - modelposition; // Compute the direction in which the light source lies
+ fs_Pos = modelposition;
+
+  // Recompute the normal from the displaced terrain (overrides the model-space normal computed above)
+ fs_Nor = cal_normal(vs_Pos);
gl_Position = u_ViewProj * modelposition;// gl_Position is a built-in variable of OpenGL which is
// used to render the final positions of the geometry's vertices
}
diff --git a/text.jpg b/text.jpg
new file mode 100644
index 0000000..a03a9e9
Binary files /dev/null and b/text.jpg differ