diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..46096f2
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,3 @@
+node_modules
+package-lock.json
+.DS_Store
diff --git a/.vscode/settings.json b/.vscode/settings.json
new file mode 100644
index 0000000..6f3a291
--- /dev/null
+++ b/.vscode/settings.json
@@ -0,0 +1,3 @@
+{
+  "liveServer.settings.port": 5501
+}
\ No newline at end of file
diff --git a/README.md b/README.md
index cc17a05..3156ca7 100644
--- a/README.md
+++ b/README.md
@@ -1,61 +1,23 @@
-Assignment 4 - Creative Coding: Interactive Multimedia Experiences
-===
+## a4-sameer-desai
-Due: October 2nd, by 11:59 AM.
+Glitch url: couldn't get it working
+Song used: https://soundcloud.com/knowwless/perfect-pair-ft-jel
-For this assignment we will focus on client-side development using popular audio/graphics/visualization technologies. The goal of this assignment is to refine our JavaScript knowledge while exploring the multimedia capabilities of the browser.
+![](https://github.com/SmeeBoi/a4-sameer-desai/blob/main/assets/a4visual.gif)
-[WebAudio / Canvas Tutorial](https://github.com/cs-4241-2023/cs4241-2023.github.io/blob/main/using.webaudio_and_canvas.md)
-[SVG + D3 tutorial](https://github.com/cs-4241-2023/cs-4241-2023.github.io/blob/main/using.svg_and_d3.md)
+The visualizer is four spheres moving in a circle that scale with the volume of the music. Click the start button to begin playback, then click and drag with standard orbit controls to change the view; scrolling zooms in and out of the scene. Press the spacebar to play/pause the music.
-Baseline Requirements
---
+Goal: Create a 3D scene with multiple shapes that scale with the volume of the music: the shapes grow when the music is louder and shrink when it is softer.
-Your application is required to implement the following functionalities:
+Challenges: Yes
-- A server created using Express. This server can be as simple as needed.
-- A client-side interactive experience using at least one of the following web technologies frameworks. - - [Three.js](https://threejs.org/): A library for 3D graphics / VR experiences - - [D3.js](https://d3js.org): A library that is primarily used for interactive data visualizations - - [Canvas](https://developer.mozilla.org/en-US/docs/Web/API/Canvas_API): A 2D raster drawing API included in all modern browsers - - [SVG](https://developer.mozilla.org/en-US/docs/Web/API/Canvas_API): A 2D vector drawing framework that enables shapes to be defined via XML. - - [Web Audio API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API): An API for audio synthesis, analysis, processing, and file playback. -- A user interface for interaction with your project, which must expose at least four parameters for user control. [tweakpane](https://cocopon.github.io/tweakpane/) is highly recommended for this, but you can also use regular HTML `` tags (the `range` type is useful to create sliders). You might also explore interaction by tracking mouse movement via the `window.onmousemove` event handler in tandem with the `event.clientX` and `event.clientY` properties. Consider using the [Pointer Events API](https://developer.mozilla.org/en-US/docs/Web/API/Pointer_events) to ensure that that both mouse and touch events will both be supported in your app. -- Your application should display basic documentation for the user interface when the application first loads. +Additional notes: +I referenced the following tutorials: +1) https://www.youtube.com/watch?v=6_3YjEc4q1Y&t=2159s +2) https://threejs-journey.com/lessons/local-server#more-about-the-vite-template -The interactive experience should possess a reasonable level of complexity. Some examples: -### Three.js -- A generative algorithm creates simple agents that move through a virtual world. Your interface controls the behavior / appearance of these agents. -- A simple 3D game... 
you really want this to be a simple as possible or it will be outside the scope of this assignment. -- An 3D audio visualization of a song of your choosing. User interaction should control aspects of the visualization. -### Canvas -- Implement a generative algorithm such as [Conway's Game of Life](https://bitstorm.org/gameoflife/) (or 1D cellular automata) and provide interactive controls. Note that the Game of Life has been created by 100s of people using ``; we'll be checking to ensure that your implementation is not a copy of these. -- Design a 2D audio visualizer of a song of your choosing. User interaction should control visual aspects of the experience. -### Web Audio API -- Create a screen-based musical instrument using the Web Audio API. You can use projects such as [Interface.js](http://charlie-roberts.com/interface/) or [Nexus UI](https://nexus-js.github.io/ui/api/#Piano) to provide common musical interface elements, or use dat.GUI in combination with mouse/touch events (use the Pointer Events API). Your GUI should enable users to control aspects of sound synthesis. If you want to use higher-level instruments instead of the raw WebAudio API sounds, consider trying the instruments provided by [Tone.js]() or [Gibber](https://github.com/charlieroberts/gibber.audio.lib). -### D3.js -- Create visualizations using the datasets found at [Awesome JSON Datasets](https://github.com/jdorfman/Awesome-JSON-Datasets). Experiment with providing different visualizations of the same data set, and providing users interactive control over visualization parameters and/or data filtering. Alternatively, create a single visualization with using one of the more complicated techniques shown at [d3js.org](d3js.org) and provide meaningful points of interaction for users. -Deliverables ---- -Do the following to complete this assignment: -1. Implement your project with the above requirements. -3. 
Test your project to make sure that when someone goes to your main page on Glitch/Heroku/etc., it displays correctly. -4. Ensure that your project has the proper naming scheme `a4-firstname-lastname` so we can find it. -5. Fork this repository and modify the README to the specifications below. *NOTE: If you don't use Glitch for hosting (where we can see the files) then you must include all project files that you author in your repo for this assignment*. -6. Create and submit a Pull Request to the original repo. Name the pull request using the following template: `a4-firstname-lastname`. -Sample Readme (delete the above when you're ready to submit, and modify the below so with your links and descriptions) ---- -## Your Web Application Title - -your hosting link e.g. http://a4-charlieroberts.glitch.me - -Include a very brief summary of your project here. Images are encouraged when needed, along with concise, high-level text. Be sure to include: - -- the goal of the application -- challenges you faced in realizing the application -- the instructions you present in the website should be clear enough to use the application, but if you feel any need to provide additional instructions please do so here. 
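The README's stated goal (shapes grow when the music is louder, shrink when it is softer) boils down to a linear mapping from an average frequency reading to a scale factor. A minimal sketch of that mapping, assuming a 0–255 analyser value as used later in `scene.js` and the same 0.5–5.0 scale bounds that appear there (`volumeToScale` is a hypothetical helper name, not part of the project code):

```javascript
// Map an average frequency value (0-255) to a mesh scale factor.
// volumeToScale is an illustrative helper; the project inlines this math in draw().
function volumeToScale(averageFrequency, minScale = 0.5, maxScale = 5.0) {
  const normalized = averageFrequency / 256; // roughly 0..1
  return minScale + normalized * (maxScale - minScale);
}

console.log(volumeToScale(0));   // 0.5 (silence -> minimum scale)
console.log(volumeToScale(128)); // 2.75 (mid volume -> midway between bounds)
```

In the scene itself this factor is applied uniformly each animation frame via `mesh.scale.set(scale, scale, scale)`.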
diff --git a/assets/a4visual.gif b/assets/a4visual.gif
new file mode 100644
index 0000000..b286c2e
Binary files /dev/null and b/assets/a4visual.gif differ
diff --git a/package.json b/package.json
new file mode 100644
index 0000000..18186b5
--- /dev/null
+++ b/package.json
@@ -0,0 +1,19 @@
+{
+  "name": "threejs-journey-exercise",
+  "private": true,
+  "version": "0.0.0",
+  "type": "module",
+  "scripts": {
+    "dev": "vite",
+    "build": "vite build"
+  },
+  "devDependencies": {
+    "vite": "^4.3.9"
+  },
+  "dependencies": {
+    "gsap": "^3.12.2",
+    "stats-js": "^1.0.1",
+    "three": "^0.153.0",
+    "tweakpane": "^4.0.1"
+  }
+}
diff --git a/src/index.html b/src/index.html
new file mode 100644
index 0000000..71886d8
--- /dev/null
+++ b/src/index.html
@@ -0,0 +1,26 @@
+<!DOCTYPE html>
+<html lang="en">
+  <head>
+    <meta charset="UTF-8" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <title>04 - Local Server</title>
+    <link rel="stylesheet" href="./style.css" />
+  </head>
+  <body>
+    <canvas class="scene"></canvas>
+    <div id="container">
+      <h1 class="vert-move">Audio Visualizer</h1>
+      <button id="start">Start</button>
+    </div>
+    <script type="module" src="./js/index.js"></script>
+  </body>
+</html>
\ No newline at end of file
diff --git a/src/js/LoaderManager.js b/src/js/LoaderManager.js
new file mode 100644
index 0000000..bf017ba
--- /dev/null
+++ b/src/js/LoaderManager.js
@@ -0,0 +1,197 @@
+import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js'
+import { OBJLoader } from 'three/examples/jsm/loaders/OBJLoader.js'
+import { DRACOLoader } from 'three/examples/jsm/loaders/DRACOLoader.js'
+import { FontLoader } from 'three/examples/jsm/loaders/FontLoader.js'
+import { TextureLoader } from 'three'
+import { AudioLoader } from 'three'
+
+class LoaderManager {
+  #assets
+  #textureLoader = new TextureLoader()
+  #GLTFLoader = new GLTFLoader()
+  #OBJLoader = new OBJLoader()
+  #DRACOLoader = new DRACOLoader()
+  #FontLoader = new FontLoader()
+  #AudioLoader = new AudioLoader()
+
+  constructor() {
+    this.#assets = {} // Dictionary of assets of different types (gltf, texture, img, font, audio); feel free to make an Enum if using TypeScript
+  }
+
+  get assets() {
+    return this.#assets
+  }
+
+  set assets(value) {
+    this.#assets = value
+  }
+
+  /**
+   * Public methods
+   */
+
+  get(name) {
+    return this.#assets[name]
+  }
+
+  load = (data) =>
+    new Promise((resolve) => {
+      const promises = []
+      for (let i = 0; i < data.length; i++) {
+        const { name, gltf, texture, img, font, obj, audio } = data[i]
+
+        if (!this.#assets[name]) {
+          this.#assets[name] = {}
+        }
+
+        if (gltf) {
+          promises.push(this.loadGLTF(gltf, name))
+        }
+
+        if (texture) {
+          promises.push(this.loadTexture(texture, name))
+        }
+
+        if (img) {
+          promises.push(this.loadImage(img, name))
+        }
+
+        if (font) {
+          promises.push(this.loadFont(font, name))
+        }
+
+        if (obj) {
+          promises.push(this.loadObj(obj, name))
+        }
+
+        if (audio) {
+          promises.push(this.loadAudio(audio, name))
+        }
+      }
+
+      Promise.all(promises).then(() => resolve())
+    })
+
+  loadGLTF(url, name) {
+    return new Promise((resolve) => {
+      this.#DRACOLoader.setDecoderPath('https://www.gstatic.com/draco/v1/decoders/')
+
this.#GLTFLoader.setDRACOLoader(this.#DRACOLoader) + + this.#GLTFLoader.load( + url, + (result) => { + this.#assets[name].gltf = result + resolve(result) + }, + undefined, + (e) => { + console.log(e) + } + ) + }) + } + + loadTexture(url, name) { + if (!this.#assets[name]) { + this.#assets[name] = {} + } + return new Promise((resolve) => { + this.#textureLoader.load(url, (result) => { + this.#assets[name].texture = result + resolve(result) + }) + }) + } + + loadImage(url, name) { + return new Promise((resolve) => { + const image = new Image() + + image.onload = () => { + this.#assets[name].img = image + resolve(image) + } + + image.src = url + }) + } + + loadFont(url, name) { + // you can convert font to typeface.json using https://gero3.github.io/facetype.js/ + return new Promise((resolve) => { + this.#FontLoader.load( + url, + + // onLoad callback + (font) => { + this.#assets[name].font = font + resolve(font) + }, + + // onProgress callback + () => + // xhr + { + // console.log((xhr.loaded / xhr.total) * 100 + '% loaded') + }, + + // onError callback + (err) => { + console.log('An error happened', err) + } + ) + }) + } + + // https://threejs.org/docs/#examples/en/loaders/OBJLoader + loadObj(url, name) { + return new Promise((resolve) => { + // load a resource + this.#OBJLoader.load( + // resource URL + url, + // called when resource is loaded + (object) => { + this.#assets[name].obj = object + resolve(object) + }, + // onProgress callback + () => + // xhr + { + // console.log((xhr.loaded / xhr.total) * 100 + '% loaded') + }, + // called when loading has errors + (err) => { + console.log('An error happened', err) + } + ) + }) + } + + //loadAudio + loadAudio(url, name) { + return new Promise((resolve) => { + this.#AudioLoader.load( + url, + (audioBuffer) => { + this.#assets[name].buffer = audioBuffer + resolve(audioBuffer) + }, + (xhr) => { + console.log((xhr.loaded / xhr.total) * 100 + '% loaded') + }, + (err) => { + console.log('An error happened', err) + } + ) + 
}) + } + + + + + +} + +export default new LoaderManager() diff --git a/src/js/index.js b/src/js/index.js new file mode 100644 index 0000000..831c84c --- /dev/null +++ b/src/js/index.js @@ -0,0 +1,7 @@ +// Test import of a JavaScript module +import Scene from './scene' + +(() => { + // scene + new Scene() +})() diff --git a/src/js/scene.js b/src/js/scene.js new file mode 100644 index 0000000..6a81e59 --- /dev/null +++ b/src/js/scene.js @@ -0,0 +1,335 @@ +import * as THREE from 'three' +import { Pane } from 'tweakpane' + +import { + Color, + WebGLRenderer, + Scene, + PerspectiveCamera, + Mesh, + SphereGeometry, + MeshMatcapMaterial, + AxesHelper, + Object3D, + MeshBasicMaterial, + Vector3, + TorusGeometry, + CylinderGeometry, + MathUtils, + AudioListener, + AudioLoader, + AudioAnalyser, + Audio, + +} from 'three' +import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js' +import Stats from 'stats-js' +import LoaderManager from './LoaderManager' +import Shape from './shape' +import gsap from 'gsap' + +export default class MainScene { + #canvas + #renderer + #scene + #camera + #controls + #stats + #width + #height + #mesh + #containerMesh = new Object3D() + #shapes = [] + #mouse = { + x: 0, + y: 0, + } + #sound + #analyser + + constructor() { + this.#canvas = document.querySelector('.scene') + + this.init() + } + + init = async () => { + // Preload assets before initiating the scene + const assets = [ + { + // Matcap texture simulates light on the surface of an object + // No lighting so less calculations + name: 'matcap', + texture: './matcap.png', + }, + { + // music + name: 'music', + audio: './perfect_pair.mp3', + }, + + ] + + await LoaderManager.load(assets) + + // this.setStats() + this.setScene() + this.setRender() + this.setCamera() + this.setControls() + // this.setAxesHelper() + + this.setShapes() + + this.handleResize() + + // start RAF + this.events() + } + + /** + * Our Webgl renderer, an object that will draw everything in our canvas + 
* https://threejs.org/docs/?q=rend#api/en/renderers/WebGLRenderer
+   */
+  setRender() {
+    this.#renderer = new WebGLRenderer({
+      canvas: this.#canvas,
+      antialias: true,
+    })
+  }
+
+  /**
+   * This is our scene; every object we create will be added to it
+   * https://threejs.org/docs/?q=scene#api/en/scenes/Scene
+   */
+  setScene() {
+    this.#scene = new Scene()
+    this.#scene.background = new Color(0xf8c291)
+  }
+
+  /**
+   * Our perspective camera, the point of view we'll have
+   * of our scene.
+   * A perspective camera mimics the human eye, so something far away
+   * looks smaller than something close
+   * https://threejs.org/docs/?q=pers#api/en/cameras/PerspectiveCamera
+   */
+  setCamera() {
+    const aspectRatio = this.#width / this.#height
+    const fieldOfView = 60
+    const nearPlane = 0.1
+    const farPlane = 10000
+
+    this.#camera = new PerspectiveCamera(fieldOfView, aspectRatio, nearPlane, farPlane)
+    this.#camera.position.y = 0
+    this.#camera.position.x = 0
+    this.#camera.position.z = 10
+    this.#camera.lookAt(0, 0, 0)
+
+    this.#scene.add(this.#camera)
+  }
+
+  /**
+   * Three.js OrbitControls let the user navigate the scene
+   * https://threejs.org/docs/?q=orbi#examples/en/controls/OrbitControls
+   */
+  setControls() {
+    this.#controls = new OrbitControls(this.#camera, this.#renderer.domElement)
+    this.#controls.enableDamping = true
+    // this.#controls.dampingFactor = 0.04
+  }
+
+  /**
+   * Axes Helper
+   * https://threejs.org/docs/?q=Axesh#api/en/helpers/AxesHelper
+   */
+  setAxesHelper() {
+    const axesHelper = new AxesHelper(3)
+    this.#scene.add(axesHelper)
+  }
+
+  /**
+   * Create Shapes
+   * https://threejs.org/docs/?q=box#api/en/geometries/SphereGeometry
+   * with a Matcap material
+   * https://threejs.org/docs/?q=mesh#api/en/materials/MeshMatcapMaterial
+   */
+  setShapes() {
+    // Adding all shapes into a container mesh
+    const sphereGeo = new SphereGeometry(0.5, 32, 32)
+    const material = new MeshMatcapMaterial({ matcap: LoaderManager.assets['matcap'].texture })
+
+    // Sphere 1
+    const sphere1 =
new Shape({
+      geometry: sphereGeo,
+      material,
+      parentMesh: this.#containerMesh,
+      position: new Vector3(0, 0, 0),
+      index: 0,
+    });
+
+    // Sphere 2
+    const sphere2 = new Shape({
+      geometry: sphereGeo,
+      material,
+      parentMesh: this.#containerMesh,
+      position: new Vector3(0, 0, 0),
+      angleOffset: Math.PI / 2,
+      index: 1,
+    });
+
+    // Sphere 3
+    const sphere3 = new Shape({
+      geometry: sphereGeo,
+      material,
+      parentMesh: this.#containerMesh,
+      position: new Vector3(0, 0, 0),
+      angleOffset: Math.PI,
+      index: 2,
+    });
+
+    // Sphere 4
+    const sphere4 = new Shape({
+      geometry: sphereGeo,
+      material,
+      parentMesh: this.#containerMesh,
+      position: new Vector3(0, 0, 0),
+      angleOffset: Math.PI * 1.5,
+      index: 3,
+    });
+
+    this.#shapes = [sphere1, sphere2, sphere3, sphere4]
+
+    this.#scene.add(this.#containerMesh)
+  }
+
+  /**
+   * Build stats to display fps
+   */
+  setStats() {
+    this.#stats = new Stats()
+    this.#stats.showPanel(0) // fps panel
+    document.body.appendChild(this.#stats.dom)
+  }
+
+  /**
+   * List of events
+   */
+  events() {
+    window.addEventListener('resize', this.handleResize, { passive: true })
+    // start button click event; `once: true` removes the listener after the
+    // first click (removeEventListener cannot remove an anonymous handler)
+    document.querySelector('#start').addEventListener('click', (e) => {
+      this.start(e);
+    }, { once: true });
+
+    // spacebar toggles play/pause (guard against pressing space before start)
+    window.addEventListener('keydown', (e) => {
+      if (e.code === 'Space' && this.#sound) {
+        this.#sound.isPlaying ?
this.#sound.pause() : this.#sound.play();
+      }
+    })
+
+    this.draw(0)
+  }
+
+  // EVENTS
+
+  /**
+   * Request animation frame function
+   * This function is called roughly 60 times per second when performance allows
+   * Everything that happens in the scene is drawn here
+   * @param {Number} time elapsed time in ms since the page started
+   */
+  draw = (time) => {
+    // this.#stats.begin()
+
+    if (this.#controls) this.#controls.update() // for damping
+    this.#renderer.render(this.#scene, this.#camera)
+
+    // Scale shapes based on audio
+    if (this.#analyser) {
+      // Get the average volume
+      const averageVolume = this.#analyser.getAverageFrequency() / 256; // this gives a value between 0 and 1
+
+      // Map the average volume to the scale range [0.5, 5]
+      const minScale = 0.5;
+      const maxScale = 5.0;
+      const scale = minScale + averageVolume * (maxScale - minScale);
+
+      // Apply the scale to each shape
+      this.#shapes.forEach((shape) => {
+        shape.scale(scale, scale, scale);
+      });
+    }
+
+    // Update shapes with a forEach loop
+    this.#shapes.forEach((shape) => {
+      shape.render(time) // Pass in time (increasing ms value since page start)
+      // to the render function to animate the position of the shape
+    })
+
+    // Rotate all shapes in the container mesh by degrees based on mouse position
+    this.#containerMesh.rotation.y = MathUtils.degToRad(this.#mouse.x * 10)
+    this.#containerMesh.rotation.x = MathUtils.degToRad(this.#mouse.y * 10)
+
+    // this.#stats.end()
+    this.raf = window.requestAnimationFrame(this.draw)
+  }
+
+  /**
+   * On resize, adapt the camera and the renderer
+   * to the new window width and height
+   */
+  handleResize = () => {
+    this.#width = window.innerWidth
+    this.#height = window.innerHeight
+
+    // Update camera
+    this.#camera.aspect = this.#width / this.#height
+    this.#camera.updateProjectionMatrix()
+
+    const DPR = window.devicePixelRatio ?
window.devicePixelRatio : 1
+
+    this.#renderer.setPixelRatio(DPR)
+    this.#renderer.setSize(this.#width, this.#height)
+  }
+
+  // When the start button is clicked, hide all text and zoom into the scene
+  start = () => {
+    // fade the text out, then remove it
+    gsap.to('#container', {
+      opacity: 0,
+      duration: 1,
+      delay: 0.2,
+      ease: 'expo.out',
+      onComplete: () => { // remove the overlay once the fade finishes
+        const containerElem = document.querySelector('#container');
+        if (containerElem) {
+          containerElem.remove();
+        }
+      }
+    })
+
+    // zoom in to the scene
+    gsap.to(this.#camera.position, { x: 0, y: 7, z: 7, duration: 2.0, delay: 0.5, ease: 'expo.out',
+      onComplete: () => { // start the audio once the zoom finishes
+        // Create the listener and add it to the camera
+        const listener = new AudioListener();
+        this.#camera.add(listener);
+        this.#sound = new Audio(listener);
+        this.#sound.setBuffer(LoaderManager.assets['music'].buffer);
+        this.#sound.play();
+        this.#analyser = new AudioAnalyser(this.#sound, 256);
+      }})
+
+    // rotate shapes
+    gsap.to(this.#containerMesh.rotation, { y: Math.PI * 2, duration: 5, delay: 0.5, ease: 'expo.out' })
+  }
+}
\ No newline at end of file
diff --git a/src/js/shape.js b/src/js/shape.js
new file mode 100644
index 0000000..a72bb15
--- /dev/null
+++ b/src/js/shape.js
@@ -0,0 +1,49 @@
+import { Mesh } from 'three'
+import gsap from 'gsap'
+
+export default class Shape {
+  constructor({ geometry, material, parentMesh, position, speed = 0.001, offsetspeed = 0, angleOffset = 0, index }) {
+    this.mesh = new Mesh(geometry, material)
+    this.mesh.position.copy(position) // Set position of the mesh
+
+    parentMesh.add(this.mesh)
+
+    this.speed = speed
+    this.offsetspeed = offsetspeed
+    this.angleOffset = angleOffset;
+    this.initPosition = position
+
+    // animate the mesh scaling up from zero, staggered by index
+    gsap.fromTo(
+      this.mesh.scale,
+      { x: 0, y: 0, z: 0 },
+      { x: 1, y: 1, z: 1, duration: 2, delay: 0.3 + index * 0.1, ease: 'expo.out' }
+    )
+  }
+
+  // Scale method
+  scale(scaleX, scaleY,
scaleZ);
+  }
+
+  // Update the position of the mesh
+  render = (time) => {
+    const angle = time * this.speed + this.offsetspeed + this.angleOffset;
+
+    // make the shape move in a circular path in the x-z plane
+    this.mesh.position.x = Math.sin(angle) * 3 + this.initPosition.x;
+    this.mesh.position.z = Math.cos(angle) * 3 + this.initPosition.z;
+  }
+}
diff --git a/src/matcap.png b/src/matcap.png
new file mode 100644
index 0000000..ad7e8f9
Binary files /dev/null and b/src/matcap.png differ
diff --git a/src/perfect_pair.mp3 b/src/perfect_pair.mp3
new file mode 100644
index 0000000..8b39fc6
Binary files /dev/null and b/src/perfect_pair.mp3 differ
diff --git a/src/style.css b/src/style.css
new file mode 100644
index 0000000..da1abc3
--- /dev/null
+++ b/src/style.css
@@ -0,0 +1,74 @@
+body {
+  border: 0;
+  margin: 0;
+  padding: 0;
+}
+
+#container {
+  /* duplicate font-family declarations merged: 'Jost' wins, 'Josefin Sans' is the fallback */
+  font-family: 'Jost', 'Josefin Sans', sans-serif;
+  color: white;
+  position: absolute;
+  top: 50%;
+  left: 50%;
+  transform: translate(-50%, -50%);
+  min-height: 100vh;
+  min-width: 100vw;
+  display: flex;
+  flex-direction: column;
+  justify-content: center;
+  align-items: center;
+}
+
+canvas.scene {
+  position: absolute;
+  top: 0;
+  left: 0;
+  width: 100%;
+  height: 100%;
+  z-index: 1;
+}
+
+h1 {
+  font-size: 4rem;
+  margin-bottom: 0px;
+}
+
+h1.vert-move {
+  -webkit-animation: mover 1.0s infinite alternate;
+  animation: mover 0.8s infinite alternate;
+}
+
+@keyframes mover {
+  0% { transform: translateY(0); }
+  100% { transform: translateY(-8px); }
+}
+
+button {
+  font-family: 'Jost', 'Josefin Sans', sans-serif;
+  color: white;
+  background-color: transparent;
+  margin-top: 0px;
+  align-items: center;
+  border-radius: 25px;
+  cursor: pointer;
+  font-size: 1.5rem;
+  padding: 0.3rem 1.2rem;
+  text-align: center;
+  text-decoration: none;
+  transition: box-shadow .3s, -webkit-box-shadow .25s;
+  white-space: nowrap;
+  border: 0;
+  user-select: none;
+  -webkit-user-select: none;
+  touch-action: manipulation;
+}
+
+button:hover {
+  box-shadow: white 0 0 0 3px, transparent 0 0 0 0;
+}
diff --git a/vite.config.js b/vite.config.js
new file mode 100644
index 0000000..12cd4e9
--- /dev/null
+++ b/vite.config.js
@@ -0,0 +1,18 @@
+const isCodeSandbox = 'SANDBOX_URL' in process.env || 'CODESANDBOX_HOST' in process.env
+
+export default {
+  root: 'src/',
+  publicDir: '../static/',
+  base: './',
+  server:
+  {
+    host: true,
+    open: !isCodeSandbox // Open if it's not a CodeSandbox
+  },
+  build:
+  {
+    outDir: '../dist',
+    emptyOutDir: true,
+    sourcemap: true
+  }
+}
\ No newline at end of file
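The circular motion implemented in `src/js/shape.js` reduces to parametric circle math: each sphere's angle grows with time plus a per-sphere offset, and the position is the radius times sin/cos in the x-z plane. A standalone sketch of that calculation, using the same defaults as the `Shape` class (radius 3, speed 0.001); `circlePosition` is an illustrative name, not part of the repo:

```javascript
// Position on a circle of radius 3 in the x-z plane, as in shape.js render():
// angle = time * speed + angleOffset, position = (sin(angle), cos(angle)) * radius.
function circlePosition(timeMs, { speed = 0.001, angleOffset = 0, radius = 3 } = {}) {
  const angle = timeMs * speed + angleOffset;
  return { x: Math.sin(angle) * radius, z: Math.cos(angle) * radius };
}

// At t = 0 a sphere with no offset sits at (0, 3) in the x-z plane; a sphere
// offset by Math.PI sits on the opposite side at (0, -3). Because the four
// spheres use offsets 0, PI/2, PI, and PI * 1.5, they stay evenly spaced
// around the circle as they orbit.
console.log(circlePosition(0)); // { x: 0, z: 3 }
```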