WebGL – Rotation

In this chapter, we will take an example to demonstrate how to rotate a triangle using WebGL.

Example – Rotate a Triangle

The following program shows how to rotate a triangle using WebGL.

```html
<!doctype html>
<html>
   <body>
      <canvas width="400" height="400" id="my_Canvas"></canvas>

      <script>
         /*================= Creating a canvas =========================*/
         var canvas = document.getElementById("my_Canvas");
         var gl = canvas.getContext("experimental-webgl");

         /*=========== Defining and storing the geometry ==============*/
         var vertices = [
            -1,-1,-1,
             1,-1,-1,
             1, 1,-1
         ];

         var colors = [1,1,1, 1,1,1, 1,1,1];
         var indices = [0,1,2];

         // Create and store data into vertex buffer
         var vertex_buffer = gl.createBuffer();
         gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);
         gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);

         // Create and store data into color buffer
         var color_buffer = gl.createBuffer();
         gl.bindBuffer(gl.ARRAY_BUFFER, color_buffer);
         gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(colors), gl.STATIC_DRAW);

         // Create and store data into index buffer
         var index_buffer = gl.createBuffer();
         gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, index_buffer);
         gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(indices), gl.STATIC_DRAW);

         /*========================== Shaders =========================*/
         var vertCode =
            "attribute vec3 position;" +
            "uniform mat4 Pmatrix;" +
            "uniform mat4 Vmatrix;" +
            "uniform mat4 Mmatrix;" +
            "attribute vec3 color;" +       // the color of the point
            "varying vec3 vColor;" +
            "void main(void) { " +
               "gl_Position = Pmatrix*Vmatrix*Mmatrix*vec4(position, 1.);" +
               "vColor = color;" +
            "}";

         var fragCode =
            "precision mediump float;" +
            "varying vec3 vColor;" +
            "void main(void) {" +
               "gl_FragColor = vec4(vColor, 1.);" +
            "}";

         var vertShader = gl.createShader(gl.VERTEX_SHADER);
         gl.shaderSource(vertShader, vertCode);
         gl.compileShader(vertShader);

         var fragShader = gl.createShader(gl.FRAGMENT_SHADER);
         gl.shaderSource(fragShader, fragCode);
         gl.compileShader(fragShader);

         var shaderProgram = gl.createProgram();
         gl.attachShader(shaderProgram, vertShader);
         gl.attachShader(shaderProgram, fragShader);
         gl.linkProgram(shaderProgram);

         /*======== Associating attributes to vertex shader ============*/
         var Pmatrix = gl.getUniformLocation(shaderProgram, "Pmatrix");
         var Vmatrix = gl.getUniformLocation(shaderProgram, "Vmatrix");
         var Mmatrix = gl.getUniformLocation(shaderProgram, "Mmatrix");

         gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);
         var position = gl.getAttribLocation(shaderProgram, "position");
         gl.vertexAttribPointer(position, 3, gl.FLOAT, false, 0, 0);
         gl.enableVertexAttribArray(position);

         gl.bindBuffer(gl.ARRAY_BUFFER, color_buffer);
         var color = gl.getAttribLocation(shaderProgram, "color");
         gl.vertexAttribPointer(color, 3, gl.FLOAT, false, 0, 0);
         gl.enableVertexAttribArray(color);

         gl.useProgram(shaderProgram);

         /*========================= MATRIX ==========================*/
         function get_projection(angle, a, zMin, zMax) {
            var ang = Math.tan((angle * .5) * Math.PI / 180); // half angle, in radians
            return [
               0.5/ang, 0,         0,                           0,
               0,       0.5*a/ang, 0,                           0,
               0,       0,         -(zMax+zMin)/(zMax-zMin),   -1,
               0,       0,         (-2*zMax*zMin)/(zMax-zMin),  0
            ];
         }

         var proj_matrix = get_projection(40, canvas.width/canvas.height, 1, 100);
         var mov_matrix = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1];
         var view_matrix = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1];

         // translating z
         view_matrix[14] = view_matrix[14] - 6; // zoom

         /*======================= Rotation ==========================*/
         function rotateZ(m, angle) {
            var c = Math.cos(angle);
            var s = Math.sin(angle);
            var mv0 = m[0], mv4 = m[4], mv8 = m[8];

            m[0] = c*m[0] - s*m[1];
            m[4] = c*m[4] - s*m[5];
            m[8] = c*m[8] - s*m[9];

            m[1] = c*m[1] + s*mv0;
            m[5] = c*m[5] + s*mv4;
            m[9] = c*m[9] + s*mv8;
         }

         /*======================= Drawing ===========================*/
         var time_old = 0;

         var animate = function(time) {
            var dt = time - time_old;
            rotateZ(mov_matrix, dt * 0.002);
            time_old = time;

            gl.enable(gl.DEPTH_TEST);
            gl.depthFunc(gl.LEQUAL);
            gl.clearColor(0.5, 0.5, 0.5, 0.9);
            gl.clearDepth(1.0);
            gl.viewport(0.0, 0.0, canvas.width, canvas.height);
            gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

            gl.uniformMatrix4fv(Pmatrix, false, proj_matrix);
            gl.uniformMatrix4fv(Vmatrix, false, view_matrix);
            gl.uniformMatrix4fv(Mmatrix, false, mov_matrix);

            gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, index_buffer);
            gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_SHORT, 0);

            window.requestAnimationFrame(animate);
         };
         animate(0);
      </script>
   </body>
</html>
```

If you run this example, it will produce a gray canvas with a white triangle rotating about the Z axis.
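The rotateZ() function in the example edits a column-major 4×4 matrix in place. Its math can be checked without WebGL at all; the following is a minimal sketch in plain JavaScript, using the same function body as the example above:

```javascript
// Rotate a column-major 4x4 matrix about the Z axis, in place.
// This mirrors the rotateZ() used in the rotation example.
function rotateZ(m, angle) {
   var c = Math.cos(angle);
   var s = Math.sin(angle);
   var mv0 = m[0], mv4 = m[4], mv8 = m[8];

   m[0] = c*m[0] - s*m[1];
   m[4] = c*m[4] - s*m[5];
   m[8] = c*m[8] - s*m[9];

   m[1] = c*m[1] + s*mv0;
   m[5] = c*m[5] + s*mv4;
   m[9] = c*m[9] + s*mv8;
}

// Rotating the identity matrix by 90 degrees turns the X axis
// (1,0,0) into (0,1,0), as expected for a rotation about Z.
var m = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1];
rotateZ(m, Math.PI / 2);
```

Because the function mutates the same mov_matrix on every animation frame, the rotation accumulates over time, which is what makes the triangle spin.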
WebGL – Drawing Points
We discussed earlier (in Chapter 5) how to draw a primitive by following a step-by-step process consisting of five steps. You need to repeat these steps every time you draw a new shape. This chapter explains how to draw points with 3D coordinates in WebGL. Before moving further, let us take a relook at the five steps.

Required Steps

The following steps are required to create a WebGL application to draw points.

Step 1 − Prepare the Canvas and Get the WebGL Rendering Context

In this step, we obtain the WebGL rendering context object using the method getContext().

Step 2 − Define the Geometry and Store it in the Buffer Objects

Since we are drawing three points, we define three vertices with 3D coordinates and store them in buffers.

```javascript
var vertices = [
   -0.5,  0.5,  0.0,
    0.0,  0.5,  0.0,
   -0.25, 0.25, 0.0
];
```

Step 3 − Create and Compile the Shader Programs

In this step, you need to write vertex shader and fragment shader programs, compile them, and create a combined program by linking these two programs.

Vertex Shader − In the vertex shader of the given example, we define a vector attribute to store the 3D coordinates and assign it to the gl_Position variable. gl_PointSize is the built-in variable used to assign a size to each point. Here, we set the point size to 10.

```javascript
var vertCode =
   "attribute vec3 coordinates;" +
   "void main(void) {" +
      "gl_Position = vec4(coordinates, 1.0);" +
      "gl_PointSize = 10.0;" +
   "}";
```

Fragment Shader − In the fragment shader, we simply assign the fragment color to the gl_FragColor variable.

```javascript
var fragCode =
   "void main(void) {" +
      "gl_FragColor = vec4(1, 0.5, 0.0, 1);" +
   "}";
```

Step 4 − Associate the Shader Programs to Buffer Objects

In this step, we associate the buffer objects with the shader program.

Step 5 − Drawing the Required Object

We use the method drawArrays() to draw points. Since the number of points we want to draw is three, the count value is 3.
```javascript
gl.drawArrays(gl.POINTS, 0, 3);
```

Example – Draw Three Points using WebGL

Here is the complete WebGL program to draw three points −

```html
<!doctype html>
<html>
   <body>
      <canvas width="570" height="570" id="my_Canvas"></canvas>

      <script>
         /*================ Creating a canvas =================*/
         var canvas = document.getElementById("my_Canvas");
         var gl = canvas.getContext("experimental-webgl");

         /*========== Defining and storing the geometry =======*/
         var vertices = [
            -0.5,  0.5,  0.0,
             0.0,  0.5,  0.0,
            -0.25, 0.25, 0.0
         ];

         // Create an empty buffer object to store the vertex buffer
         var vertex_buffer = gl.createBuffer();

         // Bind an appropriate array buffer to it
         gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);

         // Pass the vertex data to the buffer
         gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);

         // Unbind the buffer
         gl.bindBuffer(gl.ARRAY_BUFFER, null);

         /*========================= Shaders ========================*/
         // Vertex shader source code
         var vertCode =
            "attribute vec3 coordinates;" +
            "void main(void) {" +
               "gl_Position = vec4(coordinates, 1.0);" +
               "gl_PointSize = 10.0;" +
            "}";

         // Create a vertex shader object
         var vertShader = gl.createShader(gl.VERTEX_SHADER);

         // Attach vertex shader source code
         gl.shaderSource(vertShader, vertCode);

         // Compile the vertex shader
         gl.compileShader(vertShader);

         // Fragment shader source code
         var fragCode =
            "void main(void) {" +
               "gl_FragColor = vec4(0.0, 0.0, 0.0, 0.1);" +
            "}";

         // Create a fragment shader object
         var fragShader = gl.createShader(gl.FRAGMENT_SHADER);

         // Attach fragment shader source code
         gl.shaderSource(fragShader, fragCode);

         // Compile the fragment shader
         gl.compileShader(fragShader);

         // Create a shader program object to store
         // the combined shader program
         var shaderProgram = gl.createProgram();

         // Attach a vertex shader
         gl.attachShader(shaderProgram, vertShader);

         // Attach a fragment shader
         gl.attachShader(shaderProgram, fragShader);

         // Link both the programs
         gl.linkProgram(shaderProgram);

         // Use the combined shader program object
         gl.useProgram(shaderProgram);

         /*======== Associating shaders to buffer objects ========*/
         // Bind the vertex buffer object
         gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);

         // Get the attribute location
         var coord = gl.getAttribLocation(shaderProgram, "coordinates");

         // Point an attribute to the currently bound VBO
         gl.vertexAttribPointer(coord, 3, gl.FLOAT, false, 0, 0);

         // Enable the attribute
         gl.enableVertexAttribArray(coord);

         /*============= Drawing the primitive ===============*/
         // Set the clear color
         gl.clearColor(0.5, 0.5, 0.5, 0.9);

         // Enable the depth test
         gl.enable(gl.DEPTH_TEST);

         // Clear the color buffer bit
         gl.clear(gl.COLOR_BUFFER_BIT);

         // Set the viewport
         gl.viewport(0, 0, canvas.width, canvas.height);

         // Draw the points
         gl.drawArrays(gl.POINTS, 0, 3);
      </script>
   </body>
</html>
```

It will produce three points on a gray canvas.
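The vertex coordinates in this example are given in clip space, where both axes run from −1 to +1 and Y points up, regardless of the canvas size in pixels. As a hedged sketch (the helper name pixelToClip is our own, not part of WebGL), converting canvas pixel coordinates to clip space looks like this:

```javascript
// WebGL expects vertex positions in clip space: both axes run
// from -1 to +1, the origin is at the center, and Y points up.
// This illustrative helper converts canvas pixel coordinates.
function pixelToClip(x, y, width, height) {
   return [
      (x / width) * 2 - 1,     // 0..width  maps to -1..+1
      -((y / height) * 2 - 1)  // 0..height maps to +1..-1 (Y is flipped)
   ];
}

// The top-left pixel of a 570x570 canvas maps to (-1, 1).
var p = pixelToClip(0, 0, 570, 570);
```

This is why the three vertices above cluster in the upper-left quadrant of the canvas: their x values are at or below 0 and their y values are at or above 0.25.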
WebGL – Context
To write a WebGL application, the first step is to get the WebGL rendering context object. This object interacts with the WebGL drawing buffer and can call all the WebGL methods. The following operations are performed to obtain the WebGL context −

Create an HTML-5 canvas
Get the canvas ID
Obtain the WebGL drawing context

Creating HTML-5 Canvas Element

In Chapter 5, we discussed how to create an HTML-5 canvas element. Within the body of the HTML-5 document, write a canvas, give it a name, and pass it as a parameter to the attribute id. You can define the dimensions of the canvas using the width and height attributes (optional).

Example

The following example shows how to create a canvas element with the dimensions 300 × 300. We have added a border to the canvas using CSS for visibility. Copy and paste the following code in a file with the name my_canvas.html.

```html
<!DOCTYPE HTML>
<html>
   <head>
      <style>
         #mycanvas { border: 1px solid blue; }
      </style>
   </head>
   <body>
      <canvas id="mycanvas" width="300" height="300"></canvas>
   </body>
</html>
```

It will produce a 300 × 300 canvas with a blue border.

Get the Canvas ID

After creating the canvas, you have to get the WebGL context. The first thing to do to obtain a WebGL drawing context is to get the id of the current canvas element. The canvas is acquired by calling the DOM (Document Object Model) method getElementById(). This method accepts a string value as a parameter, so we pass the id of the current canvas to it. For example, if the canvas id is my_Canvas, then the canvas is obtained as shown below −

```javascript
var canvas = document.getElementById("my_Canvas");
```

Get the WebGL Drawing Context

To get the WebGLRenderingContext object (also called the WebGL drawing context object or simply the WebGL context), call the getContext() method of the current HTMLCanvasElement. The syntax of getContext() is as follows −

```javascript
canvas.getContext(contextType, contextAttributes);
```

Pass the string webgl or experimental-webgl as the contextType.
The contextAttributes parameter is optional. (While proceeding with this step, make sure your browser implements WebGL version 1 (OpenGL ES 2.0).) The following code snippet shows how to obtain the WebGL rendering context. Here gl is the reference variable to the obtained context object.

```javascript
var canvas = document.getElementById("my_Canvas");
var gl = canvas.getContext("experimental-webgl");
```

WebGLContextAttributes

The parameter WebGLContextAttributes is not mandatory. This parameter provides various options that accept Boolean values, as listed below −

alpha − If its value is true, the canvas gets an alpha buffer. By default, its value is true.

depth − If its value is true, you will get a drawing buffer which contains a depth buffer of at least 16 bits. By default, its value is true.

stencil − If its value is true, you will get a drawing buffer which contains a stencil buffer of at least 8 bits. By default, its value is false.

antialias − If its value is true, you will get a drawing buffer which performs anti-aliasing. By default, its value is true.

premultipliedAlpha − If its value is true, you will get a drawing buffer which contains colors with pre-multiplied alpha. By default, its value is true.

preserveDrawingBuffer − If its value is true, the buffers will not be cleared and will preserve their values until cleared or overwritten by the author. By default, its value is false.

The following code snippet shows how to create a WebGL context with a stencil buffer, which will not perform anti-aliasing.

```javascript
var canvas = document.getElementById("canvas1");
var context = canvas.getContext("webgl", { antialias: false, stencil: true });
```

At the time of creating the WebGLRenderingContext, a drawing buffer is created. The context object manages the OpenGL state and renders to the drawing buffer.

WebGLRenderingContext

It is the principal interface in WebGL. It represents the WebGL drawing context.
This interface contains all the methods used to perform various tasks on the drawing buffer. The attributes of this interface are given below −

canvas − A reference to the canvas element that created this context.

drawingBufferWidth − The actual width of the drawing buffer. It may differ from the width attribute of the HTMLCanvasElement.

drawingBufferHeight − The actual height of the drawing buffer. It may differ from the height attribute of the HTMLCanvasElement.
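In practice it is safer to try the standard webgl context name first and fall back to experimental-webgl, the prefixed name used by some older browsers. A minimal sketch of that pattern — the helper name getWebGLContext is our own, not part of any API:

```javascript
// Try the standard context name first, then the legacy one.
// Returns null if the browser does not support WebGL at all.
function getWebGLContext(canvas, attributes) {
   return canvas.getContext("webgl", attributes) ||
          canvas.getContext("experimental-webgl", attributes) ||
          null;
}
```

Usage in a page would look like `var gl = getWebGLContext(document.getElementById("my_Canvas"));` followed by a null check, so the application can show a friendly message instead of failing with a script error on unsupported browsers.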
Associating Attributes & Buffer Objects

Each attribute in the vertex shader program points to a vertex buffer object. After creating the vertex buffer objects, programmers have to associate them with the attributes of the vertex shader program. Each attribute points to only one vertex buffer object, from which it extracts the data values; these values are then passed to the shader program. To associate the vertex buffer objects with the attributes of the vertex shader program, you have to follow the steps given below −

Get the attribute location
Point the attribute to a vertex buffer object
Enable the attribute

Get the Attribute Location

WebGL provides a method called getAttribLocation() which returns the attribute location. Its syntax is as follows −

```javascript
ulong getAttribLocation(Object program, string name)
```

This method accepts the shader program object and the name of an attribute of the vertex shader program. The following code snippet shows how to use this method.

```javascript
var coordinatesVar = gl.getAttribLocation(shader_program, "coordinates");
```

Here, shader_program is the shader program object and coordinates is the attribute of the vertex shader program.

Point the Attribute to a VBO

To assign the buffer object to the attribute variable, WebGL provides a method called vertexAttribPointer(). Here is the syntax of this method −

```javascript
void vertexAttribPointer(location, int size, enum type, bool normalized, long stride, long offset)
```

This method accepts six parameters and they are discussed below.

Location − It specifies the storage location of an attribute variable. Under this option, you have to pass the value returned by the getAttribLocation() method.

Size − It specifies the number of components per vertex in the buffer object.

Type − It specifies the type of data.

Normalized − This is a Boolean value. If true, integer data is normalized to [0, 1] for unsigned types, or to [-1, 1] for signed types, when converted to floating point.
Stride − It specifies the number of bytes between the data of consecutive vertices, or zero for the default stride (tightly packed data).

Offset − It specifies the offset (in bytes) into the buffer object at which the vertex data starts. If the data is stored from the beginning, the offset is 0.

The following snippet shows how to use vertexAttribPointer() in a program −

```javascript
gl.vertexAttribPointer(coordinatesVar, 3, gl.FLOAT, false, 0, 0);
```

Enabling the Attribute

Activate the vertex shader attribute to access the buffer object in the vertex shader. For this operation, WebGL provides the enableVertexAttribArray() method. This method accepts the location of the attribute as a parameter. Here is how to use this method in a program −

```javascript
gl.enableVertexAttribArray(coordinatesVar);
```
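The stride and offset parameters matter when several attributes share one interleaved buffer. The byte arithmetic can be sketched in plain JavaScript; the [x, y, z, r, g, b] layout below is just an illustrative example, not something the chapter's code requires:

```javascript
// For an interleaved layout of [x, y, z, r, g, b] per vertex,
// stored as 32-bit floats, each vertex occupies 6 floats.
var FSIZE = Float32Array.BYTES_PER_ELEMENT; // 4 bytes per float

var stride = 6 * FSIZE;      // bytes from one vertex to the next
var positionOffset = 0;      // x, y, z start at the beginning
var colorOffset = 3 * FSIZE; // r, g, b start after three floats

// These values would then be passed as:
// gl.vertexAttribPointer(position, 3, gl.FLOAT, false, stride, positionOffset);
// gl.vertexAttribPointer(color,    3, gl.FLOAT, false, stride, colorOffset);
```

With one attribute per buffer, as in the examples in this tutorial, both stride and offset can simply stay 0.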
WebGL – Geometry
After obtaining the WebGL context, you have to define the geometry for the primitive (the object you want to draw) and store it. In WebGL, we define the details of a geometry – for example, vertices, indices, and the color of the primitive – using JavaScript arrays. To pass these details to the shader programs, we have to create buffer objects and store (attach) the JavaScript arrays containing the data in the respective buffers.

Note − Later, these buffer objects will be associated with the attributes of the shader program (vertex shader).

Defining the Required Geometry

A 2D or 3D model drawn using vertices is called a mesh. Each facet in a mesh is called a polygon, and a polygon is made of 3 or more vertices. To draw models in the WebGL rendering context, you have to define the vertices and indices using JavaScript arrays. For example, if we want to create a triangle which lies on the coordinates {(0.5, 0.5), (0.5, -0.5), (-0.5, -0.5)}, then you can create an array for the vertices as −

```javascript
var vertices = [
    0.5,  0.5, // Vertex 1
    0.5, -0.5, // Vertex 2
   -0.5, -0.5  // Vertex 3
];
```

Similarly, you can create an array for the indices. The indices for the above triangle will be [0, 1, 2] and can be defined as −

```javascript
var indices = [0, 1, 2];
```

For a better understanding of indices, consider more complex models like a square. We can represent a square as a set of two triangles. If (0,3,1) and (3,1,2) are the two triangles using which we intend to draw the square, then the indices will be defined as −

```javascript
var indices = [0, 3, 1, 3, 1, 2];
```

Note − For drawing primitives, WebGL provides the following two methods −

drawArrays() − While using this method, we pass the vertices of the primitive using JavaScript arrays.

drawElements() − While using this method, we pass both the vertices and the indices of the primitive using JavaScript arrays.

Buffer Objects

A buffer object is a mechanism provided by WebGL that represents a memory area allocated in the system.
In these buffer objects, you can store the data of the model you want to draw: vertices, indices, color, etc. Using these buffer objects, you can pass multiple data values to the shader program (vertex shader) through its attribute variables. Since these buffer objects reside in GPU memory, they can be rendered directly, which in turn improves performance.

To process geometry, there are two types of buffer objects. They are −

Vertex buffer object (VBO) − It holds the per-vertex data of the graphical model that is going to be rendered. We use vertex buffer objects in WebGL to store and process data regarding vertices, such as vertex coordinates, normals, colors, and texture coordinates.

Index buffer object (IBO) − It holds the indices (index data) of the graphical model that is going to be rendered.

After defining the required geometry and storing it in JavaScript arrays, you need to pass these arrays to the buffer objects, from where the data will be passed to the shader programs. The following steps are to be followed to store data in the buffers.

Create an empty buffer.
Bind an appropriate array object to the empty buffer.
Pass the data (vertices/indices) to the buffer using one of the typed arrays.
Unbind the buffer (optional).

Creating a Buffer

To create an empty buffer object, WebGL provides a method called createBuffer(). This method returns a newly created buffer object if the creation was successful; else it returns a null value in case of failure. WebGL operates as a state machine: once a buffer is bound, any subsequent buffer operation will be executed on the current buffer until we unbind it. Use the following code to create a buffer −

```javascript
var vertex_buffer = gl.createBuffer();
```

Note − gl is the reference variable to the current WebGL context.

Bind the Buffer

After creating an empty buffer object, you need to bind an appropriate array buffer (target) to it. WebGL provides a method called bindBuffer() for this purpose.
Syntax

The syntax of the bindBuffer() method is as follows −

```javascript
void bindBuffer(enum target, Object buffer)
```

This method accepts two parameters and they are discussed below.

target − The first variable is an enum value representing the type of the buffer we want to bind to the empty buffer. You have two predefined enum values as options for this parameter. They are −

ARRAY_BUFFER, which represents vertex data.
ELEMENT_ARRAY_BUFFER, which represents index data.

buffer − The second one is the reference variable to the buffer object created in the previous step. The reference variable can be of a vertex buffer object or of an index buffer object.

Example

The following code snippet shows how to use the bindBuffer() method.

```javascript
// Vertex buffer
var vertex_buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);

// Index buffer
var index_buffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, index_buffer);
```

Passing Data into the Buffer

The next step is to pass the data (vertices/indices) to the buffer. Till now the data is in the form of an array; before passing it to the buffer, we need to wrap it in one of the WebGL typed arrays. WebGL provides a method named bufferData() for this purpose.

Syntax

The syntax of the bufferData() method is as follows −

```javascript
void bufferData(enum target, Object data, enum usage)
```

This method accepts three parameters and they are discussed below −

target − The first parameter is an enum value representing the type of the array buffer we used. The options for this parameter are −

ARRAY_BUFFER, which represents vertex data.
ELEMENT_ARRAY_BUFFER, which represents index data.

data − The second parameter is the object value that contains the data to be written to the buffer object. Here we have to pass the data using typed arrays.

usage − The third parameter of this method is an enum variable that specifies how to use the buffer object data (stored data) to draw shapes. The three options for this parameter are gl.STATIC_DRAW, gl.STREAM_DRAW, and gl.DYNAMIC_DRAW.
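The typed-array wrapping described above can be sketched without a WebGL context. The sizes below follow directly from the element widths of Float32Array (4 bytes) and Uint16Array (2 bytes):

```javascript
// Plain JavaScript arrays holding the geometry data.
var vertices = [-0.5, 0.5, 0.0,  -0.5, -0.5, 0.0,  0.5, -0.5, 0.0];
var indices = [0, 1, 2];

// bufferData() requires typed arrays, which have a fixed element
// size and therefore a known byte layout on the GPU.
var vertexData = new Float32Array(vertices); // 4 bytes per element
var indexData = new Uint16Array(indices);    // 2 bytes per element

// byteLength is the amount of memory the GPU buffer will hold:
// 9 floats * 4 bytes = 36 bytes, 3 indices * 2 bytes = 6 bytes.
```

These wrapped arrays are exactly what gets passed to `gl.bufferData(gl.ARRAY_BUFFER, vertexData, gl.STATIC_DRAW)` and `gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, indexData, gl.STATIC_DRAW)`.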
WebGL – Scaling
In this chapter, we will take an example to demonstrate how to modify the scale of a triangle using WebGL.

Scaling

Scaling is nothing but increasing or decreasing the size of an object. For example, if a triangle has vertices [a, b, c], then the triangle with the vertices [2a, 2b, 2c] will be double its size. Therefore, to scale a triangle, you have to multiply each vertex by the scaling factor. You can also scale a particular vertex.

To scale a triangle, in the vertex shader of the program we create a uniform matrix and multiply the coordinate values with this matrix. Later, we pass a 4×4 diagonal matrix having the scaling factors of the x, y, z coordinates in the diagonal positions (with 1 in the last diagonal position).

Required Steps

The following steps are required to create a WebGL application to scale a triangle.

Step 1 − Prepare the Canvas and Get the WebGL Rendering Context

In this step, we obtain the WebGL rendering context object using getContext().

Step 2 − Define the Geometry and Store it in the Buffer Objects

Since we are drawing a triangle, we have to pass its three vertices and store them in buffers.

```javascript
var vertices = [
   -0.5,  0.5, 0.0,
   -0.5, -0.5, 0.0,
    0.5, -0.5, 0.0
];
```

Step 3 − Create and Compile the Shader Programs

In this step, you need to write the vertex shader and fragment shader programs, compile them, and create a combined program by linking these two programs.

Vertex Shader − In the vertex shader of the program, we define a vector attribute to store 3D coordinates. Along with it, we define a uniform matrix to store the scaling factors, and finally, we multiply these two values and assign the result to gl_Position, which holds the final position of the vertices.
```javascript
var vertCode =
   "attribute vec4 coordinates;" +
   "uniform mat4 u_xformMatrix;" +
   "void main(void) {" +
      "gl_Position = u_xformMatrix * coordinates;" +
   "}";
```

Fragment Shader − In the fragment shader, we simply assign the fragment color to the gl_FragColor variable.

```javascript
var fragCode =
   "void main(void) {" +
      "gl_FragColor = vec4(1, 0.5, 0.0, 1);" +
   "}";
```

Step 4 − Associate the Shader Programs with the Buffer Objects

In this step, we associate the buffer objects with the shader program.

Step 5 − Drawing the Required Object

Since we are drawing the triangle without indices, we use the drawArrays() method. To this method, we have to pass the number of vertices to be considered. Since we are drawing a triangle, we will pass 3 as a parameter.

```javascript
gl.drawArrays(gl.TRIANGLES, 0, 3);
```

Example – Scale a Triangle

The following example shows how to scale a triangle −

```html
<!doctype html>
<html>
   <body>
      <canvas width="300" height="300" id="my_Canvas"></canvas>

      <script>
         /*================= Creating a canvas =========================*/
         var canvas = document.getElementById("my_Canvas");
         var gl = canvas.getContext("experimental-webgl");

         /*=========== Defining and storing the geometry ==============*/
         var vertices = [
            -0.5,  0.5, 0.0,
            -0.5, -0.5, 0.0,
             0.5, -0.5, 0.0
         ];

         // Create an empty buffer object and store the vertex data
         var vertex_buffer = gl.createBuffer();
         gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);
         gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);
         gl.bindBuffer(gl.ARRAY_BUFFER, null);

         /*======================== Shaders ============================*/
         // Vertex shader source code
         var vertCode =
            "attribute vec4 coordinates;" +
            "uniform mat4 u_xformMatrix;" +
            "void main(void) {" +
               "gl_Position = u_xformMatrix * coordinates;" +
            "}";

         // Create a vertex shader object and compile it
         var vertShader = gl.createShader(gl.VERTEX_SHADER);
         gl.shaderSource(vertShader, vertCode);
         gl.compileShader(vertShader);

         // Fragment shader source code
         var fragCode =
            "void main(void) {" +
               "gl_FragColor = vec4(0.0, 0.0, 0.0, 0.1);" +
            "}";

         // Create a fragment shader object and compile it
         var fragShader = gl.createShader(gl.FRAGMENT_SHADER);
         gl.shaderSource(fragShader, fragCode);
         gl.compileShader(fragShader);

         // Create and use the combined shader program
         var shaderProgram = gl.createProgram();
         gl.attachShader(shaderProgram, vertShader);
         gl.attachShader(shaderProgram, fragShader);
         gl.linkProgram(shaderProgram);
         gl.useProgram(shaderProgram);

         /*=================== Scaling ==========================*/
         var Sx = 1.0, Sy = 1.5, Sz = 1.0;
         var xformMatrix = new Float32Array([
            Sx,  0.0, 0.0, 0.0,
            0.0, Sy,  0.0, 0.0,
            0.0, 0.0, Sz,  0.0,
            0.0, 0.0, 0.0, 1.0
         ]);

         var u_xformMatrix = gl.getUniformLocation(shaderProgram, "u_xformMatrix");
         gl.uniformMatrix4fv(u_xformMatrix, false, xformMatrix);

         /*=========== Associating shaders to buffer objects ============*/
         gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);
         var coordinatesVar = gl.getAttribLocation(shaderProgram, "coordinates");
         gl.vertexAttribPointer(coordinatesVar, 3, gl.FLOAT, false, 0, 0);
         gl.enableVertexAttribArray(coordinatesVar);

         /*================= Drawing the triangle ========================*/
         gl.clearColor(0.5, 0.5, 0.5, 0.9);
         gl.enable(gl.DEPTH_TEST);
         gl.clear(gl.COLOR_BUFFER_BIT);
         gl.viewport(0, 0, canvas.width, canvas.height);
         gl.drawArrays(gl.TRIANGLES, 0, 3);
      </script>
   </body>
</html>
```

If you run this example, it will produce a triangle stretched by a factor of 1.5 along the y-axis.
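The effect of `u_xformMatrix * coordinates` in the shader can be verified without WebGL by multiplying the same column-major matrix by a point in plain JavaScript. A sketch (the helper names scaleMatrix and transform are our own):

```javascript
// Build the same column-major 4x4 scaling matrix as the example.
function scaleMatrix(sx, sy, sz) {
   return new Float32Array([
      sx,  0.0, 0.0, 0.0,
      0.0, sy,  0.0, 0.0,
      0.0, 0.0, sz,  0.0,
      0.0, 0.0, 0.0, 1.0
   ]);
}

// Multiply a column-major 4x4 matrix by a point [x, y, z, w],
// which is what the vertex shader computes for every vertex.
function transform(m, p) {
   var out = [0, 0, 0, 0];
   for (var row = 0; row < 4; row++) {
      for (var col = 0; col < 4; col++) {
         out[row] += m[col * 4 + row] * p[col];
      }
   }
   return out;
}

// Scaling the vertex (0.5, 0.5, 0) with Sy = 1.5 stretches only y.
var scaled = transform(scaleMatrix(1.0, 1.5, 1.0), [0.5, 0.5, 0.0, 1.0]);
```

Only the y coordinate changes (0.5 becomes 0.75), which matches the on-screen result: the triangle keeps its width and grows taller.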
WebGL – Home
WebGL (Web Graphics Library) is the new standard for 3D graphics on the Web, designed for rendering 2D graphics and interactive 3D graphics. This tutorial starts with a basic introduction to WebGL, OpenGL, and the Canvas element of HTML-5, followed by a sample application. This tutorial contains dedicated chapters for all the steps required to write a basic WebGL application. It also contains chapters that explain how to use WebGL for affine transformations such as translation, rotation, and scaling.

Audience

This tutorial will be extremely useful for all those readers who want to learn the basics of WebGL programming.

Prerequisites

It is an elementary tutorial and one can easily understand the concepts explained here with a basic knowledge of JavaScript or HTML-5 programming. However, it will help if you have some prior exposure to OpenGL and to the matrix operations used in 3D graphics.
WebGL – Sample Application
We have discussed the basics of WebGL and the WebGL pipeline (the procedure followed to render graphics applications). In this chapter, we are going to take a sample application to create a triangle using WebGL and observe the steps followed in the application.

Structure of a WebGL Application

WebGL application code is a combination of JavaScript and OpenGL Shading Language.

JavaScript is required to communicate with the CPU.
OpenGL Shading Language is required to communicate with the GPU.

Sample Application

Let us now take a simple example to learn how to use WebGL to draw a simple triangle with 2D coordinates.

```html
<!doctype html>
<html>
   <body>
      <canvas width="300" height="300" id="my_Canvas"></canvas>

      <script>
         /* Step 1: Prepare the canvas and get the WebGL context */
         var canvas = document.getElementById("my_Canvas");
         var gl = canvas.getContext("experimental-webgl");

         /* Step 2: Define the geometry and store it in buffer objects */
         var vertices = [-0.5, 0.5, -0.5, -0.5, 0.0, -0.5];

         // Create a new buffer object
         var vertex_buffer = gl.createBuffer();

         // Bind an empty array buffer to it
         gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);

         // Pass the vertices data to the buffer
         gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);

         // Unbind the buffer
         gl.bindBuffer(gl.ARRAY_BUFFER, null);

         /* Step 3: Create and compile the shader programs */

         // Vertex shader source code
         var vertCode =
            "attribute vec2 coordinates;" +
            "void main(void) {" +
               "gl_Position = vec4(coordinates, 0.0, 1.0);" +
            "}";

         // Create a vertex shader object
         var vertShader = gl.createShader(gl.VERTEX_SHADER);

         // Attach vertex shader source code
         gl.shaderSource(vertShader, vertCode);

         // Compile the vertex shader
         gl.compileShader(vertShader);

         // Fragment shader source code
         var fragCode =
            "void main(void) {" +
               "gl_FragColor = vec4(0.0, 0.0, 0.0, 0.1);" +
            "}";

         // Create a fragment shader object
         var fragShader = gl.createShader(gl.FRAGMENT_SHADER);

         // Attach fragment shader source code
         gl.shaderSource(fragShader, fragCode);

         // Compile the fragment shader
         gl.compileShader(fragShader);

         // Create a shader program object to store the combined shader program
         var shaderProgram = gl.createProgram();

         // Attach a vertex shader
         gl.attachShader(shaderProgram, vertShader);

         // Attach a fragment shader
         gl.attachShader(shaderProgram, fragShader);

         // Link both the programs
         gl.linkProgram(shaderProgram);

         // Use the combined shader program object
         gl.useProgram(shaderProgram);

         /* Step 4: Associate the shader programs to buffer objects */

         // Bind the vertex buffer object
         gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);

         // Get the attribute location
         var coord = gl.getAttribLocation(shaderProgram, "coordinates");

         // Point an attribute to the currently bound VBO
         gl.vertexAttribPointer(coord, 2, gl.FLOAT, false, 0, 0);

         // Enable the attribute
         gl.enableVertexAttribArray(coord);

         /* Step 5: Drawing the required object (triangle) */

         // Set the clear color
         gl.clearColor(0.5, 0.5, 0.5, 0.9);

         // Enable the depth test
         gl.enable(gl.DEPTH_TEST);

         // Clear the color buffer bit
         gl.clear(gl.COLOR_BUFFER_BIT);

         // Set the viewport
         gl.viewport(0, 0, canvas.width, canvas.height);

         // Draw the triangle
         gl.drawArrays(gl.TRIANGLES, 0, 3);
      </script>
   </body>
</html>
```

It will produce a triangle on a gray canvas.

If you observe the above program carefully, we have followed five sequential steps to draw a simple triangle using WebGL. The steps are as follows −

Step 1 − Prepare the canvas and get the WebGL rendering context

We get the current HTML canvas object and obtain its WebGL rendering context.

Step 2 − Define the geometry and store it in buffer objects

We define the attributes of the geometry such as vertices, indices, color, etc., and store them in JavaScript arrays. Then, we create one or more buffer objects and pass the arrays containing the data to the respective buffer object.
In the example, we store the vertices of the triangle in a JavaScript array and pass this array to a vertex buffer object. Step 3 − Create and compile Shader programs We write vertex shader and fragment shader programs, compile them, and create a combined program by linking these two programs. Step 4 − Associate the shader programs with buffer objects We associate the buffer objects and the combined shader program. Step 5 − Drawing the required object (triangle) This step includes operations such as clearing the color, clearing the buffer bit, enabling the depth test, setting the view port, etc. Finally, you need to draw the required primitives using one of the methods − drawArrays() or drawElements(). All these steps are explained further in this tutorial. Print Page Previous Next Advertisements ”;
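Note that the 2D vertices in Step 2 are given directly in WebGL's clip-space coordinates, which run from -1 to +1 on both axes rather than in canvas pixels. As a minimal sketch of the relationship (the helper name pixelToClipSpace is ours, purely illustrative, not part of the WebGL API):

```javascript
// Convert a pixel coordinate on the canvas to WebGL clip space.
// Clip space runs from -1 (left/bottom) to +1 (right/top); note that
// the y axis is flipped relative to canvas pixel coordinates.
function pixelToClipSpace(px, py, width, height) {
  var x = (px / width) * 2 - 1;
  var y = 1 - (py / height) * 2;
  return [x, y];
}

// The center of a 300x300 canvas maps to the clip-space origin:
console.log(pixelToClipSpace(150, 150, 300, 300)); // [0, 0]

// The top-left pixel corner maps to (-1, 1):
console.log(pixelToClipSpace(0, 0, 300, 300)); // [-1, 1]
```

This is why the triangle's vertex (0.0, -0.5) lands at the horizontal center of the canvas, halfway down the lower half.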
WebGL – Graphics Pipeline

To render 3D graphics, we have to follow a sequence of steps. These steps are known as the graphics pipeline or rendering pipeline. The following diagram depicts the WebGL graphics pipeline. In the following sections, we will discuss the role of each step in the pipeline, one by one.

JavaScript

While developing WebGL applications, we write shader language code to communicate with the GPU. JavaScript is used to write the control code of the program, which includes the following actions −

Initialize WebGL − JavaScript is used to initialize the WebGL context.
Create arrays − We create JavaScript arrays to hold the data of the geometry.
Buffer objects − We create buffer objects (vertex and index) by passing the arrays as parameters.
Shaders − We create, compile, and link the shaders using JavaScript.
Attributes − We create attributes, enable them, and associate them with buffer objects using JavaScript.
Uniforms − We also associate the uniforms using JavaScript.
Transformation matrix − Using JavaScript, we create the transformation matrices.

Initially we create the data for the required geometry and pass it to the shaders in the form of buffers. The attribute variables of the shader language point to the buffer objects, which are passed as inputs to the vertex shader.

Vertex Shader

When we start the rendering process by invoking the methods drawElements() or drawArrays(), the vertex shader is executed once for each vertex provided in the vertex buffer object. It calculates the position of each vertex of a primitive polygon and stores it in the built-in variable gl_Position. It also processes the other attributes normally associated with a vertex, such as color and texture coordinates.

Primitive Assembly

After calculating the position and other details of each vertex, the next phase is the primitive assembly stage. Here the triangles are assembled and passed to the rasterizer.
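At its core, the per-vertex position computation described above (gl_Position = matrix * vec4(position, 1.0)) is a 4×4 matrix–vector multiplication. The following JavaScript sketch emulates that step on the CPU; the helper name transformVertex is ours and purely illustrative, not part of the WebGL API:

```javascript
// Multiply a 4x4 matrix (column-major order, as WebGL stores matrices)
// by a 4-component vector - the core of what a vertex shader does when
// it computes gl_Position.
function transformVertex(m, v) {
  var out = [0, 0, 0, 0];
  for (var row = 0; row < 4; row++) {
    for (var col = 0; col < 4; col++) {
      out[row] += m[col * 4 + row] * v[col];
    }
  }
  return out;
}

// A translation matrix that moves vertices by (2, 3, 0):
var translate = [
  1, 0, 0, 0,
  0, 1, 0, 0,
  0, 0, 1, 0,
  2, 3, 0, 1
];

console.log(transformVertex(translate, [1, 1, 0, 1])); // [3, 4, 0, 1]
```

On the GPU this multiply runs in parallel for every vertex in the buffer, which is why vertex shaders scale so well to large meshes.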
Rasterization

In the rasterization step, the pixels belonging to the final image of the primitive are determined. It has two sub-steps −

Culling − First, the orientation of the polygon (is it front-facing or back-facing?) is determined. Triangles whose orientation makes them invisible in the view area are discarded. This process is called culling.

Clipping − If a triangle is partly outside the view area, then the part outside the view area is removed. This process is known as clipping.

Fragment Shader

The fragment shader receives data from the vertex shader through varying variables and primitives from the rasterization stage, and then calculates the color value for each pixel between the vertices. The fragment shader stores the color value of every pixel in each fragment. These color values can be accessed during fragment operations, which we are going to discuss next.

Fragment Operations

Fragment operations are carried out after determining the color of each pixel in the primitive. These fragment operations may include the following −

Depth test
Color buffer blending
Dithering

Once all the fragments are processed, a 2D image is formed and displayed on the screen. The frame buffer is the final destination of the rendering pipeline.

Frame Buffer

The frame buffer is a portion of graphics memory that holds the scene data. This buffer contains details such as the width and height of the surface (in pixels), the color of each pixel, and the depth and stencil buffers.
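The orientation test used for culling in the rasterization stage can be illustrated in plain JavaScript: the sign of a triangle's signed area in screen space tells whether its vertices wind counter-clockwise (front-facing under WebGL's default gl.frontFace(gl.CCW) convention) or clockwise (back-facing). This is a simplified sketch of the idea, with illustrative helper names of our own:

```javascript
// Twice the signed area of a 2D triangle with vertices A, B, C.
// Positive means the vertices wind counter-clockwise; negative
// means clockwise.
function signedArea2(ax, ay, bx, by, cx, cy) {
  return (bx - ax) * (cy - ay) - (cx - ax) * (by - ay);
}

// Under WebGL's default convention, counter-clockwise winding
// is front-facing; back-facing triangles can be culled.
function isFrontFacing(ax, ay, bx, by, cx, cy) {
  return signedArea2(ax, ay, bx, by, cx, cy) > 0;
}

// A counter-clockwise triangle is front-facing:
console.log(isFrontFacing(0, 0, 1, 0, 0, 1)); // true

// Swapping two vertices reverses the winding, making it back-facing:
console.log(isFrontFacing(0, 0, 0, 1, 1, 0)); // false
```

In a real WebGL program this test happens in hardware; culling is enabled with gl.enable(gl.CULL_FACE) rather than computed in JavaScript.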
WebGL – Introduction

A few years back, Java applications – as a combination of applets and JOGL – were used to process 3D graphics on the Web by addressing the GPU (Graphical Processing Unit). As applets require a JVM to run, it became difficult to rely on them, and eventually Java applets fell out of use.

The Stage3D APIs provided by Adobe (Flash, AIR) offered a GPU hardware-accelerated architecture. Using these technologies, programmers could develop applications with 2D and 3D capabilities on web browsers as well as on iOS and Android platforms. Since Flash was proprietary software, however, it never became a web standard.

In March 2011, WebGL was released. It is an open standard that runs without a JVM and is controlled entirely by the web browser. The new release of HTML5 has several features to support 3D graphics, such as 2D Canvas, WebGL, SVG, 3D CSS transforms, and SMIL. In this tutorial, we will be covering the basics of WebGL.

What is OpenGL?

OpenGL (Open Graphics Library) is a cross-language, cross-platform API for 2D and 3D graphics. It is a collection of commands. OpenGL 4.5 is the latest version of OpenGL. The following table lists a set of technologies related to OpenGL.

API − Technology Used

OpenGL ES − The library for 2D and 3D graphics on embedded systems, including consoles, phones, appliances, and vehicles. It is an API specifically tailored for embedded systems such as those present on mobile phones and tablets. OpenGL ES 3.1 is its latest version. It is maintained by the Khronos Group (www.khronos.org).

JOGL − The Java binding for OpenGL. JOGL 4.5 is its latest version and it is maintained by jogamp.org.

WebGL − The JavaScript binding for OpenGL. WebGL 1.0 is its latest version and it is maintained by the Khronos Group.

OpenGL SL − The OpenGL Shading Language (GLSL). It is a programming language that is a companion to OpenGL 2.0 and higher, and is part of the core OpenGL 4.4 specification.
Note − In WebGL, we use GLSL to write shaders.

What is WebGL?

WebGL (Web Graphics Library) is the new standard for 3D graphics on the Web. It is designed for the purpose of rendering 2D graphics and interactive 3D graphics. It is derived from OpenGL's ES 2.0 library, which is a low-level 3D API for phones and other mobile devices. WebGL provides similar functionality to OpenGL ES 2.0 and performs well on modern 3D graphics hardware.

It is a JavaScript API that can be used with HTML5. WebGL code is written within the <canvas> tag of HTML5. It is a specification that gives web browsers access to the Graphics Processing Unit (GPU) on the user's computer.

Who Developed WebGL

An American-Serbian software engineer named Vladimir Vukicevic did the foundation work and led the creation of WebGL. In 2007, Vladimir started working on an OpenGL prototype for the Canvas element of the HTML document. In March 2011, the Khronos Group created WebGL.

Rendering

Rendering is the process of generating an image from a model using computer programs. In graphics, a virtual scene is described using information such as geometry, viewpoint, texture, lighting, and shading, which is passed through a render program. The output of this render program is a digital image.

There are two types of rendering −

Software Rendering − All the rendering calculations are done with the help of the CPU.
Hardware Rendering − All the graphics computations are done by the GPU (Graphical Processing Unit).

Rendering can be done locally or remotely. If the image to be rendered is too complex, then rendering is done remotely on a dedicated server having enough hardware resources to render complex scenes. This is also called server-based rendering. Rendering can also be done locally on the client's machine; this is called client-based rendering. WebGL follows a client-based rendering approach to render 3D scenes.
All the processing required to obtain an image is performed locally using the client's graphics hardware.

GPU

According to NVIDIA, a GPU is "a single chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines capable of processing a minimum of 10 million polygons per second." Unlike multi-core processors with a few cores optimized for sequential processing, a GPU consists of thousands of smaller cores that process parallel workloads efficiently. Therefore, the GPU accelerates the creation of images in a frame buffer (a portion of RAM containing a complete frame of data) intended for output to a display.

GPU-Accelerated Computing

In GPU-accelerated computing, the application is loaded into the CPU. Whenever it encounters a compute-intensive portion of the code, that portion is loaded onto and run by the GPU. This gives the system the ability to process graphics efficiently.

The GPU has its own separate memory and runs multiple copies of a small portion of the code at a time. The GPU processes the data in its local memory, not in central memory. Therefore, the data to be processed by the GPU must be copied to GPU memory before it can be processed.

In systems with this architecture, the communication overhead between the CPU and the GPU should be reduced to achieve faster processing of 3D programs. For this, we copy all the data to the GPU and keep it there, instead of communicating with the GPU repeatedly.
Browsers Supported

The following tables show a list of browsers that support WebGL −

Web Browsers

Browser Name       | Version      | Support
Internet Explorer  | 11 and above | Complete support
Google Chrome      | 39 and above | Complete support
Safari             | 8            | Complete support
Firefox            | 36 and above | Partial support
Opera              | 27 and above | Partial support

Mobile Browsers

Browser Name       | Version | Support
Chrome for Android | 42      | Partial support
Android browser    | 40      | Partial support
iOS Safari         | 8.3     | Complete support
Opera Mini         | 8       | Does not support
Blackberry Browser | 10      | Complete support
IE Mobile          | 10      | Partial support