When does it make sense to use a `Float32Array` instead of a standard JavaScript `Array` for browser applications?

This performance test shows
I would assume that the glMatrix library uses `Float32Array` because it is primarily used in WebGL applications, where matrices are represented as `Float32Array`s (http://www.khronos.org/registry/webgl/specs/1.0/#5.14.10).
I emailed the developer of glMatrix, and my answer below includes his comments (points 2 and 3):
1. Creating a new object is generally quicker with `Array` than with `Float32Array`. The gain is significant for small arrays, but smaller (and environment dependent) for larger arrays; see the benchmark sketch after this list.
2. Accessing data from a typed array (e.g. `Float32Array`) is often faster than from a normal array, which means that most array operations (aside from creating a new object) are faster with typed arrays.
3. As also stated by @emidander, glMatrix was developed primarily for WebGL, which requires that vectors and matrices be passed as `Float32Array`s. So, for a WebGL application, the potentially costly conversion from `Array` to `Float32Array` would need to be included in any performance measurement.
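To make points 1 and 2 concrete, here is a minimal micro-benchmark sketch. The sizes and iteration counts are arbitrary assumptions, and absolute results vary heavily by engine and JIT behavior, so treat the output as indicative only:

```js
// Micro-benchmark sketch: construction cost vs. element access.
// SIZE and ITERATIONS are arbitrary assumptions; results vary by engine.
const SIZE = 16;        // e.g. a 4x4 matrix
const ITERATIONS = 1e6;

let sink = 0; // accumulator to keep the JIT from eliminating the loops

// Point 1: plain Arrays are usually cheaper to create.
console.time('create Array');
for (let i = 0; i < ITERATIONS; i++) {
  const a = new Array(SIZE).fill(0);
  sink += a.length;
}
console.timeEnd('create Array');

console.time('create Float32Array');
for (let i = 0; i < ITERATIONS; i++) {
  const a = new Float32Array(SIZE);
  sink += a.length;
}
console.timeEnd('create Float32Array');

// Point 2: reads/writes on an existing typed array are often faster.
const plain = new Array(SIZE).fill(0);
const typed = new Float32Array(SIZE);

console.time('write Array');
for (let i = 0; i < ITERATIONS; i++) {
  plain[i % SIZE] = i;
}
console.timeEnd('write Array');

console.time('write Float32Array');
for (let i = 0; i < ITERATIONS; i++) {
  typed[i % SIZE] = i;
}
console.timeEnd('write Float32Array');

console.log('ignore:', sink);
```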
So, not surprisingly, the best choice is application-dependent:

- If arrays are generally small, and/or the number of operations on them is low (so that constructor time is a significant proportion of the array's lifespan), use `Array`.
- If code readability is as important as performance, then use `Array` (i.e. use `[]` instead of a constructor).
- If arrays are very large and/or are used for many operations, then use a typed array.
- For WebGL applications (or other applications that would otherwise require a type conversion), use `Float32Array` (or another typed array); a conversion sketch follows this list.
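As an illustration of the conversion cost mentioned in the last point, here is a minimal WebGL upload sketch. The `gl` context and `uniformLocation` are assumed to have been set up elsewhere (hypothetical names, not from the original answer):

```js
// `gl` (a WebGLRenderingContext) and `uniformLocation` are assumed to exist.
const modelMatrix = [ // plain Array, e.g. built by application logic
  1, 0, 0, 0,
  0, 1, 0, 0,
  0, 0, 1, 0,
  0, 0, 0, 1,
];

// If the matrix lives in a plain Array, each upload pays a conversion:
gl.uniformMatrix4fv(uniformLocation, false, new Float32Array(modelMatrix));

// Keeping the data in a Float32Array from the start avoids that
// per-upload cost, which is why glMatrix uses Float32Array internally:
const typedMatrix = new Float32Array(modelMatrix);
gl.uniformMatrix4fv(uniformLocation, false, typedMatrix);
```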
In current browser implementations, using `Float32Array` hurts both writability and performance compared with vanilla `Array`s. It seems that even the gl-matrix authors agreed that the library needed to be refactored to remove the `Float32Array` dependency: https://github.com/toji/gl-matrix/issues/359
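For what it's worth, gl-matrix already exposes a hook for this: its `setMatrixArrayType()` function lets you opt back into plain arrays. A minimal sketch, assuming the gl-matrix v3 module layout:

```js
// Sketch: switching gl-matrix to plain Arrays via setMatrixArrayType()
// (gl-matrix v3 import layout assumed).
import { glMatrix, mat4 } from 'gl-matrix';

glMatrix.setMatrixArrayType(Array); // subsequent creations use Array

const m = mat4.create();      // now a plain Array, not a Float32Array
console.log(Array.isArray(m)); // true after the switch
```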