How to Change Basic Data Types in Julia
Overview
In fields like machine learning, 32-bit floating-point numbers are often used instead of 64-bit ones to speed up computation and save memory. For this reason, PyTorch creates tensors with a 32-bit floating-point data type by default. Julia's machine learning package Flux.jl, by contrast, takes Julia's standard arrays as input for the neural networks it implements. Not needing a separate data structure like tensors can be seen as an advantage, but having to convert everything to Float32 by hand can be seen as a drawback. Below, we introduce a way to change the default data type.
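For reference, the manual approach this avoids looks roughly like the following sketch (the variable names are illustrative): you either request Float32 directly or broadcast a conversion over an existing array.
julia> x = rand(Float32, 3);   # request Float32 directly
julia> y = Float32.(rand(3));  # convert an existing Float64 array by broadcasting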
Code
ChangePrecision.jl
By using the @changeprecision macro, the default data type can be changed within a begin ... end block.
julia> using Pkg
julia> Pkg.add("ChangePrecision")
julia> using ChangePrecision
julia> rand(3)
3-element Vector{Float64}:
0.580516564576538
0.33915094423556424
0.3612907828959878
julia> @changeprecision Float32 begin
           rand(3)
       end
3-element Vector{Float32}:
0.0459705
0.0033969283
0.579983
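The macro works by rewriting the expression it receives, so floating-point literals inside the block should also be treated as Float32, not just calls like rand (this matches how the package documents its behavior). A minimal sketch:
julia> y = @changeprecision Float32 begin
           0.1 * rand(3)  # the literal 0.1 and rand(3) both produce Float32
       end;

julia> eltype(y)
Float32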
Environment
- OS: Windows 10
- Version: Julia 1.8.2, ChangePrecision 1.0.0