Handling Hidden Layers in Julia Flux

Linear

In Flux, a linear layer can be implemented as Dense().

  • Dense(in, out, σ=identity; bias=true, init=glorot_uniform)
  • Dense(W::AbstractMatrix, [bias, σ])

The default activation function is the identity function; well-known functions such as relu, tanh, and sigmoid can be used instead. The parameter count that Flux reports is the number of weights plus the number of biases, so Dense(5, 2) has 5 × 2 = 10 weights plus 2 biases, for 12 parameters.

julia> Dense(5, 2)
Dense(5, 2)         # 12 parameters

julia> Dense(5, 2, relu)
Dense(5, 2, relu)   # 12 parameters

julia> Dense(5, 2, sigmoid)
Dense(5, 2, σ)  

julia> d1 = Dense(ones(2, 5), false, tanh)  # using provided weight matrix
Dense(5, 2, tanh; bias=false)  # 10 parameters

julia> d1(ones(5))
2-element Vector{Float64}:
 0.9999092042625951
 0.9999092042625951
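
The parameters of a Dense layer can be inspected directly; in Flux 0.12 they are stored in the fields weight, bias, and σ. A minimal sketch, continuing with d1 from above (with bias=false the bias field simply holds false):

julia> d1.weight     # the 2×5 weight matrix passed to the constructor
2×5 Matrix{Float64}:
 1.0  1.0  1.0  1.0  1.0
 1.0  1.0  1.0  1.0  1.0

julia> d1.bias       # bias=false, so no bias parameters are stored
false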

Convolution

In Flux, a convolution layer can be implemented as Conv().

Conv(filter, in => out, σ = identity;
    stride = 1, pad = 0, dilation = 1, groups = 1, [bias, weight, init])
  • filter: the kernel size, given as a tuple (m, n).
  • in => out: the number of channels of the input layer and the number of channels of the output layer.

When using a 2-dimensional convolution layer, note that the input must be a 4-dimensional array, where the third dimension is the number of channels and the fourth is the batch size. Even a single black-and-white photo should not be passed as (height, width) but as (height, width, 1, 1); a batch of 11 RGB color photos has size (height, width, 3, 11).
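
If an image is loaded as a plain 2-dimensional matrix, reshape can add the missing channel and batch dimensions. A minimal sketch, using a hypothetical 28×28 grayscale image:

julia> img = rand(Float32, 28, 28);      # hypothetical single grayscale image

julia> x = reshape(img, 28, 28, 1, 1);   # append channel and batch dimensions

julia> size(x)
(28, 28, 1, 1)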

julia> A = rand(Float32, 4,4,1,1)
4×4×1×1 Array{Float32, 4}:
[:, :, 1, 1] =
 0.645447  0.459486  0.427283   0.631972
 0.365584  0.821914  0.526104   0.853083
 0.61417   0.813586  0.0317676  0.0636574
 0.151384  0.824513  0.308647   0.260465

julia> c = Conv((2,2), 1 => 3, relu; bias = false)
Conv((2, 2), 1 => 3, relu, bias=false)  # 12 parameters

julia> c(A)
3×3×3×1 Array{Float32, 4}:
[:, :, 1, 1] =
 0.590182  0.166435  0.441105
 0.388874  0.207832  0.466098
 0.736932  0.206818  0.0

[:, :, 2, 1] =
 0.283607  0.12781  0.350034
 0.339908  0.0      0.0
 0.314565  0.0      0.120079

[:, :, 3, 1] =
 0.0       0.36956   0.0787188
 0.196548  0.715786  0.308148
 0.0       0.425667  0.055858
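
The output above is 3×3 because a 2×2 kernel slides over the 4×4 input with stride 1 and no padding: 4 − 2 + 1 = 3 positions in each direction. The stride and pad keyword arguments from the signature change this; a minimal sketch reusing A (the layer outputs are suppressed since the weights are random, only the sizes are shown):

julia> c2 = Conv((2,2), 1 => 3, relu; stride = 2, bias = false);  # stride 2 halves the spatial size

julia> size(c2(A))
(2, 2, 3, 1)

julia> c3 = Conv((2,2), 1 => 3, relu; pad = 1, bias = false);     # pad 1 surrounds the input with zeros

julia> size(c3(A))
(5, 5, 3, 1)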

Pooling Layers

  • maxpool(x, k::NTuple; pad=0, stride=k)
  • meanpool(x, k::NTuple; pad=0, stride=k)
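
Both act on the same kind of 4-dimensional array as Conv. As a minimal sketch, applying them to A from the convolution example with a 2×2 window (the default stride equals the window size, so the spatial dimensions are halved):

julia> size(maxpool(A, (2,2)))    # each entry is the maximum of a 2×2 block
(2, 2, 1, 1)

julia> size(meanpool(A, (2,2)))   # each entry is the mean of a 2×2 block
(2, 2, 1, 1)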

Environment

  • OS: Windows 10
  • Version: Julia 1.6.2, Flux 0.12.8