Intro to Julia

Julia Intro for QNumerics Summer School by Raye Kimmerer.

Other resources

Julia cheat sheet

Modern Julia Workflows

Performant Serial Julia

What scientists must know about hardware to write fast code

Linear Algebra

LinearAlgebra docs

using LinearAlgebra
using SparseArrays
using InteractiveUtils

A = rand(100,100)
100×100 Matrix{Float64}:
 0.689877   0.442898   0.337051    …  0.918166   0.931666   0.433926
 0.765819   0.904232   0.894542       0.152635   0.186052   0.0370883
 0.908652   0.586324   0.814448       0.28505    0.501902   0.986995
 0.628829   0.65289    0.960936       0.460261   0.76252    0.844714
 0.377951   0.182027   0.875066       0.936151   0.58245    0.125421
 0.93019    0.945633   0.519722    …  0.921993   0.604815   0.721738
 0.432319   0.359336   0.478482       0.569015   0.261859   0.177263
 0.208684   0.471807   0.861173       0.444832   0.218652   0.61552
 0.991222   0.248672   0.25356        0.978568   0.0260299  0.0491302
 0.697716   0.594812   0.681405       0.938027   0.738349   0.0676542
 ⋮                                 ⋱                        
 0.789347   0.413027   0.0628976      0.0384678  0.957868   0.759315
 0.337977   0.408965   0.648698       0.316743   0.303481   0.711143
 0.572835   0.755868   0.119204       0.316745   0.443184   0.429889
 0.255512   0.559662   0.257568       0.978429   0.216569   0.913379
 0.181709   0.0309443  0.00711086  …  0.985471   0.416796   0.420845
 0.534554   0.331107   0.0664834      0.933714   0.568428   0.068777
 0.0728202  0.889997   0.513793       0.276813   0.875836   0.979584
 0.983905   0.0848033  0.9529         0.145434   0.872911   0.80546
 0.374269   0.899537   0.757362       0.0246773  0.364742   0.0248242

Transpose and similar operations are non-allocating: instead of copying, they create a lazy wrapper around the original array. This avoids unnecessary allocation and gives the dispatch system more opportunities to specialize on array types.

B = transpose(A)
100×100 transpose(::Matrix{Float64}) with eltype Float64:
 0.689877   0.765819   0.908652  …  0.0728202  0.983905   0.374269
 0.442898   0.904232   0.586324     0.889997   0.0848033  0.899537
 0.337051   0.894542   0.814448     0.513793   0.9529     0.757362
 0.839381   0.778814   0.825507     0.0597193  0.37992    0.245574
 0.237912   0.233105   0.44183      0.852397   0.790491   0.632232
 0.0861786  0.635229   0.912296  …  0.578696   0.343724   0.420249
 0.0163067  0.0807648  0.753481     0.436245   0.863603   0.0628934
 0.875406   0.453432   0.753933     0.722678   0.539864   0.375855
 0.581084   0.216641   0.100293     0.48841    0.795991   0.258537
 0.0997436  0.278478   0.590193     0.931255   0.48242    0.494541
 ⋮                               ⋱                        
 0.026961   0.901935   0.388902     0.354084   0.541073   0.521093
 0.747279   0.216566   0.474953     0.0675228  0.760942   0.232325
 0.930372   0.309718   0.291894     0.934471   0.844834   0.405043
 0.907233   0.555821   0.760701     0.0429174  0.285344   0.828271
 0.482254   0.521344   0.68237   …  0.196166   0.741738   0.668487
 0.0578316  0.849002   0.899415     0.736748   0.113914   0.860721
 0.918166   0.152635   0.28505      0.276813   0.145434   0.0246773
 0.931666   0.186052   0.501902     0.875836   0.872911   0.364742
 0.433926   0.0370883  0.986995     0.979584   0.80546    0.0248242
C = copy(B)
100×100 Matrix{Float64}:
 0.689877   0.765819   0.908652  …  0.0728202  0.983905   0.374269
 0.442898   0.904232   0.586324     0.889997   0.0848033  0.899537
 0.337051   0.894542   0.814448     0.513793   0.9529     0.757362
 0.839381   0.778814   0.825507     0.0597193  0.37992    0.245574
 0.237912   0.233105   0.44183      0.852397   0.790491   0.632232
 0.0861786  0.635229   0.912296  …  0.578696   0.343724   0.420249
 0.0163067  0.0807648  0.753481     0.436245   0.863603   0.0628934
 0.875406   0.453432   0.753933     0.722678   0.539864   0.375855
 0.581084   0.216641   0.100293     0.48841    0.795991   0.258537
 0.0997436  0.278478   0.590193     0.931255   0.48242    0.494541
 ⋮                               ⋱                        
 0.026961   0.901935   0.388902     0.354084   0.541073   0.521093
 0.747279   0.216566   0.474953     0.0675228  0.760942   0.232325
 0.930372   0.309718   0.291894     0.934471   0.844834   0.405043
 0.907233   0.555821   0.760701     0.0429174  0.285344   0.828271
 0.482254   0.521344   0.68237   …  0.196166   0.741738   0.668487
 0.0578316  0.849002   0.899415     0.736748   0.113914   0.860721
 0.918166   0.152635   0.28505      0.276813   0.145434   0.0246773
 0.931666   0.186052   0.501902     0.875836   0.872911   0.364742
 0.433926   0.0370883  0.986995     0.979584   0.80546    0.0248242

When we copy the array it allocates new storage; the result is a plain Matrix and is no longer a wrapper around A.
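A minimal sketch of the wrapper/copy distinction, using a small illustrative matrix instead of the 100×100 one above: writes through the transpose wrapper are visible in the original array, while the copy has its own memory.

```julia
using LinearAlgebra

A = [1.0 2.0; 3.0 4.0]
B = transpose(A)   # lazy wrapper: shares A's memory
C = copy(B)        # materialized Matrix: independent storage

B[1, 2] = 99.0     # writing through the wrapper...
A[2, 1]            # ...updates the original: 99.0
C[1, 2]            # the copy is unaffected: still 3.0
```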

Diagonal(A)
100×100 LinearAlgebra.Diagonal{Float64, Vector{Float64}}:
 0.689877   ⋅         ⋅         ⋅        …   ⋅         ⋅         ⋅ 
  ⋅        0.904232   ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅        0.814448   ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅        0.170114      ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅        …   ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
 ⋮                                       ⋱                      
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅        …   ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅           0.276813   ⋅         ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅        0.872911   ⋅ 
  ⋅         ⋅         ⋅         ⋅            ⋅         ⋅        0.0248242
subtypes(LinearAlgebra.Factorization)
21-element Vector{Any}:
 LinearAlgebra.AdjointFactorization
 LinearAlgebra.BunchKaufman
 LinearAlgebra.Cholesky
 LinearAlgebra.CholeskyPivoted
 LinearAlgebra.Eigen
 LinearAlgebra.GeneralizedEigen
 LinearAlgebra.GeneralizedSVD
 LinearAlgebra.GeneralizedSchur
 LinearAlgebra.Hessenberg
 LinearAlgebra.LDLt
 ⋮
 LinearAlgebra.QR
 LinearAlgebra.QRCompactWY
 LinearAlgebra.QRPivoted
 LinearAlgebra.SVD
 LinearAlgebra.Schur
 LinearAlgebra.TransposeFactorization
 SparseArrays.CHOLMOD.Factor
 SparseArrays.SPQR.QRSparse
 SparseArrays.UMFPACK.UmfpackLU
issymmetric(A*A')
true

How do we communicate this to the dispatch system?

D = Symmetric(A*A')
100×100 LinearAlgebra.Symmetric{Float64, Matrix{Float64}}:
 36.9767  26.9739  28.2335  27.2151  …  25.9279  24.2946  27.1502  23.8749
 26.9739  33.7431  26.9764  24.8684     21.7457  22.2812  24.5202  25.6285
 28.2335  26.9764  34.3579  25.2187     22.0134  23.5851  25.069   23.3198
 27.2151  24.8684  25.2187  31.805      20.7167  22.7103  26.3492  22.4928
 28.2948  28.2536  27.6541  23.691      22.7564  23.1357  26.6059  24.2253
 27.4578  26.4579  25.6641  26.418   …  21.9073  21.7889  23.3432  23.7683
 26.4034  25.6938  25.3785  23.7964     22.1252  21.0764  24.0081  23.3219
 29.3402  28.0172  28.1854  25.3263     23.47    23.8132  26.6657  24.9575
 24.0686  23.7854  23.4124  21.2582     19.8021  19.134   21.8775  21.8944
 27.9855  25.7246  26.2124  25.5573     23.5242  22.8471  25.6106  22.403
  ⋮                                  ⋱                             
 28.8021  26.6635  27.3892  26.0997     24.5376  24.0559  28.2537  25.6707
 25.3282  25.2905  25.1838  23.1105     21.9496  22.1394  24.7355  22.7307
 24.0841  24.2343  24.7356  23.4886     20.1049  22.4576  23.7177  21.3846
 28.1161  26.0753  27.4174  25.4254     24.6418  24.8304  26.3101  23.2327
 26.7324  24.3896  24.7577  23.5723  …  21.986   22.5309  24.979   22.7549
 25.9279  21.7457  22.0134  20.7167     28.5783  19.845   22.5999  19.972
 24.2946  22.2812  23.5851  22.7103     19.845   29.4085  22.2209  21.0628
 27.1502  24.5202  25.069   26.3492     22.5999  22.2209  32.6584  22.9931
 23.8749  25.6285  23.3198  22.4928     19.972   21.0628  22.9931  29.5572

Now we can dispatch on the fact that this matrix is symmetric, allowing us to select more efficient algorithms for this case.
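One place this pays off, as a sketch: eigen on a Symmetric matrix dispatches to a specialized symmetric eigensolver, which is faster than the general routine and returns guaranteed-real eigenvalues rather than possibly-complex ones.

```julia
using LinearAlgebra

A = rand(100, 100)
S = Symmetric(A * A')   # record the symmetry in the type

F = eigen(S)
eltype(F.values)        # Float64 — real eigenvalues, not ComplexF64
```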

Symmetric(A)
100×100 LinearAlgebra.Symmetric{Float64, Matrix{Float64}}:
 0.689877   0.442898   0.337051  …  0.918166   0.931666   0.433926
 0.442898   0.904232   0.894542     0.152635   0.186052   0.0370883
 0.337051   0.894542   0.814448     0.28505    0.501902   0.986995
 0.839381   0.778814   0.825507     0.460261   0.76252    0.844714
 0.237912   0.233105   0.44183      0.936151   0.58245    0.125421
 0.0861786  0.635229   0.912296  …  0.921993   0.604815   0.721738
 0.0163067  0.0807648  0.753481     0.569015   0.261859   0.177263
 0.875406   0.453432   0.753933     0.444832   0.218652   0.61552
 0.581084   0.216641   0.100293     0.978568   0.0260299  0.0491302
 0.0997436  0.278478   0.590193     0.938027   0.738349   0.0676542
 ⋮                               ⋱                        
 0.026961   0.901935   0.388902     0.0384678  0.957868   0.759315
 0.747279   0.216566   0.474953     0.316743   0.303481   0.711143
 0.930372   0.309718   0.291894     0.316745   0.443184   0.429889
 0.907233   0.555821   0.760701     0.978429   0.216569   0.913379
 0.482254   0.521344   0.68237   …  0.985471   0.416796   0.420845
 0.0578316  0.849002   0.899415     0.933714   0.568428   0.068777
 0.918166   0.152635   0.28505      0.276813   0.875836   0.979584
 0.931666   0.186052   0.501902     0.875836   0.872911   0.80546
 0.433926   0.0370883  0.986995     0.979584   0.80546    0.0248242

This is 'casting' to a symmetric matrix: Symmetric will happily build a symmetric matrix out of a non-symmetric one, reading only one triangle of the underlying array (the upper triangle by default).
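To make the triangle-reading behavior concrete, a small sketch with a deliberately non-symmetric matrix:

```julia
using LinearAlgebra

A = [1.0 2.0; 3.0 4.0]   # not symmetric
S = Symmetric(A)          # reads the upper triangle: the 3.0 is ignored
S == [1.0 2.0; 2.0 4.0]   # true
L = Symmetric(A, :L)      # read the lower triangle instead
L == [1.0 3.0; 3.0 4.0]   # true

issymmetric(A)            # false
issymmetric(S)            # true
```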

Sparse Arrays

A = sprand(100,100,0.1)
100×100 SparseArrays.SparseMatrixCSC{Float64, Int64} with 961 stored entries:
⎡⠢⡈⠀⡀⠄⠃⠠⠠⠀⠀⠀⠄⠀⠔⠀⡂⠬⠀⠄⠀⣀⠄⠠⡠⠀⠁⠈⠀⠄⠀⢅⠠⠈⠁⠰⠀⠀⠀⠑⠀⎤
⎢⠁⠅⠒⢈⠄⠄⠈⡊⠀⠠⠀⠴⠀⠀⠈⡀⠀⠄⠀⠤⢀⠈⣀⡠⠁⠀⠀⠠⠀⠀⠢⠀⠠⠀⠈⠠⠀⡢⠔⠀⎥
⎢⠀⡱⠄⠥⢀⠈⠀⡀⠀⢤⠠⡄⡐⡨⡠⠄⠀⠂⠄⢁⡒⠄⠀⠰⠨⡄⣰⠀⠄⠅⠐⠀⡠⠂⢀⢀⠍⠀⠀⠀⎥
⎢⢐⠌⠅⠀⢅⠁⠈⠀⠢⠖⠍⠄⠡⠅⠄⠃⠐⠂⡅⠄⠥⡂⠀⠠⠠⢠⢔⡀⠀⠆⠀⠰⠀⠄⠠⠐⢤⠃⠀⠄⎥
⎢⠁⠇⠀⠃⢀⠀⠀⠁⠐⠄⠀⡠⢀⠄⠀⠦⢀⠁⠀⡀⠀⠀⠄⠀⠀⠀⠂⠇⠀⠀⠨⠂⠐⠄⠀⠀⠄⠄⠈⠐⎥
⎢⠀⠠⠀⠠⠀⠄⠀⠈⠀⠄⠈⢜⠠⠄⠀⠄⡄⠀⢀⡄⡠⠆⠁⢀⢤⠈⠈⢆⠄⠄⠄⠄⠀⠁⠈⢔⡥⠄⢀⠈⎥
⎢⢀⠀⠌⠃⠀⠄⠀⠅⠡⠤⠪⠢⠈⢀⠂⣀⠀⠍⠘⠵⠐⡠⠸⢤⠌⠀⠐⠀⠰⠀⡄⠠⠰⠀⢀⠈⢠⠠⠰⠀⎥
⎢⠀⠔⠆⡀⢀⡤⠠⠄⠠⠀⠁⡄⠐⠩⠀⡢⠀⠔⠀⠔⠀⠄⠠⠅⠐⠅⠔⠁⠀⠀⠀⢀⢂⠀⠥⠀⠠⡀⠈⠨⎥
⎢⠀⠄⠀⡀⠀⡑⠀⠡⠊⡥⠀⡊⠡⠀⠈⠈⠀⠆⠀⠅⠀⠠⡉⢀⠰⠢⢱⠤⠄⠧⠡⠀⠠⠀⡀⠀⠀⡠⠌⠀⎥
⎢⠁⠁⠀⠌⠠⠀⠜⠀⠀⠄⠰⡈⣀⠍⡀⠡⠠⠅⠀⡁⠠⠑⡀⠄⠘⠔⠠⠀⠡⠢⠨⢀⠠⠀⠌⠜⠀⠂⠴⠤⎥
⎢⠀⠁⠀⡄⠄⠀⠀⠀⠀⠢⠐⠀⠀⠀⠀⠒⢀⠄⠠⡀⠐⠁⠀⠘⠄⠀⠠⢀⠤⠠⡨⠈⠀⠐⠀⠆⠀⠀⡘⠀⎥
⎢⠀⠀⠀⠂⢆⠖⠐⠒⠠⠈⢀⠀⠁⠀⠂⠎⡀⠠⡀⠅⠁⠢⠀⠔⠐⡐⠀⠀⡌⠀⢐⠠⠐⠈⢐⠒⠐⠀⠈⠠⎥
⎢⢐⠆⠄⡀⢈⠃⠀⠀⡀⠂⠀⠀⠀⠒⠠⠒⢀⠢⠄⠐⠀⠘⠘⠀⠢⠂⠈⠈⠠⠀⢂⠀⠃⠁⢀⠀⠐⠄⠎⠀⎥
⎢⠀⠑⠀⠀⠂⢠⠆⠠⠄⠀⠀⡆⠥⠄⢔⢀⠀⠂⠈⠀⠈⠘⠀⠀⠉⠀⠀⡀⢅⢃⠠⡀⠠⠄⠀⠑⢓⢁⠊⠂⎥
⎢⠤⢃⠀⠀⠐⠐⠃⡀⠁⠔⠁⠀⠐⠀⠀⠢⠁⠂⠐⠁⡀⠐⠑⠀⠢⡃⡉⠀⠀⠀⠈⢘⠠⢀⠂⢈⡨⠙⠋⠀⎥
⎢⡃⠀⠈⠀⠈⢈⠀⠂⠁⠀⠀⠂⠀⠀⡠⠂⠄⠀⠐⠀⠁⠂⡐⠘⠀⢀⢡⠂⠀⠀⢀⣐⠑⠀⠰⠈⠡⠠⢀⡂⎥
⎢⠌⠔⠀⢁⠐⠠⠀⡠⠀⠒⠂⠀⠂⡂⠠⠀⠐⠐⠁⠀⢈⡁⠉⠁⠉⠀⡠⠀⢀⠂⠀⡁⢀⠁⠀⠂⠐⡀⡐⠂⎥
⎢⠐⠀⢀⠀⠡⠂⢂⡀⠀⠐⠂⢢⠀⠄⠀⡐⠀⠀⠒⠠⠈⠂⠀⠀⠈⠋⠰⠀⠐⠚⠈⠂⢈⠀⡸⢉⠤⠊⠀⠀⎥
⎢⠂⠂⠐⡠⢀⠀⠀⠈⠀⣁⠡⠁⠀⠀⠠⠦⠀⠀⠀⠑⡀⠠⠲⠀⠀⢀⠀⠀⠀⠀⢐⠂⠁⠀⠎⠀⢬⠀⠀⠂⎥
⎣⠀⠀⠁⠠⠐⠀⠀⠀⠂⠅⢰⠄⠀⠀⡄⠐⠂⠁⠐⡀⠠⠀⠂⠐⠌⠁⢨⠀⢀⢈⠀⠄⠀⠀⠔⠘⡐⠘⠐⡐⎦
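The sprand call above draws random entries with a given density; sparse matrices can also be built explicitly from (row, column, value) triplets. A minimal sketch (names and values are illustrative):

```julia
using SparseArrays

rows = [1, 4, 3]
cols = [2, 5, 1]
vals = [0.5, 1.2, -3.0]

S = sparse(rows, cols, vals, 5, 5)   # 5×5 matrix in CSC format
nnz(S)                               # 3 stored entries out of 25
S[4, 5]                              # 1.2; unstored entries read as 0.0
```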