Remove dependencies #51

Merged: 10 commits, merged on Jul 20, 2019
10 changes: 4 additions & 6 deletions Project.toml
@@ -6,26 +6,24 @@ version = "0.4.1"
[deps]
BandedMatrices = "aae01518-5342-5314-be14-df237901396f"
BlockBandedMatrices = "ffab5731-97b5-5995-9138-79e8c1846df0"
Cassette = "7057c7e9-c182-5462-911a-8362d720325c"
DiffEqDiffTools = "01453d9d-ee7c-5054-8395-0335cb756afa"
ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
LightGraphs = "093fc24a-ae57-5d10-9952-331d41423f4d"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Requires = "ae029012-a4dd-5104-9daa-d747884805df"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"
VertexSafeGraphs = "19fa3120-7c27-5ec5-8db8-b0b0aa330d6f"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

[compat]
julia = "1"

[extras]
DiffEqDiffTools = "01453d9d-ee7c-5054-8395-0335cb756afa"
Cassette = "7057c7e9-c182-5462-911a-8362d720325c"
IterativeSolvers = "42fd0dbc-a981-5370-80f2-aaf504508153"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
SafeTestsets = "1bc83da4-3b8d-516f-aca4-4fe02f6d838f"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

[targets]
test = ["Test", "DiffEqDiffTools", "IterativeSolvers", "Random"]
test = ["Test", "Cassette", "IterativeSolvers", "Random", "SafeTestsets", "Zygote"]
125 changes: 67 additions & 58 deletions README.md
@@ -6,10 +6,12 @@
This package is for exploiting sparsity in Jacobians and Hessians to accelerate
computations. Matrix-free Jacobian-vector product and Hessian-vector product
operators are provided that are compatible with AbstractMatrix-based libraries
like IterativeSolvers.jl for easy and efficient Newton-Krylov implementation.
Automatic and numerical differentiation are utilized and optional. In addition,
the ability to automatically detect the sparsity of a function, perform matrix
coloring, and utilize coloring in Jacobian and Hessian construction is provided.
like IterativeSolvers.jl for easy and efficient Newton-Krylov implementation. It is
possible to perform matrix coloring and to utilize the coloring in Jacobian and Hessian
construction.

Optionally, automatic and numerical differentiation can be utilized, and the ability to
automatically detect the sparsity of a function is provided.

## Example

@@ -31,11 +33,13 @@ end
For this function, we know that the sparsity pattern of the Jacobian is a
`Tridiagonal` matrix. However, if we didn't know the sparsity pattern for
the Jacobian, we could use the `sparsity!` function to automatically
detect the sparsity pattern. We declare that it outputs a length 30 vector
and takes in a length 30 vector, and it spits out a `Sparsity` object
which we can turn into a `SparseMatrixCSC`:
detect the sparsity pattern. This function is only available if you
load Cassette.jl as well. We declare that the function `f` outputs a
vector of length 30 and takes in a vector of length 30, and `sparsity!` spits
out a `Sparsity` object which we can turn into a `SparseMatrixCSC`:

```julia
using Cassette
sparsity_pattern = sparsity!(f,output,input)
jac = Float64.(sparse(sparsity_pattern))
```
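For a fully self-contained illustration of the steps above, here is a hedged sketch; the function `f` below is a stand-in for the tridiagonal example, and it assumes SparseDiffTools.jl and Cassette.jl are installed:

```julia
using SparseDiffTools, Cassette, SparseArrays

# Hypothetical in-place test function with tridiagonal coupling,
# matching the f(dx, x) calling convention that sparsity! expects.
function f(dx, x)
    for i in 2:length(x)-1
        dx[i] = x[i-1] - 2x[i] + x[i+1]
    end
    dx[1]   = -2x[1] + x[2]
    dx[end] = x[end-1] - 2x[end]
    nothing
end

input  = rand(30)
output = similar(input)

# Trace f to discover where the Jacobian can be non-zero.
sparsity_pattern = sparsity!(f, output, input)   # returns a Sparsity object
jac = Float64.(sparse(sparsity_pattern))         # a tridiagonal SparseMatrixCSC
```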
@@ -88,36 +92,6 @@ gmres!(res,J,v)

## Documentation

### Automated Sparsity Detection

Automated sparsity detection is provided by the `sparsity!` function whose
syntax is:

```julia
sparsity!(f, Y, X, args...; sparsity=Sparsity(length(X), length(Y)), verbose=true)
```

The arguments are:

- `f`: the function
- `Y`: the output array
- `X`: the input array
- `args`: trailing arguments to `f`. They are considered subject to change, unless wrapped as `Fixed(arg)`
- `S`: (optional) the sparsity pattern
- `verbose`: (optional) whether to describe the paths taken by the sparsity detection.

The function `f` is assumed to take arguments of the form `f(dx,x,args...)`.
`sparsity!` returns a `Sparsity` object which describes where the non-zeros
of the Jacobian occur. `sparse(::Sparsity)` transforms the pattern into
a sparse matrix.

This function utilizes non-standard interpretation, which we denote
combinatoric concolic analysis, to directly realize the sparsity pattern from the program's AST. It requires that the function `f` is a Julia function. It does not
work numerically, meaning that it is not prone to floating point error or
cancellation. It allows for branching and will automatically check all of the
branches. However, a while loop of indeterminate length which is dependent
on the input argument is not allowed.

### Matrix Coloring

Matrix coloring allows you to reduce the number of times finite differencing
@@ -230,29 +204,8 @@ autonum_hesvec!(du,f,x,v,
cache3 = ForwardDiff.Dual{DeivVecTag}.(x, v))

autonum_hesvec(f,x,v)


numback_hesvec!(du,f,x,v,
cache1 = similar(v),
cache2 = similar(v))

numback_hesvec(f,x,v)

# Currently errors! See https://github.com/FluxML/Zygote.jl/issues/241
autoback_hesvec!(du,f,x,v,
cache2 = ForwardDiff.Dual{DeivVecTag}.(x, v),
cache3 = ForwardDiff.Dual{DeivVecTag}.(x, v))

autoback_hesvec(f,x,v)
```

`numauto` and `autonum` both mix numerical and automatic differentiation, with
the former almost always being more efficient and thus recommended. `numback` and
`autoback` methods are numerical/ForwardDiff over reverse mode automatic differentiation
respectively, where the reverse-mode AD is provided by Zygote.jl. Currently these methods
are not competitive against `numauto`, but as Zygote.jl gets optimized these will likely
be the fastest.

In addition,
the following forms allow you to provide a gradient function `g(dx,x)` or `dx=g(x)`
respectively:
@@ -271,6 +224,32 @@ auto_hesvecgrad!(du,g,x,v,
auto_hesvecgrad(g,x,v)
```

The `numauto` and `autonum` methods both mix numerical and automatic differentiation, with
the former almost always being more efficient and thus recommended.

Optionally, if you load Zygote.jl, the following `numback`
and `autoback` methods are available and allow numerical/ForwardDiff over reverse mode
automatic differentiation respectively, where the reverse-mode AD is provided by Zygote.jl.
Currently these methods are not competitive against `numauto`, but as Zygote.jl gets
optimized these will likely be the fastest.

```julia
using Zygote # Required

numback_hesvec!(du,f,x,v,
cache1 = similar(v),
cache2 = similar(v))

numback_hesvec(f,x,v)

# Currently errors! See https://github.com/FluxML/Zygote.jl/issues/241
autoback_hesvec!(du,f,x,v,
cache2 = ForwardDiff.Dual{DeivVecTag}.(x, v),
cache3 = ForwardDiff.Dual{DeivVecTag}.(x, v))

autoback_hesvec(f,x,v)
```

#### J*v and H*v Operators

The following produce matrix-free operators which are used for calculating
@@ -287,3 +266,33 @@ These all have the same interface, where `J*v` utilizes the out-of-place
Jacobian-vector or Hessian-vector function, whereas `mul!(res,J,v)` utilizes
the appropriate in-place versions. To update the location of differentiation
in the operator, simply mutate the vector `u`: `J.u .= ...`.
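
As a hedged usage sketch of this operator interface (the function `f`, the dimensions, and the values are made up for illustration):

```julia
using SparseDiffTools, LinearAlgebra

# Hypothetical in-place function of the form f(dx, x).
f(dx, x) = (dx .= x .^ 2; nothing)

u   = rand(10)      # point of differentiation
v   = rand(10)      # direction vector
res = similar(v)

J = JacVec(f, u)    # lazy J*v operator; no Jacobian matrix is materialized
w = J * v           # out-of-place Jacobian-vector product
mul!(res, J, v)     # in-place version

J.u .= rand(10)     # move the point of differentiation by mutating u
```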

### Automated Sparsity Detection

Automated sparsity detection is provided by the `sparsity!` function. This functionality
is loaded via Requires.jl, so it requires `using Cassette` first. The syntax is:

```julia
sparsity!(f, Y, X, args...; sparsity=Sparsity(length(X), length(Y)), verbose=true)
```

The arguments are:

- `f`: the function
- `Y`: the output array
- `X`: the input array
- `args`: trailing arguments to `f`. They are considered subject to change, unless wrapped as `Fixed(arg)`
- `S`: (optional) the sparsity pattern
- `verbose`: (optional) whether to describe the paths taken by the sparsity detection.

The function `f` is assumed to take arguments of the form `f(dx,x,args...)`.
`sparsity!` returns a `Sparsity` object which describes where the non-zeros
of the Jacobian occur. `sparse(::Sparsity)` transforms the pattern into
a sparse matrix.

This function utilizes non-standard interpretation, which we denote
combinatoric concolic analysis, to directly realize the sparsity pattern from the program's AST. It requires that the function `f` is a Julia function. It does not
work numerically, meaning that it is not prone to floating point error or
cancellation. It allows for branching and will automatically check all of the
branches. However, a while loop of indeterminate length which is dependent
on the input argument is not allowed.
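
A minimal sketch of the `Fixed` wrapper for trailing arguments described above, assuming Cassette.jl is loaded; the function `g` and parameter vector `p` here are hypothetical:

```julia
using SparseDiffTools, Cassette, SparseArrays

# Hypothetical function of the form g(dx, x, args...) with a parameter p.
g(dx, x, p) = (dx .= p[1] .* x; nothing)

p      = [2.0]
input  = rand(30)
output = similar(input)

# Wrapping p as Fixed(p) declares it constant, so the tracer does not
# treat it as an input that could change the sparsity pattern.
sparsity_pattern = sparsity!(g, output, input, Fixed(p))
jac = Float64.(sparse(sparsity_pattern))   # diagonal pattern in this sketch
```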
2 changes: 1 addition & 1 deletion appveyor.yml
@@ -1,6 +1,6 @@
environment:
matrix:
- julia_version: 1
- julia_version: 1.2
- julia_version: nightly

platform:
47 changes: 34 additions & 13 deletions src/SparseDiffTools.jl
@@ -1,16 +1,19 @@
module SparseDiffTools

using SparseArrays, LinearAlgebra, BandedMatrices, BlockBandedMatrices,
LightGraphs, VertexSafeGraphs, DiffEqDiffTools, ForwardDiff, Zygote,
SparseArrays
using BlockBandedMatrices:blocksize,nblocks
using BandedMatrices
using BlockBandedMatrices
using DiffEqDiffTools
using ForwardDiff
using LightGraphs
using Requires
using VertexSafeGraphs

using LinearAlgebra
using SparseArrays

using BlockBandedMatrices: blocksize, nblocks
using ForwardDiff: Dual, jacobian, partials, DEFAULT_CHUNK_THRESHOLD

using Requires
using Cassette
import Cassette: tag, untag, Tagged, metadata, hasmetadata, istagged, canrecurse
import Cassette: tagged_new_tuple, ContextTagged, BindingMeta, DisableHooks, nametype
import Core: SSAValue

export contract_color,
greedy_d1,
@@ -27,10 +30,7 @@ export contract_color,
autonum_hesvec,autonum_hesvec!,
num_hesvecgrad,num_hesvecgrad!,
auto_hesvecgrad,auto_hesvecgrad!,
numback_hesvec,numback_hesvec!,
autoback_hesvec,autoback_hesvec!,
JacVec,HesVec,HesVecGrad,
Sparsity, sparsity!, hsparsity
JacVec,HesVec,HesVecGrad


include("coloring/high_level.jl")
@@ -41,8 +41,17 @@ include("coloring/greedy_star2_coloring.jl")
include("coloring/matrix2graph.jl")
include("differentiation/compute_jacobian_ad.jl")
include("differentiation/jaches_products.jl")

function __init__()
@require Cassette="7057c7e9-c182-5462-911a-8362d720325c" begin
using .Cassette
using .Cassette: tag, untag, Tagged, metadata, hasmetadata, istagged, canrecurse
using .Cassette: tagged_new_tuple, ContextTagged, BindingMeta, DisableHooks, nametype

using Core: SSAValue

export Sparsity, hsparsity, sparsity!

include("program_sparsity/program_sparsity.jl")
include("program_sparsity/sparsity_tracker.jl")
include("program_sparsity/path.jl")
@@ -51,6 +60,18 @@ function __init__()
include("program_sparsity/linearity.jl")
include("program_sparsity/hessian.jl")
include("program_sparsity/blas.jl")

@require SpecialFunctions="276daf66-3868-5448-9aa4-cd146d93841b" begin
using .SpecialFunctions

include("program_sparsity/linearity_special.jl")
end
end

@require Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f" begin
export numback_hesvec, numback_hesvec!, autoback_hesvec, autoback_hesvec!

include("differentiation/jaches_products_zygote.jl")
end
end
