Merge pull request #50 from JuliaConstraints/newdocs
Improves doc for PerfChecker
Showing 8 changed files with 316 additions and 27 deletions.
# API

Here's the API for PerfChecker.jl

```@autodocs
Modules=[PerfChecker]
```
# BenchmarkTools Extension

A benchmarking extension, based on `BenchmarkTools.jl`, has been interfaced with `PerfChecker.jl`.
This section will provide some usage examples, documentation, and links to related notebooks.

## Usage

Like all other extensions, the `BenchmarkTools` extension can be used in the following way:

```julia
julia> using BenchmarkTools, PerfChecker

julia> @check :benchmark Dict(:option1 => "value1", :option2 => "value2", :PATH => @__DIR__) begin
           # the preliminary code goes here
           using Example
       end begin
           # the code you want to be benchmarked
           Example.domath(10) # returns x + 5, 15 in this case.
       end
```
## Options

Options specific to this backend, with their default values, are defined as:

```julia
:threads => 1
:track => "none"
:samples => BenchmarkTools.DEFAULT_PARAMETERS.samples
:seconds => BenchmarkTools.DEFAULT_PARAMETERS.seconds
:evals => BenchmarkTools.DEFAULT_PARAMETERS.evals
:overhead => BenchmarkTools.DEFAULT_PARAMETERS.overhead
:gctrial => BenchmarkTools.DEFAULT_PARAMETERS.gctrial
:gcsample => BenchmarkTools.DEFAULT_PARAMETERS.gcsample
:time_tolerance => BenchmarkTools.DEFAULT_PARAMETERS.time_tolerance
:memory_tolerance => BenchmarkTools.DEFAULT_PARAMETERS.memory_tolerance
```
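
For instance, a config dictionary overriding a few of these defaults might look like the following (a minimal sketch; the specific values are purely illustrative):

```julia
using BenchmarkTools, PerfChecker

config = Dict(
    :threads => 2,      # start the benchmarking process with 2 threads
    :samples => 500,    # cap the number of collected samples
    :seconds => 10,     # allow up to 10 seconds of benchmarking
    :PATH => @__DIR__,  # environment path, as in the usage example above
)

# `config` is then passed as the second argument to `@check :benchmark ...`
```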
# Chairmarks Extension

A benchmarking extension, based on `Chairmarks.jl`, has been interfaced with `PerfChecker.jl`.
This section will provide some usage examples, documentation, and links to related notebooks.

## Usage

Like all other extensions, the `Chairmarks` extension can be used in the following way:

```julia
julia> using Chairmarks, PerfChecker

julia> @check :chairmark Dict(:option1 => "value1", :option2 => "value2", :PATH => @__DIR__) begin
           # the preliminary code goes here
           using Example
       end begin
           # the code you want to be benchmarked
           Example.domath(10) # returns x + 5, 15 in this case.
       end
```
## Options

Options specific to this backend, with their default values, are defined as:

```julia
:threads => 1
:track => "none"
:evals => nothing
:seconds => 1
:samples => nothing
:gc => true
```
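
As with the BenchmarkTools backend, these defaults can be overridden through the config dictionary (a sketch; the values are illustrative only):

```julia
using Chairmarks, PerfChecker

config = Dict(
    :evals => 100,      # fix the number of evaluations per sample
    :seconds => 5,      # run for up to 5 seconds
    :gc => false,       # passed through to Chairmarks' `gc` keyword
    :PATH => @__DIR__,
)

# `config` is then passed as the second argument to `@check :chairmark ...`
```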
# PerfChecker.jl

PerfChecker.jl is a package designed for package authors to easily performance test their packages.
To achieve that, it provides the following features:

- The main macro `@check`, which provides an easy-to-use interface over various benchmarking backends, configurable via a dictionary.
- (WIP) A CI for reproducible performance testing.
- Visualization of different metrics from `@check` using Makie.jl.

## Usage

The primary usage of PerfChecker.jl looks like this:

```julia
using PerfChecker
# optionally load a custom backend such as BenchmarkTools, Chairmarks, etc.

config = Dict(:option1 => "value1", :option2 => :value2)

results = @check :name_of_backend config begin
    # preparatory code goes here
end begin
    # the code block to be performance tested goes here
end

# Visualization of the results
using Makie
checkres_to_scatterlines(results)
```

The config dictionary can take many options, depending on the backend.

Some of the commonly used options are (see the example after this list):
- `:PATH` => The path to the default Julia environment used when creating a new process.
- `:pkgs` => A list of package versions to test performance for. It is defined as the `Tuple` `(name::String, option::Symbol, versions::Vector{VersionNumber}, last_or_first::Bool)`, given as follows:
  - `name` is the name of the package.
  - `option` is one of the 5 symbols:
    - `:patches`: last patch or first patch of a version
    - `:breaking`: last breaking or next breaking version
    - `:major`: previous or next major version
    - `:minor`: previous or next minor version
    - `:custom`: custom version numbers (any boolean value works for `last_or_first` in this case, as it doesn't matter)
  - `versions`: the version input for the provided `option`
  - `last_or_first`: a `Bool` selecting between the two behaviors of the provided `option`
- `:tags` => A list of tags (a vector of symbols) to easily tag performance tests.
- `:devops` => A custom input to `Pkg.develop`. Intended to test the performance of a local development branch of a package against previous versions. Often used as simply as `:devops => "MyPackageName"`.
- `:threads` => An integer to select the number of threads to start Julia with.
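
For example, a config dictionary combining several of these options might look like the following (a minimal sketch; the package name and version numbers are purely illustrative):

```julia
using PerfChecker

config = Dict(
    :PATH => @__DIR__,
    # test the last patch of each listed version of the (hypothetical) package MyPackage
    :pkgs => ("MyPackage", :patches, [v"0.1.0", v"0.2.0"], true),
    :tags => [:regression, :ci],
    :threads => 2,
)
```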

Check out the documentation of the other backends for more options and their default values.
# Extending PerfChecker

PerfChecker was built as an easy-to-extend interface. A good reference example for this is the Chairmarks extension.

Extending PerfChecker works via the package extensions feature in Julia. There are 6 essential functions that need to be extended inside the package extension.
Each extension has a keyword symbol for it, which users can input to use the extension.

## The Default Options

Method to be overloaded: `PerfChecker.default_options(::Val{:myperfextension})::Dict`

PerfChecker works via a config dictionary. Users can populate this dictionary with options and provide it to the main `@check` macro to customize the performance testing to their liking.

For Chairmarks.jl, it looks like this:
```julia
function PerfChecker.default_options(::Val{:chairmark})
    return Dict(
        :threads => 1,
        :track => "none",
        :evals => nothing,
        :seconds => 1,
        :samples => nothing,
        :gc => true
    )
end
```

## Package Initialization

Method to be overloaded: `PerfChecker.initpkgs(::Val{:myperfextension})::Expr`

This method simply loads the main package(s) associated with the custom backend. In the case of Chairmarks.jl, it looks like this:
```julia
PerfChecker.initpkgs(::Val{:chairmark}) = quote
    using Chairmarks
end
```

## Preparatory Code

Method to be overloaded: `PerfChecker.prep(config_dict::Dict, block::Expr, ::Val{:myperfextension})::Expr`

This method exists to run arbitrary "preparatory" code (represented by the `block` parameter here) before running the code to be performance tested.

The output from here is stored inside the `:prep_result` key of the configuration dictionary.

Example for Chairmarks.jl:
```julia
PerfChecker.prep(::Dict, block::Expr, ::Val{:chairmark}) = quote
    $block
    nothing
end
```

This just runs the code in `block` provided by the user.

## Main Code to be Performance Tested

Method to be overloaded: `PerfChecker.check(config_dict::Dict, block::Expr, ::Val{:myperfextension})::Expr`

This runs the appropriate code to perform the performance testing on the user's code. For Chairmarks.jl, it looks like this:
```julia
function PerfChecker.check(d::Dict, block::Expr, ::Val{:chairmark})
    quote
        d = $d
        return @be $block evals=d[:evals] seconds=d[:seconds] samples=d[:samples] gc=d[:gc]
    end
end
```

The output from here is stored inside the `:check_result` key of the configuration dictionary.

## Post Performance Testing Code

Method to be overloaded: `PerfChecker.post(config_dict::Dict, ::Val{:myperfextension})`

The code to be run after the performance testing is done. The output from here is converted into a table via the overloaded `to_table` method.

In the case of Chairmarks.jl:
```julia
PerfChecker.post(d::Dict, ::Val{:chairmark}) = d[:check_result]
```

## Converting the Result into a Table

Method to be overloaded: `PerfChecker.to_table`

Converts the output from the `post` function into an appropriate table.

In the case of Chairmarks.jl:
```julia
function PerfChecker.to_table(chair::Chairmarks.Benchmark)
    l = length(chair.samples)
    times = [chair.samples[i].time for i in 1:l]
    gctimes = [chair.samples[i].gc_fraction for i in 1:l]
    bytes = [chair.samples[i].bytes for i in 1:l]
    allocs = [chair.samples[i].allocs for i in 1:l]
    return Table(times = times, gctimes = gctimes, bytes = bytes, allocs = allocs)
end
```

---

There are also other functions that can be overloaded, mostly related to plotting, but these are the basic functions needed to extend PerfChecker for a custom backend.
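
Putting the pieces together, a skeleton extension for a hypothetical backend keyed as `:mybackend` could look roughly like this. It is only a sketch: `MyBackend`, `run_my_benchmark`, and `MyBackend.Result` are placeholders standing in for whatever the actual backend provides.

```julia
module MyBackendExt

using PerfChecker
using MyBackend  # hypothetical benchmarking package

# 1. Default options exposed through the user's config dictionary
PerfChecker.default_options(::Val{:mybackend}) = Dict(:threads => 1, :track => "none")

# 2. Load the backend package in the spawned process
PerfChecker.initpkgs(::Val{:mybackend}) = quote
    using MyBackend
end

# 3. Run the user's preparatory block as-is
PerfChecker.prep(::Dict, block::Expr, ::Val{:mybackend}) = quote
    $block
    nothing
end

# 4. Wrap the user's code with the backend's benchmarking call
function PerfChecker.check(d::Dict, block::Expr, ::Val{:mybackend})
    quote
        d = $d
        return run_my_benchmark(() -> $block)  # placeholder for the backend's API
    end
end

# 5. Pass the raw result along to `to_table`
PerfChecker.post(d::Dict, ::Val{:mybackend}) = d[:check_result]

# 6. Convert the raw result into a table with the expected columns
function PerfChecker.to_table(res::MyBackend.Result)  # hypothetical result type
    # same `Table` constructor as in the Chairmarks example above
    return Table(times = res.times, gctimes = res.gctimes,
                 bytes = res.bytes, allocs = res.allocs)
end

end
```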