Vendor Update Go Libs (#13166)
* update github.com/alecthomas/chroma v0.8.0 -> v0.8.1
* github.com/blevesearch/bleve v1.0.10 -> v1.0.12
* editorconfig-core-go v2.1.1 -> v2.3.7
* github.com/gliderlabs/ssh v0.2.2 -> v0.3.1
* migrate editorconfig.ParseBytes to Parse
* github.com/shurcooL/vfsgen to 0d455de96546
* github.com/go-git/go-git/v5 v5.1.0 -> v5.2.0
* github.com/google/uuid v1.1.1 -> v1.1.2
* github.com/huandu/xstrings v1.3.0 -> v1.3.2
* github.com/klauspost/compress v1.10.11 -> v1.11.1
* github.com/markbates/goth v1.61.2 -> v1.65.0
* github.com/mattn/go-sqlite3 v1.14.0 -> v1.14.4
* github.com/mholt/archiver v3.3.0 -> v3.3.2
* github.com/microcosm-cc/bluemonday 4f7140c49acb -> v1.0.4
* github.com/minio/minio-go v7.0.4 -> v7.0.5
* github.com/olivere/elastic v7.0.9 -> v7.0.20
* github.com/urfave/cli v1.20.0 -> v1.22.4
* github.com/prometheus/client_golang v1.1.0 -> v1.8.0
* github.com/xanzy/go-gitlab v0.37.0 -> v0.38.1
* mvdan.cc/xurls v2.1.0 -> v2.2.0

Co-authored-by: Lauris BH <lauris@nix.lv>
parent 91f2afdb54
commit 12a1f914f4
656 changed files with 52967 additions and 25229 deletions
51 vendor/github.com/klauspost/compress/zstd/README.md (generated) (vendored)
@ -5,7 +5,6 @@ It offers a very wide range of compression / speed trade-off, while being backed

A high performance compression algorithm is implemented. For now focused on speed.

This package provides [compression](#Compressor) to and [decompression](#Decompressor) of Zstandard content.
Note that custom dictionaries are only supported for decompression.

This package is pure Go and without use of "unsafe".
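For orientation on the API this README describes, here is a minimal round-trip sketch using the package's in-memory `EncodeAll`/`DecodeAll` helpers; the payload and error handling are illustrative, not taken from the README:

```Go
package main

import (
	"fmt"

	"github.com/klauspost/compress/zstd"
)

func main() {
	// A nil reader/writer is fine when only the in-memory
	// EncodeAll/DecodeAll helpers are used.
	enc, err := zstd.NewWriter(nil)
	if err != nil {
		panic(err)
	}
	defer enc.Close()

	dec, err := zstd.NewReader(nil)
	if err != nil {
		panic(err)
	}
	defer dec.Close()

	src := []byte("an illustrative payload, an illustrative payload")

	// Compress; passing nil lets EncodeAll allocate, passing dst[:0] would reuse a buffer.
	compressed := enc.EncodeAll(src, nil)

	// Decompress and check the round trip.
	out, err := dec.DecodeAll(compressed, nil)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%d -> %d -> %d bytes\n", len(src), len(compressed), len(out))
}
```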
@ -232,41 +231,6 @@ nyc-taxi-data-10M.csv gzstd 1 3325605752 928656485 23876 132.83
nyc-taxi-data-10M.csv gzkp 1 3325605752 924718719 16388 193.53
```

### Converters

As part of the development process a *Snappy* -> *Zstandard* converter was also built.

This can convert a *framed* [Snappy Stream](https://godoc.org/github.com/golang/snappy#Writer) to a zstd stream.
Note that a single block is not framed.

Conversion is done by converting the stream directly from Snappy without intermediate full decoding.
Therefore the compression ratio is much less than what can be done by a full decompression
and compression, and a faulty Snappy stream may lead to a faulty Zstandard stream without
any errors being generated.
No CRC value is being generated and not all CRC values of the Snappy stream are checked.
However, it provides really fast re-compression of Snappy streams.

```
BenchmarkSnappy_ConvertSilesia-8 1 1156001600 ns/op 183.35 MB/s
Snappy len 103008711 -> zstd len 82687318

BenchmarkSnappy_Enwik9-8 1 6472998400 ns/op 154.49 MB/s
Snappy len 508028601 -> zstd len 390921079
```

```Go
// Re-compress a framed Snappy stream (input) directly to a zstd stream (output).
s := zstd.SnappyConverter{}
n, err := s.Convert(input, output)
if err != nil {
	fmt.Println("Conversion failed:", err)
} else {
	fmt.Println("Re-compressed stream to", n, "bytes")
}
```

The converter `s` can be reused to avoid allocations, even after errors.

## Decompressor

Status: STABLE - there may still be subtle bugs, but a wide variety of content has been tested.

@ -337,6 +301,21 @@ A re-used Decoder will still contain the dictionaries registered.

When registering multiple dictionaries with the same ID, the last one will be used.

It is possible to use dictionaries when compressing data.

To enable a dictionary use `WithEncoderDict(dict []byte)`. Here only one dictionary will be used
and it will likely be used even if it doesn't improve compression.

The same dictionary must then be used to decompress the content.

For any real gains, the dictionary should be built with similar data.
If an unsuitable dictionary is used the output may be slightly larger than using no dictionary.
Use the [zstd commandline tool](https://github.com/facebook/zstd/releases) to build a dictionary from sample data.
For information see [zstd dictionary information](https://github.com/facebook/zstd#the-case-for-small-data-compression).

For now there is a fixed startup performance penalty for compressing content with dictionaries.
This will likely be improved over time. Just be aware to test performance when implementing.
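To illustrate the options described above, here is a sketch that wires one dictionary into both sides; `dictionary.bin` is a placeholder for a dictionary trained with the zstd command line tool, and `WithDecoderDicts` is assumed to be the matching decoder-side option for registering it:

```Go
package main

import (
	"fmt"
	"io/ioutil"

	"github.com/klauspost/compress/zstd"
)

func main() {
	// "dictionary.bin" is a placeholder for a dictionary built from sample data.
	dict, err := ioutil.ReadFile("dictionary.bin")
	if err != nil {
		panic(err)
	}

	// Encoder side: a single dictionary via WithEncoderDict, as described above.
	enc, err := zstd.NewWriter(nil, zstd.WithEncoderDict(dict))
	if err != nil {
		panic(err)
	}
	defer enc.Close()

	// Decoder side: register the dictionary so frames referencing its ID
	// can be decompressed (WithDecoderDicts is assumed here).
	dec, err := zstd.NewReader(nil, zstd.WithDecoderDicts(dict))
	if err != nil {
		panic(err)
	}
	defer dec.Close()

	compressed := enc.EncodeAll([]byte("payload similar to the training samples"), nil)
	plain, err := dec.DecodeAll(compressed, nil)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(plain))
}
```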
### Allocation-less operation

The decoder has been designed to operate without allocations after a warmup.
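A rough sketch of what that reuse can look like: one shared `Decoder` and a destination buffer carried across `DecodeAll` calls, so later frames append into existing capacity instead of allocating. The frames and payloads are illustrative only:

```Go
package main

import (
	"fmt"

	"github.com/klauspost/compress/zstd"
)

func main() {
	enc, err := zstd.NewWriter(nil)
	if err != nil {
		panic(err)
	}
	dec, err := zstd.NewReader(nil)
	if err != nil {
		panic(err)
	}
	defer dec.Close()

	// Two illustrative frames to decode with the same Decoder.
	frames := [][]byte{
		enc.EncodeAll([]byte("first payload"), nil),
		enc.EncodeAll([]byte("second payload"), nil),
	}
	enc.Close()

	// DecodeAll appends to the supplied dst, so passing buf[:0] lets later
	// frames reuse the capacity grown by earlier ones.
	var buf []byte
	for _, frame := range frames {
		out, err := dec.DecodeAll(frame, buf[:0])
		if err != nil {
			panic(err)
		}
		fmt.Println("decoded:", string(out))
		buf = out
	}
}
```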