Commit a4dba69

committed
A new stack-based vector
1 parent 1aa4590 commit a4dba69

File tree

1 file changed: +185 -0 lines changed


text/2978-stack_vec.md

Lines changed: 185 additions & 0 deletions
@@ -0,0 +1,185 @@
- Feature Name: `stack_vec`
- Start Date: 2020-09-27
- RFC PR: [rust-lang/rfcs#0000](https://github.com/rust-lang/rfcs/pull/0000)
- Rust Issue: [rust-lang/rust#0000](https://github.com/rust-lang/rust/issues/0000)

# Summary
[summary]: #summary

This RFC proposes a new "growable" vector named `StackVec` that manages stack memory and serves as an alternative to the built-in structure that handles heap-allocated memory, `alloc::vec::Vec<T>`.

# Motivation
[motivation]: #motivation

`core::collections::StackVec<T>` has several use cases and is important enough to deserve a place in the standard library.

### Unification

There are many crates on the subject that try to do roughly the same thing; a centralized implementation would stop the current fragmentation.

### Optimization

Stack-based allocation is generally faster than heap-based allocation and can be used as an optimization in places that would otherwise have to call an allocator. Some resource-constrained embedded devices can also benefit from it.

### Building block

Just like `Vec`, `StackVec` is a primitive vector that higher-level structures can use as a building block, for example a stack-based matrix or binary heap.

### Useful in the real world

`arrayvec` is one of the most downloaded crates on `crates.io` and is used by thousands of projects, including rustc itself.

### Compiler internals

Unstable features can be used internally to make operations more efficient, e.g., `specialization`.
# Guide-level explanation
37+
[guide-level-explanation]: #guide-level-explanation
38+
39+
Instead of relying on a heap-allocator, stack-based memory area is added and removed on-demand in a last-in-first-out (LIFO) order according to the calling workflow of a program. Let's illustrate an imaginary function `super()` that allocates 32 bits before calling another function named `sub()` that also allocates 32 bits of auxiliary memory:
40+
41+
```txt
42+
0 bits 32 bits 64 bits
43+
44+
45+
| | -> | | -> | |
46+
--------- --------- ---------
47+
| super | | sub |
48+
--------- ---------
49+
| super |
50+
---------
51+
```
52+
53+
* Now `super()` returns and the exactly same thing happens in reverse order, dropping everything when out of scope.
54+
55+
```txt
56+
64 bits 32 bits 0 bits
57+
58+
59+
| | -> | | -> | |
60+
--------- --------- ---------
61+
| sub | | super |
62+
--------- ---------
63+
| super |
64+
---------
65+
```
66+
67+
`StackVec` takes advantage of this predictable behavior to reserve an exactly amount of uninitialized bytes up-front and these bytes form a buffer where elements can be included dynamically.
68+
69+
```rust
70+
fn main() {
71+
// `stack_vec` has a pre-allocated memory of 2048 bits that can store up to 64 decimals.
72+
let mut stack_vec: StackVec<i32, 64> = StackVec::new();
73+
74+
// Although reserved, there isn't anything explicitly stored yet
75+
assert_eq!(stack_vec.len(), 0);
76+
77+
// Initializes the first 32 bits with a simple '1' decimal or
78+
// 00000000 00000000 00000000 00000001 bits
79+
stack_vec.push(1);
80+
81+
// Our vector memory is now split into a 32/2016 pair of initialized and
82+
// uninitialized memory respectively
83+
assert_eq!(stack_vec.len(), 1);
84+
}
85+
```
86+
87+
Of course, fixed buffers lead to some inflexibility because unlike `Vec`, the underlying capacity can not expand at run-time and there will never be more than 64 elements in the above example.
88+
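To make the limitation concrete, here is a small sketch assuming a hypothetical fallible `try_push` method; how overflow is reported is not settled by this RFC, although existing crates such as `arrayvec` and `heapless` take this route.

```rust
fn main() {
    // Hypothetical fallible API; the exact overflow behavior is an open question.
    let mut stack_vec: StackVec<i32, 2> = StackVec::new();

    // The first two pushes fit into the fixed buffer...
    assert!(stack_vec.try_push(1).is_ok());
    assert!(stack_vec.try_push(2).is_ok());

    // ...but the third cannot be stored: the capacity is a compile-time
    // constant and can never grow.
    assert!(stack_vec.try_push(3).is_err());
    assert_eq!(stack_vec.len(), 2);
}
```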

# Reference-level explanation
[reference-level-explanation]: #reference-level-explanation

The most natural module for `StackVec` is `core::collections`, with an API that mimics most of the current `Vec<T>` surface.

```rust
// core::collections

use core::mem::MaybeUninit;

pub struct StackVec<T, const N: usize> {
    data: MaybeUninit<[T; N]>,
    len: usize,
}

impl<T, const N: usize> StackVec<T, N> {
    // Much of the `Vec` API goes here
}
```
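To illustrate how that surface could sit on top of the uninitialized buffer, here is a minimal, non-normative sketch of `new`, `len`, `push`, and `pop` for the definition above; the panicking `push` is only one possible overflow strategy, and a real implementation would need more than this.

```rust
impl<T, const N: usize> StackVec<T, N> {
    pub const fn new() -> Self {
        Self { data: MaybeUninit::uninit(), len: 0 }
    }

    pub fn len(&self) -> usize {
        self.len
    }

    /// Appends `value`. Panics if the buffer is already full; a real API
    /// might prefer a fallible `try_push` instead (an open question).
    pub fn push(&mut self, value: T) {
        assert!(self.len < N, "capacity overflow");
        // SAFETY: `len < N` was just checked, so the write stays inside the
        // reserved buffer.
        unsafe {
            let base = self.data.as_mut_ptr() as *mut T;
            base.add(self.len).write(value);
        }
        self.len += 1;
    }

    /// Removes and returns the last element, if any.
    pub fn pop(&mut self) -> Option<T> {
        if self.len == 0 {
            return None;
        }
        self.len -= 1;
        // SAFETY: the element at index `len` was initialized by a previous `push`.
        unsafe {
            let base = self.data.as_ptr() as *const T;
            Some(base.add(self.len).read())
        }
    }
}
```

A real implementation would also need a `Drop` impl that drops the first `len` elements, plus the usual indexing, iteration, and slice conversions.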
# Drawbacks
[drawbacks]: #drawbacks

### Additional complexity

New and existing users are likely to find it difficult to differentiate the purpose of each vector type, especially people who don't have a theoretical background in memory management.

### The current ecosystem is fine

Even with all the fragmentation, supporting several kinds of memory usage is overkill in certain situations. If someone wants to use stack memory in an embedded application, it is just a matter of grabbing an appropriate crate.

# Prior art
[prior-art]: #prior-art

These are the best-known structures:

* `arrayvec::ArrayVec`: Uses declarative macros and an `Array` trait for its implementations but lacks support for arbitrary sizes.
* `heapless::Vec`: By using `typenum`, it can support arbitrary sizes without a nightly compiler.
* `staticvec::StaticVec`: Uses unstable const generics for arrays of arbitrary sizes.
* `tinyvec::ArrayVec`: Supports fixed and arbitrary (unstable feature) sizes but requires `T: Default` for safety reasons.

As seen, no implementation stands out from the others, because they all share roughly the same purpose and functionality. Noteworthy is the use of const generics, which makes an efficient and unified approach to arbitrary array sizes possible.

# Unresolved questions
[unresolved-questions]: #unresolved-questions

### Nomenclature

`StackVec` will probably avoid conflicts with existing crates, but another name might be a better option.

### Prelude

Should it be included in the prelude?

### Macros

Should a constructor macro analogous to `vec!` be provided?

```rust
// Instance with 1i32, 2i32 and 3i32
let _: StackVec<i32, 33> = stack_vec![1, 2, 3];

// Instance with 1i32 and 1i32
let _: StackVec<i32, 64> = stack_vec![1; 2];
```
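If such a macro is desired, a minimal sketch is shown below; it assumes a `StackVec::new` constructor and a `push` method (whose exact signature this RFC leaves open), and `T: Clone` for the repeat form, mirroring `vec![x; n]`.

```rust
// Hypothetical `stack_vec!` macro; nothing here is normative.
macro_rules! stack_vec {
    // stack_vec![a, b, c]
    ($($x:expr),* $(,)?) => {{
        let mut sv = StackVec::new();
        $(sv.push($x);)*
        sv
    }};
    // stack_vec![value; count], analogous to `vec![value; count]`
    ($x:expr; $n:expr) => {{
        let mut sv = StackVec::new();
        let value = $x;
        for _ in 0..$n {
            // Assumes `T: Clone`.
            sv.push(value.clone());
        }
        sv
    }};
}
```

Note that with the annotations used in the examples above, `N` is inferred from the binding's type rather than from the number of elements.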
# Future possibilities
[future-possibilities]: #future-possibilities

### Dynamic array

A hybrid approach between heap and stack memory could also be provided natively in the future.

```rust
pub struct DynVec<T, const N: usize> {
    // Hides the internal implementation
    data: DynVecData<T, N>,
}

impl<T, const N: usize> DynVec<T, N> {
    // Much of the `Vec` API goes here
}

// This is just an example. `Vec<T>` could be a `Box` and the `enum` could be a `union`.
enum DynVecData<T, const N: usize> {
    Heap(Vec<T>),
    Inline(StackVec<T, N>),
}
```

The above description is very similar to what `smallvec` already does.

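For illustration only, one way such a hybrid vector could "spill" from the inline buffer to the heap on overflow is sketched below; the `push` name, the growth amount, and the assumption that `StackVec` has a Vec-like API and implements `IntoIterator` are all hypothetical.

```rust
impl<T, const N: usize> DynVec<T, N> {
    pub fn push(&mut self, value: T) {
        // Spill to the heap first if the inline buffer is full.
        if let DynVecData::Inline(inline) = &self.data {
            if inline.len() == N {
                let old = core::mem::replace(
                    &mut self.data,
                    DynVecData::Heap(Vec::with_capacity(N + 1)),
                );
                if let (DynVecData::Inline(sv), DynVecData::Heap(heap)) =
                    (old, &mut self.data)
                {
                    // Assumes `StackVec<T, N>: IntoIterator<Item = T>`.
                    heap.extend(sv);
                }
            }
        }
        // Push into whichever storage is now active.
        match &mut self.data {
            DynVecData::Heap(heap) => heap.push(value),
            DynVecData::Inline(inline) => inline.push(value),
        }
    }
}
```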
### Generic collections and generic strings

Many structures that use `alloc::vec::Vec` as the underlying storage could also use stack or hybrid memory. For example, a hypothetical `GenericString<S>`, where `S` is the storage, could be split into:

```rust
type DynString<const N: usize> = GenericString<DynVec<u8, N>>;
type HeapString = GenericString<Vec<u8>>;
type StackString<const N: usize> = GenericString<StackVec<u8, N>>;
```
