Add book draft
parent
5b0a6269a9
commit
a0143dd098
@@ -0,0 +1 @@
book
@@ -0,0 +1,6 @@
[book]
authors = ["The async-std maintainers"]
language = "en"
multilingual = false
src = "src"
title = "Async programming in Rust with async-std"
@@ -0,0 +1,23 @@
# Summary

- [Overview](./overview.md)
  - [`async-std`](./overview/async-std.md)
  - [`async-task`](./overview/async-task.md)
  - [`std::future` and `futures-rs`](./overview/std-and-library-futures.md)
  - [Stability guarantees](./overview/stability-guarantees.md)
- [Async concepts using async-std](./concepts.md)
  - [Futures](./concepts/futures.md)
  - [Tasks](./concepts/tasks.md)
  - [Async read/write](./concepts/async-read-write.md)
  - [Streams and Channels](./concepts/streams.md)
- [Tutorials](./tutorials/index.md)
  - [Integrating std::thread](./tutorials/integrating-std-thread.md)
- [Async Patterns](./patterns.md)
  - [Fork/Join](./patterns/fork-join.md)
  - [Accepting requests](./patterns/accepting-concurrent-requests.md)
  - [Proper Shutdown](./patterns/proper-shutdown.md)
  - [Background Tasks](./patterns/background-tasks.md)
  - [Testing](./patterns/testing.md)
- [Security practices](./security/index.md)
  - [Security disclosures and policy](./security/policy.md)
- [Glossary](./glossary.md)
@@ -0,0 +1,15 @@
# Async concepts using async-std

[Rust Futures][futures] have the reputation of being hard. We don't think this is the case. They are, in our opinion, one of the easiest concurrency concepts around and have an intuitive explanation.

However, there are good reasons for that perception. Futures have three concepts at their base that seem to be a constant source of confusion: deferred computation, asynchronicity and independence of execution strategy.

These concepts are not hard, but they are something many people are not used to. This base confusion is amplified by many implementations that are oriented towards details and hard to understand. Most explanations of these implementations also target advanced users. We try to provide both easy-to-understand primitives and approachable overviews of the concepts.

Futures are a concept that abstracts over how code is run. By themselves, they do nothing. This is a weird concept in an imperative language, where usually one thing happens after the other - right now.

So how do Futures run? You decide! Futures do nothing without the piece of code _executing_ them. This part is called an _executor_. An _executor_ decides _when_ and _how_ to execute your futures; `async-task` is such an _executor_, and `async-std` is a library providing the building blocks.

Let's start with a little bit of motivation, though.

[futures]: https://en.wikipedia.org/wiki/Futures_and_promises
@@ -0,0 +1 @@
# Async read/write
@@ -0,0 +1,122 @@
# Futures

> I have looked into the future, everyone is slightly older.
>
> -- Future of the Left, "The Plot Against Common Sense"

A notable point about Rust is [_fearless concurrency_][fearless-concurrency]. That is the notion that you should be empowered to do concurrent things, without giving up safety. Also, Rust being a low-level language, it's about fearless concurrency _without picking a specific implementation strategy_. This means we _must_ abstract over the strategy, to allow choice _later_, if we want to have any way to share code between users of different strategies.

Futures abstract over _computation_. They describe the "what", independent of the "where" and the "when". For that, they aim to break code into small, composable actions that can then be executed by a part of our system. Let's take a tour through what it means to compute things to find where we can abstract.

## Send and Sync

Luckily, concurrent Rust already has two well-known and effective concepts abstracting over sharing between concurrent parts of a program: `Send` and `Sync`. Notably, both the `Send` and `Sync` traits abstract over _strategies_ of concurrent work, compose neatly, and don't prescribe an implementation.

As a quick summary, `Send` abstracts over passing data in a computation over to another concurrent computation (let's call it the receiver), losing access to it on the sender side. Many programming languages implement this strategy, but without support from the language side it is up to you to uphold this behaviour yourself. This is a regular source of bugs: senders keeping handles to sent things around and maybe even working with them after sending. Rust mitigates this problem by making this behaviour known. Types can be `Send` or not (by implementing the appropriate marker trait), allowing or disallowing sending them around.

Note how we avoided any word like _"thread"_, but instead opted for "computation". The full power of `Send` (and subsequently also `Sync`) is that they relieve you of the burden of knowing _what_ shares. At the point of implementation, you only need to know which method of sharing is appropriate for the type at hand. This keeps reasoning local and is not influenced by whatever implementation the user of that type later uses.

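To make `Send` concrete, here is a minimal sketch, using plain `std::thread` purely as one possible execution strategy: the `Vec` is moved into the other computation, and the sender loses access to it.

```rust
use std::thread;

fn main() {
    let data = vec![1, 2, 3];

    // `Vec<i32>` is `Send`, so ownership can move into the other thread.
    let handle = thread::spawn(move || {
        println!("received: {:?}", data);
    });

    // `data` is no longer accessible here; the compiler stops us from
    // using it after it has been sent.
    handle.join().unwrap();
}
```
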
`Sync` is about sharing data between two concurrent parts of a program. This is another common pattern: as writing to a memory location or reading while another party is writing is inherently unsafe, this access needs to be moderated through synchronisation.[^1] There are many common ways for two parties to agree on not using the same part of memory at the same time, for example mutexes and spinlocks. Again, Rust gives you the option of (safely!) not caring. Rust gives you the ability to express that something _needs_ synchronisation while not being specific about the _how_.

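As an illustration, here is a small sketch of synchronised sharing, again with threads as just one concrete strategy: the `Arc` lets both parties hold the data, and the `Mutex` moderates access to it.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // The mutex moderates access; the Arc lets both parties keep a handle.
    let counter = Arc::new(Mutex::new(0));

    let cloned = Arc::clone(&counter);
    let handle = thread::spawn(move || {
        *cloned.lock().unwrap() += 1;
    });

    *counter.lock().unwrap() += 1;
    handle.join().unwrap();

    assert_eq!(*counter.lock().unwrap(), 2);
}
```
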
`Send` and `Sync` can be composed in interesting fashions, but that's beyond the scope here. You can find examples in the [Rust Book][rust-book-sync].

To sum up: Rust gives us the ability to safely abstract over an important property of concurrent programs: how they share data. It does so in a very lightweight fashion: the language itself only knows about the two markers `Send` and `Sync` and helps us a little by deriving them itself, when possible. The rest is a library concern.

## An easy view of computation

While computation is a subject to write a whole [book][understanding-computation] about, a very simplified view suffices for us:

* computation is a sequence of composable operations
* they can branch based on a decision
* they either run to completion and yield a result, or they yield an error

## Deferring computation

As mentioned above, `Send` and `Sync` are about data. But programs are not only about data, they also talk about _computing_ the data. And that's what [Futures][futures] do. We are going to have a close look at how that works in the next chapter. Let's look at what Futures allow us to express, in English. Futures go from this plan:

* Do X
* If X succeeds, do Y

towards

* Start doing X
* Once X succeeds, start doing Y

Remember the talk about "deferred computation" in the intro? That's all it is. Instead of telling the computer what to execute and decide upon _now_, you tell it what to start doing and how to react to potential events in the... well... `Future`.

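As a small preview of how that second plan reads in code - using the `async`/`.await` syntax introduced later in this chapter, and two made-up steps `do_x` and `do_y` - it could look like this sketch:

```rust
use async_std::task;

// Hypothetical step X.
async fn do_x() -> u32 {
    40
}

// Hypothetical step Y, continuing with X's result.
async fn do_y(x: u32) -> u32 {
    x + 2
}

async fn plan() -> u32 {
    // Start doing X; once X succeeds, start doing Y.
    let x = do_x().await;
    do_y(x).await
}

fn main() {
    // Nothing happens until an executor drives the plan.
    assert_eq!(task::block_on(plan()), 42);
}
```
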
## Orienting towards the beginning

Let's have a look at a simple function, specifically the return value:

```rust
fn compute_value() -> String {
    "test".into()
}
```

You can call that at any time, so you are in full control of when you call it. But here's the problem: the moment you call it, you transfer control to the called function until it returns a value.

Note that this return value talks about the past. The past has a drawback: all decisions have been made. It has an advantage: the outcome is visible. We can unwrap the results of program past and then decide what to do with them.

But here's a problem: we wanted to abstract over _computation_ to be allowed to let someone else choose how to run it. That's fundamentally incompatible with looking at the results of previous computation all the time. So, let's find a type that describes a computation without running it. Let's look at the function again:

```rust
fn compute_value() -> String {
    "test".into()
}
```

Speaking in terms of time, we can only take action _before_ calling the function or _after_ the function returned. This is not desirable, as it takes from us the ability to do something _while_ it runs. When working with parallel code, this would take from us the ability to start a parallel task while the first runs (because we gave away control).

This is the moment where we could reach for [threads][threads]. But threads are a very specific concurrency primitive and we said that we are searching for an abstraction.

What we are searching for is something that represents ongoing work towards a result in the future. Whenever we say "something" in Rust, we almost always mean a trait. Let's start with an incomplete definition of the `Future` trait:

```rust
trait Future {
    type Output;

    fn poll(self: Pin<&mut Self>, cx: &mut Context) -> Poll<Self::Output>;
}
```

Ignore `Pin` and `Context` for now, you don't need them for a high-level understanding. Looking at it closely, we see the following: it is generic over the `Output`. It provides a function called `poll`, which allows us to check on the state of the current computation.

Every call to `poll()` can result in one of these two cases:

1. The future is done, and `poll` will return [`Poll::Ready`][poll-ready]
2. The future has not finished executing, and it will return [`Poll::Pending`][poll-pending]

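To make this tangible, here is a sketch of a hand-written future - a hypothetical `ReadyValue` type, not part of any library - that already knows its value and is done on the very first `poll`:

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll};

/// A hypothetical future that already knows its value.
struct ReadyValue(Option<String>);

impl Future for ReadyValue {
    type Output = String;

    fn poll(mut self: Pin<&mut Self>, _cx: &mut Context<'_>) -> Poll<Self::Output> {
        // Case 1: the work is done, hand the value out.
        Poll::Ready(self.0.take().expect("polled after completion"))
    }
}
```

Real futures usually return `Pending` at least once and arrange to be woken up later, but the shape of the trait stays the same.
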
This allows us to externally check if a `Future` still has unfinished work, or is finally done and can give us the value. The simplest (but not efficient) way would be to just constantly poll futures in a loop. There are optimisations here, and this is what a good runtime does for you.

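Just to illustrate the mechanics, here is a sketch of such a naive polling loop, assuming the `futures` crate's `noop_waker` helper - a real runtime would park the task and only poll again once it has been woken:

```rust
use std::future::Future;
use std::task::{Context, Poll};

use futures::task::noop_waker;

/// Drive a future to completion by polling it in a busy loop.
/// This is deliberately naive - it never sleeps or parks.
fn naive_block_on<F: Future>(future: F) -> F::Output {
    let mut future = Box::pin(future);
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);

    loop {
        match future.as_mut().poll(&mut cx) {
            Poll::Ready(value) => return value,
            Poll::Pending => std::thread::yield_now(),
        }
    }
}
```
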
Note that calling `poll` after case 1 happened may result in confusing behaviour. See the [futures-docs][futures-docs] for details.

## Async

While the `Future` trait has existed in Rust for a while, it was inconvenient to build and describe futures. For this, Rust now has a special syntax: `async`. It takes the idea introduced above: if we want to have a function that sets up a deferred computation, we call it an `async` function:

```rust
async fn compute_value() -> String {
    "test".into()
}
```

When this function is called, it will produce a `Future<Output=String>` instead of immediately returning a `String`. (Or, more precisely, it will generate a type for you that implements `Future<Output=String>`.)

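To actually get the `String` back out, something has to execute that future. A minimal sketch, assuming `async-std` as the executor:

```rust
use async_std::task;

async fn compute_value() -> String {
    "test".into()
}

fn main() {
    // Calling the function only creates the future; nothing has run yet.
    let future = compute_value();

    // The executor polls the future to completion and returns its output.
    let value = task::block_on(future);
    assert_eq!(value, "test");
}
```
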
## Conclusion

Working from values, we searched for something that expresses _working towards a value available sometime later_. From there, we talked about the concept of polling.

A `Future` is any data type that does not represent a value, but the ability to _produce a value at some point in the future_. Implementations of this vary widely and are detailed depending on the use case, but the interface is simple.

From here on, we are going to introduce you to two other important concepts: `tasks` and `streams`, and then talk about how we combine the three to build things.


[^1]: Two parties reading while it is guaranteed that no one is writing is always safe.

[poll-ready]: https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Ready
[poll-pending]: https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Pending
[futures-docs]: https://doc.rust-lang.org/std/future/trait.Future.html
[fearless-concurrency]: https://blog.rust-lang.org/2015/04/10/Fearless-Concurrency.html
[understanding-computation]: https://computationbook.com/
[threads]: https://en.wikipedia.org/wiki/Thread_(computing)
[futures]: https://en.wikipedia.org/wiki/Futures_and_promises
@@ -0,0 +1 @@
# Streams
@@ -0,0 +1 @@
# Tasks
@@ -0,0 +1,11 @@
# Overview

`async-std` and `async-task`, along with their [supporting libraries][organization], are two libraries that make your life in async programming easier. They provide fundamental implementations for downstream libraries and applications alike.

`async-std` provides an interface to all important primitives: filesystem operations, network operations and concurrency basics like timers. It also exposes `async-task` in a model similar to the `thread` module found in the Rust standard library. The name reflects the approach of this library: it is modeled as closely to the Rust standard library as possible, replacing all components with async counterparts. You can read more about `async-std` in [the overview chapter][overview-std].

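As a taste of that `thread`-like model, here is a minimal sketch of spawning and joining a task through `async_std::task`, the task-based counterparts of `thread::spawn` and `JoinHandle`:

```rust
use async_std::task;

fn main() {
    task::block_on(async {
        // Spawning a task mirrors std::thread::spawn, but the JoinHandle
        // is itself a future, so waiting for it is just an .await.
        let handle = task::spawn(async { 1 + 2 });

        assert_eq!(handle.await, 3);
    });
}
```
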
`async-task` is a library for implementing asynchronous tasks quickly. For the purpose of this documentation, you will mainly interact with it through the `async_std::task` module. Still, it has some nice properties to be aware of, which you can read up on in the [`async-task` overview chapter][overview-task].

[organization]: https://github.com/async-std/async-std
[overview-std]: overview/async-std/
[overview-task]: overview/async-task/
@@ -0,0 +1 @@
# async-std
@@ -0,0 +1 @@
# async-task
@@ -0,0 +1,29 @@
# Stability and SemVer

`async-std` and `async-task` follow https://semver.org/.

In short: we are versioning our software as `MAJOR.MINOR.PATCH`. We increase the:

* MAJOR version when there are incompatible API changes,
* MINOR version when we introduce functionality in a backwards-compatible manner,
* PATCH version when we make backwards-compatible bug fixes.

## Future expectations

`async-std` uses the `AsyncRead/AsyncWrite/AsyncSeek/AsyncBufRead` and `Stream` traits from the `futures-rs` library. We expect those to be conservatively updated and in lockstep. Breaking changes in these traits will lead to a major version upgrade, for which we will provide migration documentation.

## Minimum version policy

The current tentative policy is that the minimum Rust version required to use this crate can be increased in minor version updates. For example, if `async-std` 1.0 requires Rust 1.37.0, then `async-std` 1.0.z for all values of z will also require Rust 1.37.0 or newer. However, `async-std` 1.y for y > 0 may require a newer minimum version of Rust.

In general, this crate will be conservative with respect to the minimum supported version of Rust. With `async/await` being a new feature though, we will track changes at a measured pace.

## Security fixes

Security fixes will be applied to _all_ minor branches of this library in all _supported_ major revisions. This policy might change in the future, in which case we will give at least _3 months_ of advance notice.

## Credits

This policy is based on [BurntSushi's regex crate][regex-policy].

[regex-policy]: https://github.com/rust-lang/regex#minimum-rust-version-policy
@@ -0,0 +1 @@
# `std::future` and `futures-rs`
@@ -0,0 +1 @@
# Async Patterns
@@ -0,0 +1 @@
# Accepting requests
@@ -0,0 +1 @@
# Async read/write
@@ -0,0 +1 @@
# Background Tasks
@@ -0,0 +1 @@
# Fork/Join
@@ -0,0 +1 @@
# Proper Shutdown
@@ -0,0 +1 @@
# Testing
@@ -0,0 +1,12 @@
# Security

Writing a highly performant async core library is a task involving some instances of unsafe code.

We take great care in vetting all unsafe code included in `async-std` and follow generally accepted practices.

In case you find a security-related bug in our library, please get in touch with our [security contact][security-policy].

Patches improving the resilience of the library or the testing setup are happily accepted on our [GitHub org][github].

[security-policy]: /security/policy
[github]: https://github.com/async-std/
@@ -0,0 +1 @@
# Tutorials
@@ -0,0 +1,43 @@
# Exercise: Waiting for `std::thread`

Parallel processing is usually done via [threads].
Concurrent programming is usually done with systems similar to [async-task].
These two worlds seem different - and in some regards, they are - though they
are easy to connect.
In this exercise, you will learn how to connect concurrent and parallel components easily, by connecting a thread to a task.

## Understanding the problem

The standard thread API in Rust is `std::thread`. Specifically, it contains the [`spawn`] function, which allows us to start a thread:

```rust
std::thread::spawn(|| {
    println!("in child thread");
});
println!("in parent thread");
```

This creates a thread, _immediately_ [schedules] it to run, and continues. This is crucial: once the thread is spawned, it is independent of its _parent thread_. If you want to wait for the thread to end, you need to capture its [`JoinHandle`] and join it with your current thread:

```rust
let thread = std::thread::spawn(|| {
    println!("in child thread");
});
thread.join().unwrap();
println!("in parent thread");
```

This comes at a cost though: the waiting thread will [block] until the child is done. Wouldn't it be nice if we could just use the `.await` syntax here and leave the opportunity for another task to be scheduled while waiting?

## Backchannels

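One rough sketch of the idea, assuming the `futures` crate's `oneshot` channel and `async-std` as the executor: the thread gets the sending half as a backchannel to report on, and the task awaits the receiving half instead of blocking.

```rust
use async_std::task;
use futures::channel::oneshot;

fn main() {
    // The backchannel: the thread owns the sender, the task awaits the receiver.
    let (sender, receiver) = oneshot::channel();

    std::thread::spawn(move || {
        println!("in child thread");
        // Report back once the work is done.
        let _ = sender.send(());
    });

    task::block_on(async {
        // Awaiting suspends this task instead of blocking a whole thread.
        receiver.await.expect("the sender was dropped");
        println!("in parent thread");
    });
}
```
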
[threads]: TODO: wikipedia
[async-task]: TODO: link
[`spawn`]: TODO: link
[`JoinHandle`]: TODO: link
[schedules]: TODO: Glossary link
[block]: TODO: Link to blocking