Merge pull request #56 from dcarosone/book-typos-and-small-edits

small typos and edits in first few pages

@ -4,11 +4,11 @@
However, there are good reasons for that perception. Futures have three concepts at their base that seem to be a constant source of confusion: deferred computation, asynchronicity and independence of execution strategy.
These concepts are not hard, but many people are not used to them. This base confusion is amplified by many implementations that focus on details. Most explanations of these implementations also target advanced users, and can be hard for beginners. We try to provide both easy-to-understand primitives and approachable overviews of the concepts.
Futures are a concept that abstracts over how code is run. By themselves, they do nothing. This is a weird concept in an imperative language, where usually one thing happens after the other - right now.
So how do Futures run? You decide! Futures do nothing without the piece of code _executing_ them. This part is called an _executor_. An _executor_ decides _when_ and _how_ to execute your futures. The `async_std::task` module provides you with an interface to such an executor.
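As a quick illustration, here is a minimal sketch of our own (not taken from the book) using `async_std::task::block_on`, which the Tasks chapter introduces later: building a future runs nothing, only handing it to an executor does.

```rust
use async_std::task;

fn main() {
    // Building the future does not execute anything yet.
    let future = async {
        println!("running!");
        21 * 2
    };

    // Only when an executor drives the future does the code inside run.
    let answer = task::block_on(future);
    assert_eq!(answer, 42);
}
```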
Let's start with a little bit of motivation, though.

@ -6,26 +6,28 @@ Futures abstract over *computation*. They describe the "what", independent of th
## Send and Sync
Luckily, concurrent Rust already has two well-known and effective concepts abstracting over sharing between concurrent parts of a program: Send and Sync. Notably, both the Send and Sync traits abstract over *strategies* of concurrent work, compose neatly, and don't prescribe an implementation.
As a quick summary, `Send` abstracts over passing data in a computation over to another concurrent computation (let's call it the receiver), losing access to it on the sender side. In many programming languages this strategy is commonly implemented, but without support from the language side you are expected to enforce the "losing access" behaviour yourself. This is a regular source of bugs: senders keeping handles to sent things around and maybe even working with them after sending. Rust mitigates this problem by making this behaviour known. Types can be `Send` or not (by implementing the appropriate marker trait), allowing or disallowing sending them around, and the ownership and borrowing rules prevent subsequent access.
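The chapter deliberately avoids the word *thread*, but as a concrete sketch of our own using plain `std` threads, this is what the compiler-enforced "losing access" rule looks like:

```rust
use std::thread;

fn main() {
    let message = String::from("hello"); // String is Send

    // Ownership of `message` moves over to the receiving computation.
    let handle = thread::spawn(move || {
        println!("received: {}", message);
    });

    // println!("{}", message); // would not compile: `message` was moved
    handle.join().unwrap();
}
```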
Note how we avoided any word like *"thread"*, but instead opted for "computation". The full power of `Send` (and subsequently also `Sync`) is that they relieve you of the burden of knowing *what* shares. At the point of implementation, you only need to know which method of sharing is appropriate for the type at hand. This keeps reasoning local and is not influenced by whatever implementation the user of that type later uses.
`Sync` is about sharing data between two concurrent parts of a program. This is another common pattern: as writing to a memory location or reading while another party is writing is inherently unsafe, this access needs to be moderated through synchronisation.[^1] There are many common ways for two parties to agree on not using the same part of memory at the same time, for example mutexes and spinlocks. Again, Rust gives you the option of (safely!) not caring. Rust gives you the ability to express that something *needs* synchronisation while not being specific about the *how*.
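As a sketch of our own showing one possible *how* among many, an `Arc<Mutex<_>>` marks the shared value as needing synchronisation and lets several computations access it safely:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // The Mutex states *that* access must be synchronised; Arc lets
    // several parties hold on to the shared value at the same time.
    let counter = Arc::new(Mutex::new(0u64));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    assert_eq!(*counter.lock().unwrap(), 4);
}
```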
`Send` and `Sync` can be composed in interesting fashions, but that's beyond the scope here. You can find examples in the [Rust Book][rust-book-sync].
[rust-book-sync]: https://doc.rust-lang.org/stable/book/ch16-04-extensible-concurrency-sync-and-send.html
To sum up: Rust gives us the ability to safely abstract over important properties of concurrent programs, their data sharing. It does so in a very lightweight fashion; the language itself only knows about the two markers `Send` and `Sync` and helps us a little by deriving them itself, when possible. The rest is a library concern.
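A small sketch of our own of that "deriving them itself": types composed only of `Send`/`Sync` parts get the markers automatically, while a type containing, say, an `Rc` does not.

```rust
use std::rc::Rc;

fn assert_send<T: Send>() {}
fn assert_sync<T: Sync>() {}

#[allow(dead_code)]
struct Plain {
    value: u64, // all fields are Send + Sync, so Plain is too
}

#[allow(dead_code)]
struct NotThreadSafe {
    value: Rc<u64>, // Rc is neither Send nor Sync
}

fn main() {
    assert_send::<Plain>();
    assert_sync::<Plain>();
    // assert_send::<NotThreadSafe>(); // would not compile
}
```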
## An easy view of computation
While computation is a subject to write a whole [book](https://computationbook.com/) about, a very simplified view suffices for us (a short sketch in code follows the list):
- computation is a sequence of composable operations
- they can branch based on a decision
- they either run to completion and yield a result, or they can yield an error
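Here is a tiny sketch of our own that matches this view: a sequence of composable steps, a branch, and an outcome that is either a result or an error.

```rust
use std::num::ParseIntError;

// Composable, fallible steps: parse, then branch, then yield a value.
fn doubled_magnitude(input: &str) -> Result<i64, ParseIntError> {
    let n: i64 = input.trim().parse()?; // may yield an error instead of a result

    // Branch based on a decision.
    if n >= 0 {
        Ok(n * 2)
    } else {
        Ok(-n * 2)
    }
}

fn main() {
    assert_eq!(doubled_magnitude("21"), Ok(42));
    assert_eq!(doubled_magnitude("-3"), Ok(6));
    assert!(doubled_magnitude("not a number").is_err());
}
```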
## Deferring computation
As mentioned above, `Send` and `Sync` are about data. But programs are not only about data, they also talk about *computing* the data. And that's what [`Futures`][futures] do. We are going to have a close look at how that works in the next chapter. Let's look at what Futures allow us to express, in English. Futures go from this plan:
@ -82,14 +84,13 @@ Every call to `poll()` can result in one of these two cases:
1. The future is done; `poll` will return [`Poll::Ready`](https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Ready)
2. The future has not finished executing; it will return [`Poll::Pending`](https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Pending)
This allows us to externally check whether a `Future` still has work to do, or is finally done and can give us the value. The simplest (but not efficient) way would be to just constantly poll futures in a loop. There are optimisations possible here, and this is what a good runtime does for you.
Note that calling `poll` after case 1 happened may result in confusing behaviour. See the [futures-docs](https://doc.rust-lang.org/std/future/trait.Future.html) for details.
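To make the two cases concrete, here is a hand-written toy future, a sketch of our own and not how you would normally write futures: it reports `Poll::Pending` on the first poll and `Poll::Ready` afterwards.

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll};

// A toy future: pending on the first poll, ready on the second.
struct TwoStep {
    polled_once: bool,
}

impl Future for TwoStep {
    type Output = &'static str;

    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
        if self.polled_once {
            // Case 1: the work is finished, hand the value out.
            Poll::Ready("done")
        } else {
            // Case 2: not finished yet. Ask to be polled again and yield.
            self.polled_once = true;
            cx.waker().wake_by_ref();
            Poll::Pending
        }
    }
}
```

Driving it with an executor, for example `async_std::task::block_on(TwoStep { polled_once: false })`, polls it until it returns `Poll::Ready` and hands back `"done"`.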
## Async
While the `Future` trait has existed in Rust for a while, it was inconvenient to build and describe futures. For this, Rust now has a special syntax: `async`. The example from above, implemented in `async-std`, would look like this:
use async_std::fs::File;
async fn read_file(path: &str) -> Result<String, io::Error> {
@ -115,7 +116,7 @@ When executing 2 or more of these functions at the same time, our runtime system
Working from values, we searched for something that expresses *working towards a value available sometime later*. From there, we talked about the concept of polling.
A `Future` is any data type that does not represent a value, but the ability to *produce a value at some point in the future*. Implementations of this are very varied and detailed depending on use-case, but the interface is simple.
Next, we will introduce you to `tasks`, which we need to actually *run* Futures.

@ -1,4 +1,5 @@
# Tasks
Now that we know what Futures are, we want to run them!
In `async-std`, the `tasks` (TODO: link) module is responsible for this. The simplest way is using the `block_on` function:
@ -10,7 +11,7 @@ use async_std::task;
async fn read_file(path: &str) -> Result<String, io::Error> {
    let mut file = File::open(path).await?;
    let mut contents = String::new();
    file.read_to_string(&mut contents).await?;
    Ok(contents)
}
}
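The call that actually drives this function sits outside the hunk shown above. As a sketch of our own (the file path is a hypothetical placeholder), it could be driven like this, relying on the `use async_std::task;` import visible in the hunk context:

```rust
fn main() {
    // "some_file.txt" is a hypothetical path used only for illustration.
    match task::block_on(read_file("some_file.txt")) {
        Ok(contents) => println!("{}", contents),
        Err(e) => println!("could not read file: {}", e),
    }
}
```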
@ -54,12 +55,10 @@ task::spawn(async { })
For now, it is enough to know that once you have `spawn`ed a task, it will continue running in the background. The `JoinHandle` is itself a future that will finish once the `Task` has run to conclusion. Much like with `threads` and the `join` function, we can now call `block_on` on the handle to *block* the program (or the calling thread, to be specific) and wait for it to finish.
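A minimal sketch of our own of that spawn-then-wait pattern:

```rust
use async_std::task;

fn main() {
    // The task starts running in the background as soon as it is spawned.
    let handle = task::spawn(async {
        1 + 2
    });

    // The JoinHandle is itself a future; blocking on it waits for the result.
    let result = task::block_on(handle);
    assert_eq!(result, 3);
}
```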
## Tasks in `async_std`
Tasks in `async_std` are one of the core abstractions. Much like Rust's `thread`s, they provide some practical functionality over the raw concept. `Tasks` have a relationship to the runtime, but they are in themselves separate. `async_std` tasks have a number of desirable properties:
- They are allocated in one single allocation
- All tasks have a *backchannel*, which allows them to propagate results and errors to the spawning task through the `JoinHandle`
- They carry desirable metadata for debugging
@ -98,7 +97,7 @@ fn main() {
}
```
```text
thread 'async-task-driver' panicked at 'test', examples/panic.rs:8:9
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace.
```
@ -115,7 +114,7 @@ task::block_on(async {
})
```
```text
thread 'async-task-driver' panicked at 'test', examples/panic.rs:8:9
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace.
Aborted (core dumped)
```

@ -2,8 +2,8 @@
![async-std logo](./images/horizontal_color.svg)
This book serves as high-level documentation for `async-std` and a way of learning async programming in Rust through it. As such, it focusses on the `async-std` API and the task model it gives you.
Please note that the Rust project provides its own book on asynchronous programming, called ["Asynchronous Programming in Rust"][async-book], which we highly recommend reading along with this book, as it provides a different, wider view on the topic.
[async-book]: https://rust-lang.github.io/async-book/
