@ -64,12 +66,14 @@ Note that this return value talks about the past. The past has a drawback: all d
But we wanted to abstract over *computation* and let someone else choose how to run it. That's fundamentally incompatible with looking at the results of previous computation all the time. So, let's find a type that *describes* a computation without running it. Let's look at the function again:
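As a reminder, here is a minimal sketch of such a function, assuming the running example is a plain, blocking read of a file into a `String` (the names here are illustrative):

```rust,edition2018
use std::fs::File;
use std::io::{self, Read};

// A hypothetical blocking version of the running example: control only
// returns to the caller once the whole file has been read.
fn read_file(path: &str) -> io::Result<String> {
    let mut file = File::open(path)?;
    let mut contents = String::new();
    file.read_to_string(&mut contents)?;
    Ok(contents)
}
```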
@ -79,10 +83,11 @@ This is the moment where we could reach for [threads](https://en.wikipedia.org/w
What we are searching for is something that represents ongoing work towards a result in the future. Whenever we say "something" in Rust, we almost always mean a trait. Let's start with an incomplete definition of the `Future` trait:
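As a first approximation, a simplified sketch along these lines (the real `std::future::Future` additionally requires `self` to be pinned behind `Pin<&mut Self>`):

```rust,edition2018
use std::task::{Context, Poll};

trait Future {
    type Output;
    /// Attempt to drive the computation further: returns `Poll::Pending`
    /// if the result is not ready yet, `Poll::Ready(value)` once it is.
    fn poll(&mut self, cx: &mut Context<'_>) -> Poll<Self::Output>;
}
```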
@ -105,14 +110,16 @@ Note that calling `poll` again after case 1 happened may result in confusing beh
While the `Future` trait has existed in Rust for a while, it was inconvenient to build and describe futures by hand. For this, Rust now has a special syntax: `async`. The example from above, implemented with `async-std`, would look like this:
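A sketch of what that could look like, assuming the same hypothetical `read_file` example as above:

```rust,edition2018
# extern crate async_std;
use async_std::{fs::File, io, prelude::*};

// The same logic, but every potentially blocking step is awaited
// instead of stalling the calling thread.
async fn read_file(path: &str) -> io::Result<String> {
    let mut file = File::open(path).await?;
    let mut contents = String::new();
    file.read_to_string(&mut contents).await?;
    Ok(contents)
}
```

The body reads almost like the blocking version; the difference is that the function is labelled `async` and each potentially blocking step is `.await`ed.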
The `async` keyword can also be applied to a block, which gives us an `async` *block*. Async blocks are necessary to call `async` functions, and instruct the compiler to include all the relevant instructions to do so. In Rust, all blocks return a value, and `async` blocks happen to return a value of a type that implements `Future`.
But let's get to the interesting part:
```rust,edition2018
# extern crate async_std;
# use async_std::task;
task::spawn(async { });
```
`spawn` takes a `Future` and starts running it on a `Task`. It returns a `JoinHandle`. Futures in Rust are sometimes called *cold* Futures: you need something that starts running them. Running a Future may require some additional bookkeeping, e.g. whether it's running or finished, where it is placed in memory and what its current state is. This bookkeeping part is abstracted away in a `Task`.
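A small sketch of how this fits together (the computed value is, of course, just for illustration): the returned `JoinHandle` is itself a future, so it can be driven to completion, for example with `block_on`.

```rust,edition2018
# extern crate async_std;
use async_std::task;

fn main() {
    // Spawn a task; the future starts running in the background.
    let handle = task::spawn(async { 1 + 1 });
    // The `JoinHandle` resolves to the task's output.
    let result = task::block_on(handle);
    assert_eq!(result, 2);
}
```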
@ -72,7 +84,9 @@ Tasks in `async_std` are one of the core abstractions. Much like Rust's `thread`
`Task`s are assumed to run _concurrently_, potentially by sharing a thread of execution. This means that operations blocking an _operating system thread_, such as `std::thread::sleep` or I/O functions from Rust's `std` library, will _stop execution of all tasks sharing this thread_. Other libraries (such as database drivers) have similar behaviour. Note that _blocking the current thread_ is not in and of itself bad behaviour, just something that does not mix well with the concurrent execution model of `async-std`. Essentially, never do this:
```rust,edition2018
# extern crate async_std;
# use async_std::task;
fn main() {
task::block_on(async {
// this is std::fs, which blocks
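        // (sketch of the remainder: any blocking call illustrates the problem)
        let _ = std::fs::read_to_string("test_file");
    })
}
```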
@ -91,7 +105,9 @@ In case of `panic`, behaviour differs depending on whether there's a reasonable
In practice, that means that `block_on` propagates panics to the blocking component:
```rust,edition2018,should_panic
# extern crate async_std;
# use async_std::task;
fn main() {
task::block_on(async {
panic!("test");
@ -106,7 +122,10 @@ note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace.