
🚀 async & future — Higher-level Abstractions

std::async and std::future let you run code asynchronously without managing threads manually.

Why Higher Abstractions?

The Problem with Raw Threads

cpp
#include <thread>
#include <iostream>

int compute() {
    // Do some heavy work...
    return 42;
}

int main() {
    int result;
    
    std::thread t([&result]() {
        result = compute();  // ⚠️ Race condition potential
    });
    
    t.join();
    std::cout << result << std::endl;
    
    return 0;
}

Problems:

  • You must manage the thread lifecycle yourself (join, detach)
  • Returning a value is awkward (it requires a shared variable)
  • Exception handling is difficult

The Solution: std::async

cpp
#include <future>
#include <iostream>

int compute() {
    return 42;
}

int main() {
    // ✅ Simple and clean!
    std::future<int> result = std::async(compute);
    
    std::cout << result.get() << std::endl;  // 42
    
    return 0;
}

Analogy: Ordering at a Restaurant

┌─────────────────────────────────────────────────────────────────┐
│                    RESTAURANT ANALOGY                            │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   YOU (Main thread)          KITCHEN (Worker thread)            │
│   ─────────────────          ────────────────────────           │
│                                                                 │
│   1. Order food              std::async(cook_food)              │
│         │                                                       │
│         ▼                                                       │
│   ┌───────────┐              ┌───────────────────┐              │
│   │  RECEIPT  │ ◄─────────── │    PROMISE        │              │
│   │  (future) │              │ (kitchen commits) │              │
│   └───────────┘              └───────────────────┘              │
│         │                            │                          │
│         │   Do other things...       │ Cooking...               │
│         │                            │                          │
│         ▼                            ▼                          │
│   result.get()  ◄───────────  promise.set_value(food)           │
│   (Wait & receive)           (Food ready!)                      │
│                                                                 │
│   🍜 Enjoy!                                                     │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
  • future = the receipt — you hold it and wait
  • promise = the kitchen's commitment — the kitchen promises to finish the order
  • get() = picking up the order — waits if it isn't ready yet

std::async — Launching Background Tasks

Basic Usage

cpp
#include <future>
#include <iostream>
#include <chrono>
#include <thread>

int heavyComputation(int x) {
    std::this_thread::sleep_for(std::chrono::seconds(2));
    return x * x;
}

int main() {
    std::cout << "Starting async task...\n";
    
    // Launch async task
    std::future<int> result = std::async(heavyComputation, 10);
    
    // Do other work while computation runs...
    std::cout << "Doing other work...\n";
    
    // Get result (blocks if not ready)
    int value = result.get();
    std::cout << "Result: " << value << std::endl;  // 100
    
    return 0;
}

Launch Policies

cpp
#include <future>
#include <iostream>
#include <thread>

void showThreadId() {
    std::cout << "Task thread: " << std::this_thread::get_id() << std::endl;
}

int main() {
    std::cout << "Main thread: " << std::this_thread::get_id() << std::endl;
    
    // std::launch::async — FORCE a new thread
    auto f1 = std::async(std::launch::async, showThreadId);
    f1.get();  // Thread ID differs from main
    
    // std::launch::deferred — lazy evaluation, runs when get() is called
    auto f2 = std::async(std::launch::deferred, showThreadId);
    f2.get();  // SAME thread ID as main (runs on the main thread)
    
    // Default: async | deferred — the implementation decides
    auto f3 = std::async(showThreadId);
    f3.get();  // May or may not match main
    
    return 0;
}
┌─────────────────────────────────────────────────────────────────┐
│                    LAUNCH POLICIES                               │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  std::launch::async                                             │
│  ─────────────────                                              │
│  • FORCES a new thread                                         │
│  • Task starts IMMEDIATELY                                     │
│  • Use when you need parallel execution                        │
│                                                                 │
│  std::launch::deferred                                          │
│  ──────────────────────                                         │
│  • Does NOT create a new thread                                 │
│  • Task runs WHEN get() OR wait() IS CALLED                     │
│  • Use when you want lazy evaluation                            │
│                                                                 │
│  Default (async | deferred)                                     │
│  ──────────────────────────                                     │
│  • The implementation decides at runtime                        │
│  • ⚠️ May never run if you don't call get()!                   │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘

⚠️ Default Policy Trap

With the default policy, the implementation may choose deferred — and if you then never call get() or wait(), the task may never run at all!


std::future — Handle to Async Result

API Overview

cpp
std::future<T> f = std::async(...);

// Get result (blocks, can only call ONCE)
T value = f.get();

// Wait without getting result
f.wait();

// Wait with timeout
std::future_status status = f.wait_for(std::chrono::seconds(1));

if (status == std::future_status::ready) {
    // Result is ready
} else if (status == std::future_status::timeout) {
    // Still running
} else if (status == std::future_status::deferred) {
    // Hasn't started (deferred policy)
}

// Check if future is valid
bool valid = f.valid();

Timeout Pattern

cpp
#include <future>
#include <iostream>
#include <chrono>
#include <thread>

int slowTask() {
    std::this_thread::sleep_for(std::chrono::seconds(5));
    return 42;
}

int main() {
    auto future = std::async(std::launch::async, slowTask);
    
    // Poll with timeout
    while (true) {
        auto status = future.wait_for(std::chrono::milliseconds(500));
        
        if (status == std::future_status::ready) {
            std::cout << "Result: " << future.get() << std::endl;
            break;
        }
        
        std::cout << "Still waiting...\n";
    }
    
    return 0;
}

Output:

Still waiting...
Still waiting...
Still waiting...
...
Result: 42

Exception Propagation

Exceptions thrown inside an async task are propagated when you call get():

cpp
#include <future>
#include <iostream>
#include <stdexcept>

int riskyTask() {
    throw std::runtime_error("Something went wrong!");
    return 42;
}

int main() {
    auto future = std::async(std::launch::async, riskyTask);
    
    try {
        int result = future.get();  // ❌ Exception thrown here!
    } catch (const std::exception& e) {
        std::cout << "Caught: " << e.what() << std::endl;
    }
    
    return 0;
}

💡 KEY INSIGHT

The exception is stored inside the future and only re-thrown when you call get(). This is a clean way to handle errors from async tasks.


std::promise — Producer Side

std::promise is the producer side — it "promises" to deliver a value:

cpp
#include <future>
#include <thread>
#include <chrono>
#include <iostream>

void producer(std::promise<int>& prom) {
    // Do some work...
    std::this_thread::sleep_for(std::chrono::seconds(1));
    
    // Fulfill the promise
    prom.set_value(42);
}

int main() {
    std::promise<int> prom;
    std::future<int> fut = prom.get_future();  // Get future from promise
    
    std::thread t(producer, std::ref(prom));
    
    std::cout << "Waiting for result...\n";
    int result = fut.get();  // Block until promise is fulfilled
    std::cout << "Got: " << result << std::endl;
    
    t.join();
    return 0;
}

Setting Exceptions

cpp
void riskyProducer(std::promise<int>& prom) {
    try {
        // Something might fail...
        throw std::runtime_error("Failed!");
    } catch (...) {
        // Pass exception to future
        prom.set_exception(std::current_exception());
    }
}

Promise Use Cases

| Use Case              | Description                             |
|-----------------------|-----------------------------------------|
| Thread signaling      | Notify another thread that work is done |
| One-shot events       | Single producer, single consumer        |
| Custom async patterns | When std::async doesn't fit             |

std::shared_future — Multiple Consumers

std::future allows only one get() call. Use std::shared_future when multiple consumers need the result:

cpp
#include <future>
#include <thread>
#include <iostream>
#include <vector>

int compute() {
    std::this_thread::sleep_for(std::chrono::seconds(1));
    return 42;
}

void consumer(std::shared_future<int> fut, int id) {
    int value = fut.get();  // ✅ Every consumer can call get()
    std::cout << "Consumer " << id << " got: " << value << std::endl;
}

int main() {
    std::shared_future<int> fut = std::async(std::launch::async, compute).share();
    
    std::vector<std::thread> consumers;
    for (int i = 0; i < 3; ++i) {
        consumers.emplace_back(consumer, fut, i);
    }
    
    for (auto& t : consumers) {
        t.join();
    }
    
    return 0;
}

Output:

Consumer 0 got: 42
Consumer 1 got: 42
Consumer 2 got: 42

Practical Examples

Parallel Computation

cpp
#include <future>
#include <vector>
#include <numeric>
#include <iostream>
#include <chrono>

long long parallelSum(const std::vector<int>& data) {
    size_t mid = data.size() / 2;
    
    // Compute first half async
    auto futureFirstHalf = std::async(std::launch::async, [&]() {
        return std::accumulate(data.begin(), data.begin() + mid, 0LL);
    });
    
    // Compute second half on this thread
    long long secondHalf = std::accumulate(data.begin() + mid, data.end(), 0LL);
    
    // Combine results
    return futureFirstHalf.get() + secondHalf;
}

int main() {
    std::vector<int> data(10000000, 1);  // 10 million 1s
    
    auto start = std::chrono::high_resolution_clock::now();
    long long sum = parallelSum(data);
    auto end = std::chrono::high_resolution_clock::now();
    
    std::cout << "Sum: " << sum << std::endl;
    std::cout << "Time: " 
              << std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count()
              << "ms\n";
    
    return 0;
}

Multiple Async Tasks

cpp
#include <future>
#include <iostream>
#include <string>
#include <thread>
#include <chrono>

std::string fetchFromDB() {
    std::this_thread::sleep_for(std::chrono::milliseconds(200));
    return "DB Data";
}

std::string fetchFromAPI() {
    std::this_thread::sleep_for(std::chrono::milliseconds(300));
    return "API Data";
}

std::string fetchFromCache() {
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    return "Cache Data";
}

int main() {
    // Fire all requests in parallel
    auto dbFuture = std::async(std::launch::async, fetchFromDB);
    auto apiFuture = std::async(std::launch::async, fetchFromAPI);
    auto cacheFuture = std::async(std::launch::async, fetchFromCache);
    
    // Wait for all (total time = max(200, 300, 50) = 300ms instead of 550ms)
    std::cout << cacheFuture.get() << std::endl;
    std::cout << dbFuture.get() << std::endl;
    std::cout << apiFuture.get() << std::endl;
    
    return 0;
}

📚 Summary

| Component            | Role               | Analogy                  |
|----------------------|--------------------|--------------------------|
| std::async           | Launch async task  | Placing an order         |
| std::future          | Handle to result   | The receipt              |
| std::promise         | Producer side      | The kitchen's commitment |
| std::shared_future   | Multiple consumers | Copies of the receipt    |

| Launch Policy          | Behavior               |
|------------------------|------------------------|
| std::launch::async     | Force new thread       |
| std::launch::deferred  | Lazy, run on get()     |
| Default                | Implementation-defined |

➡️ Up Next

Async/future still has overhead (thread creation, synchronization). For maximum performance, we need lock-free programming...

Atomics → — std::atomic for lock-free basics.