The Hidden Control Loop of Modern Software#
Introduction#
Modern systems—whether they’re robots, ATMs, or AI assistants like ChatGPT—need to stay responsive. Imagine pressing a button on an ATM and the whole machine freezing until cash dispenses. Or asking ChatGPT something, and it makes you wait in a queue behind 100 people.
This is where Asynchronous (async) programming comes in. Async isn’t just a coding trick; it’s a principle of software efficiency, comparable to control theory in engineering.
Synchronous (Sync): Tasks run one after another, each must finish before the next starts.
Asynchronous (Async): Tasks can overlap in time, allowing the CPU to handle other work while waiting.
What is Async Programming?#
In traditional (synchronous) code:
Tasks run one after another.
If one task waits (e.g., network call, disk read), the whole program waits.
In async programming:
Tasks don’t block each other.
While one task waits, the CPU does other work.
Result → smooth multitasking without needing multiple CPUs/threads.
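A minimal sketch of that idea with Python's asyncio (the task names and delays are made up for illustration):

import asyncio

async def fetch_data():
    # Pretend this is a slow network call; await hands control back while waiting
    await asyncio.sleep(2)
    return "data"

async def other_work():
    # Runs while fetch_data() is still waiting on its simulated I/O
    print("doing other work...")

async def main():
    task = asyncio.create_task(fetch_data())  # start the slow task
    await other_work()                        # keep the CPU busy meanwhile
    print(await task)                         # collect the result once it is ready

asyncio.run(main())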
Async and Control Theory#
Control theory ensures systems remain stable by reacting quickly to changes. Delayed feedback can make a physical system unstable.
Async plays the same role in computing:
Without async → software stalls, delays cascade, users lose responsiveness.
With async → systems remain stable, responsive, and efficient.
Example:
A robot running sync code → can’t read sensors until the motor stops. Unstable.
A robot with async → reads sensors + moves simultaneously. Stable.
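A toy version of that robot in Python's asyncio (the sensor and motor coroutines are simulated, not a real robotics API):

import asyncio

async def read_sensors():
    # Keep sampling sensors every 100 ms, even while the motor is moving
    for _ in range(5):
        print("sensor reading")
        await asyncio.sleep(0.1)

async def move_motor():
    # A "long" motor move that would freeze a purely synchronous robot
    print("motor start")
    await asyncio.sleep(0.5)
    print("motor stop")

async def control_loop():
    # Both coroutines make progress concurrently on a single thread
    await asyncio.gather(read_sensors(), move_motor())

asyncio.run(control_loop())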
Async in LLM Applications (AI)#
One of the best examples today is Large Language Models (LLMs):
Sync LLM requests: Each user request waits until the previous one is fully served.
→ Slow, costly, unscalable.
Async LLM requests: While one request waits on the GPU, others are scheduled.
→ Many users served at once.
Other AI benefits:
Streaming outputs → get words as they’re generated.
Parallel pipelines in RAG → data fetching, embedding, and querying overlap.
Cost efficiency → GPUs never sit idle.
Without async, tools like ChatGPT would feel like waiting in a long bank queue.
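Streaming, for example, maps naturally onto an async generator. Here is a small sketch with a simulated token source (not a real LLM client):

import asyncio

async def stream_tokens(prompt):
    # Simulated generation; a real client would read tokens from the model/server
    for token in ["Async", "keeps", "ChatGPT", "responsive"]:
        await asyncio.sleep(0.2)  # pretend the model is still computing
        yield token

async def main():
    async for token in stream_tokens("why async?"):
        print(token, end=" ", flush=True)  # words appear as they are generated
    print()

asyncio.run(main())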
Async in Broader Computer Science#
Async bridges software and hardware:
Hardware (OS level) → event notification mechanisms (epoll, IOCP) tell the event loop when I/O is ready.
Software (language level) → async/await in Python, JavaScript, and Rust; Go achieves the same effect with goroutines.
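A rough sketch of the OS-level half, using Python's selectors module (which picks the best readiness API available, e.g. epoll on Linux); the socket pair simply stands in for real network I/O:

import selectors
import socket

sel = selectors.DefaultSelector()           # backed by epoll/kqueue where available
rsock, wsock = socket.socketpair()          # stand-in for a network connection
sel.register(rsock, selectors.EVENT_READ)   # ask the OS to flag when it is readable

wsock.send(b"ping")                         # data "arrives from the network"

for key, _ in sel.select():                 # wakes up only when some I/O is ready
    print(key.fileobj.recv(1024))           # read without busy-waiting

sel.close()
rsock.close()
wsock.close()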
Applications:
Web servers (FastAPI, Node.js).
Robotics (ROS2 event-driven sensors).
Security (fuzzing multiple endpoints in parallel).
AI (non-blocking training/inference pipelines).
It’s a unifying technique across operating systems, software frameworks, and applications.
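Taking the web-server case as a concrete example, here is a minimal FastAPI sketch (the route and the simulated delay are illustrative):

import asyncio
from fastapi import FastAPI

app = FastAPI()

@app.get("/ask")
async def ask():
    # Simulate a slow backend call (database, LLM, etc.)
    await asyncio.sleep(1)
    # While this handler awaits, the server keeps accepting other requests
    return {"answer": "async keeps the server responsive"}

# Run with an ASGI server, e.g. `uvicorn main:app` if this file is named main.py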
Demo: Async in Action#
import asyncio
import time

# Simulate an LLM request (each takes 3 seconds)
def sync_llm_request(name):
    time.sleep(3)
    print(f"{name} response ready")

async def async_llm_request(name):
    await asyncio.sleep(3)
    print(f"{name} response ready")

# Synchronous version
def sync_main():
    start = time.time()
    for i in range(1, 4):
        sync_llm_request(f"Request {i}")
    end = time.time()
    print(f"Synchronous total time: {end - start:.2f} seconds")

# Asynchronous version
async def async_main():
    start = time.time()
    tasks = [async_llm_request(f"Request {i}") for i in range(1, 4)]
    await asyncio.gather(*tasks)
    end = time.time()
    print(f"Asynchronous total time: {end - start:.2f} seconds")

# Run demo
print("\n--- Sync Execution ---")
sync_main()

print("\n--- Async Execution ---")
asyncio.run(async_main())

Result:
Sync → 9 seconds.
Async → 3 seconds.
Just as in real AI servers, async cuts waiting time dramatically.
Conclusion#
Async programming isn’t just about speed. It’s about:
Responsiveness: Systems feel alive.
Efficiency: Hardware is fully utilized.
Scalability: Serve more users without more CPUs.
Just as control theory keeps machines stable, async keeps digital systems efficient and responsive. From robots to LLMs, async is the silent engine of modern computing.
Async is not faster computing—it’s smarter computing.
Haris
FAST-NUCES
BS Computer Science | Class of 2027
🔗 Portfolio: zenvila.github.io
🔗 GitHub: github.com/Zenvila
🔗 LinkedIn: linkedin.com/in/haris-shahzad-7b8746291
🔬 Member: COLAB (Research Lab)
