Complete Architecture & Flow Analysis
Multiple clients send HTTP requests to the proxy server, which listens on port 8080
The main process accepts each connection and hands it to a worker thread, serving repeated requests from an in-memory cache
Origin web servers are the targets from which the proxy fetches content on a cache miss
Up to 400 concurrent worker threads, with admission controlled by a counting semaphore
A 200MB cache with a 10MB maximum element size, kept as a linked list in LRU order
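The semaphore/mutex/linked-list vocabulary above suggests a C (pthreads) implementation, though the source shows no code. As a minimal sketch under that assumption, a cache element might look like the following; all names and the URL buffer size are illustrative, while the two size limits come from the description above:

```c
#include <assert.h>
#include <stddef.h>

/* Limits taken from the design above. */
#define MAX_CACHE_SIZE  (200 * 1024 * 1024)  /* 200MB total cache */
#define MAX_OBJECT_SIZE (10 * 1024 * 1024)   /* 10MB per element  */

/* One cached response; nodes form a doubly linked list kept in LRU
 * order (head = most recently used, tail = next eviction candidate). */
typedef struct cache_node {
    char url[2048];            /* request URL used as the cache key */
    char *data;                /* response bytes */
    size_t size;               /* length of data */
    struct cache_node *prev;
    struct cache_node *next;
} cache_node_t;

/* A response is cacheable only if it fits the per-element limit. */
int cacheable(size_t object_size) {
    return object_size <= MAX_OBJECT_SIZE;
}
```

With a doubly linked list, moving a node to the head on a hit and unlinking the tail on eviction are both O(1); only the lookup walk is linear.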
A counting semaphore caps concurrent connections; a mutex serializes access to the shared cache
A newly accepted connection blocks on the semaphore until a connection slot becomes available
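The admission control described here maps directly onto a POSIX counting semaphore: initialize it with one token per slot, take a token per connection, and return it when the worker finishes. A hedged sketch (function names are illustrative, not from the source):

```c
#include <assert.h>
#include <semaphore.h>

#define MAX_CONNECTIONS 400   /* matches the 400-thread limit above */

static sem_t conn_slots;

/* Called once at startup: the semaphore begins with one token per slot. */
void init_limits(void) {
    sem_init(&conn_slots, 0, MAX_CONNECTIONS);
}

/* Each accepted connection takes a slot, blocking when all 400 are in
 * use, and releases it when its worker thread finishes. */
void acquire_slot(void) { sem_wait(&conn_slots); }
void release_slot(void) { sem_post(&conn_slots); }
```

Because `sem_wait` blocks rather than failing, the 401st client simply queues until some earlier connection completes.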
The worker reads the complete HTTP request from the client socket
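"Complete" here means reading until the blank line (`\r\n\r\n`) that terminates the HTTP header block, since a single `read` may return only part of the request. A minimal sketch, assuming a C implementation (the buffer size and function name are illustrative):

```c
#include <string.h>
#include <unistd.h>

#define REQ_BUF 8192

/* Read from fd until the blank line ending the HTTP header block
 * ("\r\n\r\n") is seen, the buffer fills, or the peer closes.
 * Returns the number of bytes read, or -1 on a read error. */
ssize_t read_request(int fd, char *buf, size_t cap) {
    size_t used = 0;
    while (used < cap - 1) {
        ssize_t n = read(fd, buf + used, cap - 1 - used);
        if (n < 0) return -1;       /* read error */
        if (n == 0) break;          /* client closed the connection */
        used += (size_t)n;
        buf[used] = '\0';
        if (strstr(buf, "\r\n\r\n") != NULL)
            break;                  /* complete header block received */
    }
    return (ssize_t)used;
}
```

The same loop works on any descriptor, so it can be exercised with a pipe instead of a live socket.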
It then searches the cache for the requested URL while holding the mutex
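The mutex matters because many workers search and update the same list concurrently. A self-contained sketch of a lookup under the lock (the entry layout here is a simplified stand-in, not the source's actual structure):

```c
#include <pthread.h>
#include <string.h>

/* Minimal stand-in for the cache list: key + payload pairs. */
typedef struct entry {
    const char *url;
    const char *response;
    struct entry *next;
} entry_t;

static entry_t *cache_head = NULL;
static pthread_mutex_t cache_lock = PTHREAD_MUTEX_INITIALIZER;

/* Walk the list under the mutex so concurrent workers never observe a
 * half-updated cache; returns the cached response, or NULL on a miss. */
const char *cache_lookup(const char *url) {
    const char *hit = NULL;
    pthread_mutex_lock(&cache_lock);
    for (entry_t *e = cache_head; e != NULL; e = e->next) {
        if (strcmp(e->url, url) == 0) { hit = e->response; break; }
    }
    pthread_mutex_unlock(&cache_lock);
    return hit;
}
```

A single mutex is the simplest correct choice; a reader-writer lock would allow concurrent hits at the cost of complexity.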
On a hit, the cached response is written straight back to the client
On a miss, the proxy connects to the origin server and streams the response back to the client
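Streaming typically means relaying the origin's bytes to the client as they arrive, while keeping a side copy (up to the 10MB element limit) in case the response should be cached afterwards. A hedged sketch of that relay loop; the function name and chunk size are illustrative, and it works on any pair of descriptors:

```c
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

#define MAX_OBJECT (10 * 1024 * 1024)  /* 10MB cacheable-size cap */

/* Relay the origin server's response to the client as it arrives,
 * accumulating a copy (up to MAX_OBJECT bytes) so the caller can
 * insert it into the cache afterwards.  Returns total bytes relayed;
 * *saved is the copy, or NULL if the response grew past the cap. */
ssize_t relay_response(int origin_fd, int client_fd,
                       char **saved, size_t *saved_len) {
    char chunk[4096];
    char *copy = malloc(MAX_OBJECT);
    size_t copied = 0, total = 0;
    ssize_t n;
    while ((n = read(origin_fd, chunk, sizeof chunk)) > 0) {
        if (write(client_fd, chunk, (size_t)n) != n) {
            free(copy);
            return -1;                    /* client went away */
        }
        if (copy && copied + (size_t)n <= MAX_OBJECT) {
            memcpy(copy + copied, chunk, (size_t)n);
            copied += (size_t)n;
        } else {                          /* too large to cache */
            free(copy);
            copy = NULL;
        }
        total += (size_t)n;
    }
    *saved = copy;
    *saved_len = copy ? copied : 0;
    return (ssize_t)total;
}
```

Forwarding chunk by chunk keeps memory bounded and lets the client start receiving before the origin finishes sending.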
Responses small enough to cache are then inserted, evicting least-recently-used entries to make room