Concurrent cache

During the system implementation we hit a performance ceiling. We receive a lot of requests that trigger complex queries and load the servers, so the response time grows, which is unacceptable. And simply throwing more hardware at the problem is not our way.

The main part of the system is the REST API, and that is what had to be improved.

Log analysis showed that the backend receives a huge number of similar queries, and many of the requests are in fact identical. The first idea: cache the response and return it the next time.

The system is built on PHP/Laravel, and the caching mechanism is a middleware. The simplest solution is unconditional caching, which looks like this:

public function handle($request, Closure $next)  
{  
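    // Build a deterministic cache key from the sorted query parameters and the request path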
    $queryParams = $request->query();  
    $queryParams['url'] = $request->path();  
    ksort($queryParams);  
    $paramHash = md5(serialize($queryParams));  
    $key = 'responseCache:' . $paramHash;  
    $response = null;  
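    // Only GET requests are cached: on a miss, process the request and store a successful response forever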
    if ($request->isMethod('get') && ($response = \Cache::get($key)) === null) {  
        /** @var Response $response */  
        $response = $next($request);  
        if ($response->isSuccessful()) {  
            try {  
                \Cache::forever($key, $response);  
            } catch (\Exception $ex) {  
                \Log::info($ex->getMessage());  
            }  
        }  
    } elseif ($response === null) {  
        $response = $next($request);  
    }  
    return $response;  
}

This method builds a key and checks whether a cached response exists. If it does, it is returned immediately; otherwise the request is processed in full and the result is cached afterwards. Only GET requests are cached.

Cache invalidation is handled elsewhere.
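
Since this version stores entries with Cache::forever, they never expire on their own, so the clean-up code has to drop them explicitly. A minimal sketch (hypothetical, assuming the key is rebuilt exactly the same way as in the middleware):

// Rebuild the key the same way the middleware does and drop it
$queryParams['url'] = $path;   // hypothetical: the query params and path are known here
ksort($queryParams);
\Cache::forget('responseCache:' . md5(serialize($queryParams)));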

But this is the simplest mechanism, and it is not very clever. Our system also handles more complex queries over large amounts of data. If many identical heavy requests arrive before the first one has been cached, they all hit the backend, the system becomes unavailable for a while, and we can lose clients. Each distinct query should be processed only once.

In this situation it would be great to know whether an identical request is already being processed. Then we could wait for it to finish and send the prepared response to all waiting clients.

The script puts a flag into the cache (marking that processing of this request has started), and other processes check it periodically.

Let's look at the code:

<?php namespace Application\Http\Middleware;  
   
use Closure;  
use Illuminate\Http\Response;  
   
/**
 * Middleware that caches responses and makes concurrent identical requests
 * wait for the one that is already being processed
 * @package Application\Http\Middleware
 */
class RequestCache  
{  
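    /** Polling interval in microseconds (0.2 s) while waiting for a pending request to finish */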
    const SLEEP_TIME = 200000;  
   
    /**  
     * Handle an incoming request.  
     *  
     * @param \Illuminate\Http\Request $request  
     * @param \Closure                 $next  
     *  
     * @return mixed  
     */  
    public function handle($request, Closure $next)  
    {  
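        // Middleware parameters beyond $request and $next (e.g. 'data' in 'api.requestCache:data') become extra cache tags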
        $params = func_get_args();  
        $tags = ['response'];  
        if (count($params) > 2) {  
            $tags = array_merge($tags, array_slice($params, 2));  
        }  
        $queryParams = $request->query();  
        $queryParams['url'] = $request->path();  
        ksort($queryParams);  
        $paramHash = md5(serialize($queryParams));  
        $key = 'responseCache:' . $paramHash;  
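        // A separate key marks that an identical request is already being processed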
        $pendingKey = $key . ':pending';  
        $response = null;  
        if ($request->isMethod('get') && ($response = $this->getCache($key, $pendingKey, $tags)) === null) {  
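            // Set the pending flag with a short lifetime so waiters are released even if this process dies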
            \Cache::tags(array_merge($tags, ['pending']))->put($pendingKey, true, 0.3);  
            /** @var Response $response */  
            $response = $next($request);  
            if ($response->isSuccessful()) {  
                try {  
                    \Cache::tags($tags)->put($key, $response, 5);  
                } catch (\Exception $ex) {  
                    \Log::info($ex->getMessage());  
                }  
            }  
            \Cache::tags(array_merge($tags, ['pending']))->forget($pendingKey);  
        } elseif ($response === null) {  
            $response = $next($request);  
        }  
   
        return $response;  
    }  
   
    /**  
     * Get cached data  
     *  
     * @param string $key  
     * @param string $pendingKey  
     * @param array  $tags  
     *  
     * @return Response  
     */  
    protected function getCache($key, $pendingKey, $tags = [])  
    {  
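        // Block while an identical request is in progress; the pending flag's TTL bounds the wait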
        while (\Cache::tags(array_merge($tags, ['pending']))->has($pendingKey)) {  
            usleep(self::SLEEP_TIME);  
        }  
        if (\Cache::tags($tags)->has($key)) {  
            return \Cache::tags($tags)->get($key);  
        }  
   
        return null;  
    }  
}  

But the application cannot wait forever: the process that set the flag may die, so the flag must have a lifetime. In the code above it is stored for 0.3 minutes, and waiting processes poll it every SLEEP_TIME = 200 000 microseconds (0.2 s).

Then we need to attach the middleware to our application and bind it to a route:

Route::group(['middleware' => ['api.requestCache:data']], function () {   
    Route::get('data', 'DataController@index');  
});

Here api.requestCache is the middleware alias and data is the tag passed to it as a parameter.
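
The alias itself has to be registered in the HTTP kernel, roughly like this (a sketch assuming a standard Laravel 5.x app/Http/Kernel.php and the middleware class shown above):

// app/Http/Kernel.php
protected $routeMiddleware = [
    // ...
    'api.requestCache' => \Application\Http\Middleware\RequestCache::class,
];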

To simplify working with the cached data, it is better to tag it; the cache can then be dropped by these tags.
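
For example, when the underlying data changes, everything cached under the data tag can be invalidated in one call (a sketch using Laravel's tagged cache; tags require a store such as Redis or Memcached, not the file or database driver):

// Drop every cached response that was stored with the 'data' tag
\Cache::tags(['data'])->flush();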


