I have this function, which retrieves the page title for every URL of a website:
function getTitle($url)
{
    $pages = file_get_contents($url);
    $title = preg_match('/<title[^>]*>(.*?)<\/title>/ims', $pages, $match) ? $match[1] : null;
    return $title;
}
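For context, the loop is roughly this (the $urls array is a hypothetical stand-in for however the list of URLs is built):

foreach ($urls as $url) {
    echo getTitle($url), PHP_EOL;
}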
That loop works perfectly (Good Results), but I want to cache the file_get_contents call, so I did:
function getTitle($url)
{
    $pages = cache()->remember('key', now()->addDay(), fn () => file_get_contents($url));
    $title = preg_match('/<title[^>]*>(.*?)<\/title>/ims', $pages, $match) ? $match[1] : null;
    return $title;
}
On the one hand the caching works (it's hyper fast now), but on the other hand all the titles are now the same (Bad Results).
Where is my logic at fault? This is the first time I'm using caching.
CodePudding user response:
The fetched page depends on $url, but you are caching the result of file_get_contents under a fixed key, so no matter the $url value, the same cache entry is returned.
Make the cache key depend on the URL, though I don't know whether it's the performance upgrade you're looking for:
$pages = cache()->remember('key-'.$url, now()->addDay(), fn () => file_get_contents($url));
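For completeness, a minimal sketch of the whole function with a per-URL key (consider hashing the key, e.g. md5($url), if your cache driver is picky about key characters):

function getTitle($url)
{
    // One cache entry per URL, refreshed after a day
    $pages = cache()->remember('key-'.$url, now()->addDay(), fn () => file_get_contents($url));
    return preg_match('/<title[^>]*>(.*?)<\/title>/ims', $pages, $match) ? $match[1] : null;
}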
You should either make the cache last longer and refresh it whenever a page is created or modified, or have a cron job that refreshes the cache daily.
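If you go the cron route, a rough sketch using Laravel's scheduler in app/Console/Kernel.php (getUrls() is a hypothetical helper returning the site's URL list):

protected function schedule(Schedule $schedule)
{
    $schedule->call(function () {
        foreach (getUrls() as $url) {
            // Rewrite each entry unconditionally; no TTL needed because
            // this job repopulates the cache every day
            cache()->forever('key-'.$url, file_get_contents($url));
        }
    })->daily();
}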