Replies: 1 comment
Found a solution, it works like this:

```php
// 55.000 models from database
$sitemap = $generator->getSitemap();

ModelName::all()->each(function ($model) use ($sitemap) {
    $sitemap->add(
        Url::create($model->url->show)
            ->setLastModificationDate($model->updated_at)
    );
});
```

Calling `$generator->getSitemap()` a single time and reusing the returned sitemap object means the crawl, and with it the `robots.txt` request, only happens once; every subsequent `add()` just appends to the sitemap in memory.
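For completeness, here is a sketch of the same idea written end to end. It is an assumption-laden variant, not the poster's code: `https://example.com` is a placeholder start URL, `ModelName` and the `$model->url->show` accessor are taken from the reply above, and `cursor()` is used to stream the 55.000 models instead of loading them all at once with `all()`.

```php
use Spatie\Sitemap\SitemapGenerator;
use Spatie\Sitemap\Tags\Url;
use App\Models\ModelName; // placeholder model name from the thread

// Crawl once; robots.txt is only requested during this single call.
$sitemap = SitemapGenerator::create('https://example.com')->getSitemap();

// Stream the models one by one instead of loading all 55.000 into memory.
foreach (ModelName::query()->cursor() as $model) {
    $sitemap->add(
        Url::create($model->url->show)
            ->setLastModificationDate($model->updated_at)
    );
}

// Write the combined (crawled + manually added) sitemap to disk.
$sitemap->writeToFile(public_path('sitemap.xml'));
```

If the crawled URLs are not needed at all, building the sitemap with `Sitemap::create()` instead of a generator skips the crawl, and therefore the `robots.txt` request, entirely.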
Dear people of Spatie,

For our site we use this package to crawl a little, and for a large part we manually add the URLs to the sitemap ourselves.

My problem is that for every URL added to the sitemap (`$generator->getSitemap()->add($url)`) there is a request to `robots.txt` on my server as well. Looking at my logs, this goes on for about 55.000 times. This is a bit of a drain on the server that I'd rather not have. Is there a way to reduce this to a single request for `robots.txt` that is then reused from memory over and over again?
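A minimal sketch of the pattern described in this post, reconstructed from the description rather than copied from it; `ModelName` and the `$model->url->show` accessor are borrowed from the reply above. The point is that `$generator->getSitemap()` sits inside the loop, so every iteration re-runs the generator, which, as the logs describe, requests `robots.txt` again:

```php
// Pattern described in the post: getSitemap() called inside the loop.
ModelName::all()->each(function ($model) use ($generator) {
    // Each call re-runs the generator, triggering another robots.txt request.
    $generator->getSitemap()->add(
        Url::create($model->url->show)
            ->setLastModificationDate($model->updated_at)
    );
});
```

Hoisting `$generator->getSitemap()` out of the loop, as in the reply above, is what brings this down to a single request.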