index.html should be excluded from service worker cache #474
How to do the above in
This will essentially disable offline capabilities, as you cannot load index.html offline. However, the current solution also breaks, as the old bundle is loaded as described above. It would be nice to have a built-in solution to this in preact-cli.
I think what we need to do is force the bundle hash to change when ANY chunk changes. Because we are extracting page chunks, which encapsulate the components, it COULD be the case that a route bundle changes but the main bundle hash doesn't change. Otherwise, the current setup is simply how service workers work: you will get a stale app on the first visit after the new assets have been cached. Not caching the index does not solve this issue, as attempted in the linked PR, because an HTML file is needed in order for the offline app to boot. Disregarding that point, if the new HTML file still had an asset link to a "new" bundle whose hash did not change, then the same problem would occur.
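One way to realize the hash-coupling idea above is a sketch like the following (hypothetical, not preact-cli's actual config): webpack's `[hash]` placeholder is computed over the whole compilation, unlike the per-chunk `[chunkhash]`, so using it in the main output filename forces the main bundle's name to change whenever any chunk changes.

```javascript
// preact.config.js — hypothetical sketch, not the project's actual config.
// [hash] is webpack's build-wide hash: it changes whenever ANY module in the
// compilation changes, so the main bundle filename is coupled to all chunks.
export default (config /*, env, helpers */) => {
  config.output.filename = '[name].[hash:5].js';
  config.output.chunkFilename = '[name].chunk.[hash:5].js';
};
```

The tradeoff is that every deploy then invalidates every bundle's cache entry, trading cache efficiency for correctness.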
@lukeed @sdbondi @kimamula I spent a little more time on this. From my research so far, I uploaded my repro into a separate repo here, but voilà: I could not reproduce the issue as reported originally. As documented in that repo, it seems that at step 5 above, sw.js in fact has a new hash for index.html, which is downloaded correctly when the browser is refreshed. Am I missing something?
I guess if a route directly included in root's … The simplest solution to this is to extract webpack's asset manifest and add it to index.html.
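A sketch of the manifest idea, assuming the third-party `webpack-manifest-plugin` package (the `fileName` value is arbitrary):

```javascript
// preact.config.js — hypothetical sketch using webpack-manifest-plugin.
// It emits a JSON map from stable names (e.g. "bundle.js") to the hashed
// filenames actually produced, which index.html (or a small inline loader)
// can consult so stale asset links never point at deleted hashes.
const ManifestPlugin = require('webpack-manifest-plugin');

export default (config) => {
  config.plugins.push(new ManifestPlugin({ fileName: 'asset-manifest.json' }));
};
```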
I agree. @osdevisnot As shown in your image, … Therefore, the older version of … It seems to me that setting …
|
```js
const SWPrecacheWebpackPlugin = helpers.getPluginsByName(config, 'SWPrecacheWebpackPlugin')[0];
if (SWPrecacheWebpackPlugin) {
  const { plugin } = SWPrecacheWebpackPlugin;
  const unwanted = [
    /^[^\.]*(\.html)?$/
  ];
  plugin.options.staticFileGlobsIgnorePatterns =
    plugin.options.staticFileGlobsIgnorePatterns.concat(unwanted);
  plugin.options.runtimeCaching = plugin.options.runtimeCaching || [];
  plugin.options.runtimeCaching.push({
    urlPattern: /^[^\.]*(\.html)?$/,
    handler: 'networkFirst'
  });
}
```

Then the online version is tried first, falling back to the cache when the network is unavailable.
I had something similar in mind. Does full offline support work with this?
Even though I didn't take a lot of time to test it, it seems to work quite well.
Why is this one closed? Doesn't this bug occur every time a new version of bundle.{hash}.js is produced? |
I am not sure how helpful a vanilla service worker is, but here are my two cents on this problem:
The key is the navigator.onLine check: if the user is online, always fetch the latest HTML from the network and register it in the cache; if offline, serve the cached response. Hope this helps someone.
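The decision logic described above can be sketched as a plain function. The dependency names here are hypothetical stand-ins: in a real sw.js fetch handler they would be `fetch()`, `caches.match()`, and `cache.put()`.

```javascript
// Network-first-when-online, cache-when-offline, modeled as a plain
// function so the strategy is readable (and testable) outside a worker.
async function respondTo(request, { isOnline, fetchFromNetwork, readFromCache, writeToCache }) {
  if (isOnline) {
    // Online: always hit the network and refresh the cached copy,
    // so the latest HTML is what the next offline visit will see.
    const response = await fetchFromNetwork(request);
    await writeToCache(request, response);
    return response;
  }
  // Offline: fall back to whatever was cached last.
  return readFromCache(request);
}
```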
@Ganpatkakar Which file do you register the service worker in? I register mine in the
Do you want to request a feature or report a bug?
bug
What is the current behaviour?
bundle.{hash}.js tries to load the old version of route-{page}.chunk.{hash}.js when JS code is modified, which results in an error.
If the current behaviour is a bug, please provide the steps to reproduce.
- `$ npm run serve`
- Modify the JS code (e.g., add `console.log('something has changed');` at the `render()` method)
- `$ npm run serve` again
- The new service worker calls `skipWaiting()`, which triggers deletion of the old caches (such as `route-profile.chunk.c26a1.js`)

What is the expected behaviour?
All the client side code should be updated to the latest version when JS code is modified.
Excluding index.html from the service worker cache (i.e., adding `/index\.html$/` here), which prevents the old index.html from requesting the old bundle at step 6-ii, should fix the problem.
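The proposed exclusion could look roughly like this in preact.config.js (a sketch assuming the same SWPrecacheWebpackPlugin options manipulated earlier in the thread):

```javascript
// preact.config.js — hypothetical sketch: keep index.html out of the
// precache so the browser always fetches the HTML (and therefore the
// current hashed asset links) from the network.
export default (config, env, helpers) => {
  const found = helpers.getPluginsByName(config, 'SWPrecacheWebpackPlugin')[0];
  if (found) {
    const { plugin } = found;
    plugin.options.staticFileGlobsIgnorePatterns =
      (plugin.options.staticFileGlobsIgnorePatterns || []).concat([/index\.html$/]);
  }
};
```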