consider allow late importScripts/import for local URLs #1595
Comments
(taking a look at this with @mfalken) This seems fair to us, since you can …
The motivation is that perhaps we could also then treat extension URLs as local, since they don't hit the network. That would provide a path for extension service workers to do late import/importScripts, etc. It seemed like a principled way to solve the problem.
@annevk do you have an opinion on this? Would it be fair to treat an extension URL as "local" if it's guaranteed not to hit the network? For example, I note that …
I'm not sure we considered … For …
One non-extension use case is loading script from IDB as a Blob. Today you would have to read it all into memory and use eval(), but importing from a blob URL would be more natural. I know of one major site storing script in IDB, although I don't think they need to import it in their service worker today. I think the main alternative is #1585. That would support some form of … I know we could just throw this all behind a big "if extension" conditional and do whatever we want, but it would be nice if we could find a more principled alignment with the spec. That leads to fewer surprises for developers, less maintenance burden, etc.
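A minimal sketch of that IDB use case, assuming the script was previously stored as a Blob (the database name 'code-cache', store 'scripts', and key 'bundle' are made up for illustration). The blob-URL variant is commented out because URL.createObjectURL is not currently exposed in service workers:

```js
// Sketch only — names ('code-cache', 'scripts', 'bundle') are illustrative.
function getScriptBlob() {
  return new Promise((resolve, reject) => {
    const open = indexedDB.open('code-cache');
    open.onerror = () => reject(open.error);
    open.onsuccess = () => {
      const req = open.result
        .transaction('scripts')
        .objectStore('scripts')
        .get('bundle');
      req.onsuccess = () => resolve(req.result);
      req.onerror = () => reject(req.error);
    };
  });
}

getScriptBlob().then(async (blob) => {
  // Today: read the whole script into memory and eval it.
  const source = await blob.text();
  (0, eval)(source);

  // What this issue would make more natural (would also require blob URL
  // creation to be re-enabled in service workers, see the discussion below):
  // importScripts(URL.createObjectURL(blob));
});
```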
It would be great to get the WECG to document/spec how WebExtension upgrades relate to ServiceWorker upgrades. Is there an invariant that, if a WebExtension's ServiceWorker upgrades from v1 to v2, there is a mechanism or strict ordering for the upgrade so that the v1 SW always sees the resources from the v1 WebExtension and the v2 SW always sees the resources from the v2 WebExtension? Or would implementing the local-only optimization create a problem where the SWs can effectively be out of sync with the "local" WebExtension resources? If that's addressed, this seems like a reasonable proposal.
I can describe what happens currently in Chrome/Firefox. All files in an installed extension (from the add-on gallery) are frozen to their original installed content; they will never change, so AFAICT there is no need for the upgrade event in the service worker. That said, browser makers may want to offer a simplified hot reload for the service worker of a locally developed unpacked extension, but that isn't implemented yet. When the entire extension is updated (or the reload button is pressed on a locally developed unpacked extension), the browser simply uninstalls the old extension (retaining its data) and installs the new one.
I'm not sure how this would work? We explicitly disallowed service workers from creating blob URLs, because their lifetime would be even more confusing than that of blob URLs created elsewhere. For this particular use case the lifetime seems like it would be fine, but we'd have to re-enable blob URL creation in service workers for this to be possible.
@asutherland When reloading an extension, we first Deactivate the extension, which stops the old service worker. When Deactivate is finished, we Activate the extension, which installs the new service worker. So, the v1 SW would always see the resources from the v1 extension and the v2 SW would always see the resources from the v2 extension.
Then it seems like letting local URLs be imported whenever is fine. I do wonder if it might make sense to be conservative and also define that the local URL must be same-origin. This would mainly be to help side-step issues like extensions depending on the script resources of other extensions. (Maybe this is forbidden in practice by the underlying extension schemes, but there's no spec yet to clarify that.) This would forbid data URLs for late import as a side effect, but I'm not sure that's a bad thing; it seems more desirable to let import take Blobs or Responses or anything else, rather than enabling createObjectURL or incentivizing data URLs (which can be a strain on devtools / introspection tools that don't optimize for multi-megabyte URLs).
I'm not sure I understand why we would need to require late imports to be same-origin. No other worker has that restriction. What is it about this situation that would require a same-origin restriction?
@ghazale-hosseinabadi is it possible for one extension to load URLs targeting resources in a different extension? Or are they isolated from one another?
Extensions are isolated from one another and it is not possible for one extension to load URLs targeting resources in a different extension.
@ghazale-hosseinabadi Is it possible for content to load URLs from extensions?
While talking with @mfalken about this issue, we started wondering what service workers do for data URL imports today. It seems we have some inconsistencies: https://sw-import-data-url.glitch.me/ Chrome seems to run … Edit: But Firefox does seem to support importScripts() of data URLs in dedicated workers.
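For context, a probe of the kind that test presumably runs (this is an assumed shape, not the glitch page's actual source): a service worker that attempts importScripts() with a data: URL during evaluation.

```js
// sw.js — assumed shape of the data: URL test, not its actual source.
try {
  importScripts('data:application/javascript,self.importedFromDataURL=true;');
  console.log('importScripts(data:) succeeded:', self.importedFromDataURL);
} catch (e) {
  console.log('importScripts(data:) threw:', e.name, e.message);
}
```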
Thank you both for looking into that! As you are probably aware, Firefox stores ServiceWorker scripts in its Cache API implementation, and this likely makes data URIs subject to the scheme enforcement of step 4 of Cache.put.
@asutherland yes, please refer to https://developer.chrome.com/docs/extensions/mv3/content_scripts/: "They can also access the URL of an extension's file with chrome.runtime.getURL() and use the result the same as other URLs." The docs illustrate this with a snippet beginning "// Code for displaying /images/myimage.png:".
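Roughly, that documented pattern looks like the following sketch (the element id is illustrative, not taken from the docs verbatim):

```js
// content-script.js — code for displaying /images/myimage.png; the element
// id 'someImage' is made up for illustration.
const imgURL = chrome.runtime.getURL('images/myimage.png');
document.getElementById('someImage').src = imgURL;
```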
@ghazale-hosseinabadi, @asutherland, only files exposed explicitly via web_accessible_resources are accessible to content scripts and web pages. By default, non-exposed URLs are not accessible from outside the extension's own origin, which is chrome-extension://<id>. Note that technically chrome.runtime.getURL doesn't grant access to a URL: the method simply prepends a literal string containing the extension's origin, e.g. …
In theory local scripts would not need to be offlined in any storage, so Firefox's use of the Cache API would not be a problem if we made an early exemption in the algorithm like "if local, do a normal importScripts()".
Agreed. If we're going to make an exception for 'local', always-available-offline-anyway URLs, then we'll bypass the map for them.
Is the lifetime of blob URLs tied to the client lifetime, or is it more complicated than that? I agree it's a footgun if folks start creating blob URLs in a service worker and expect to be able to use them in a page. I guess there's no easy way to create a blob URL that can only be used from the client that created it? It's still possible to turn these things into data URLs, so it might not be a big deal.
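That "turn it into a data URL" workaround could look something like the sketch below; it deliberately avoids URL.createObjectURL, since that isn't exposed in service workers:

```js
// Sketch: convert a script Blob into a data: URL that importScripts()
// could consume, without needing createObjectURL in the service worker.
async function blobToDataURL(blob) {
  const source = await blob.text();
  return 'data:application/javascript,' + encodeURIComponent(source);
}

// Usage — only useful late if this proposal (late local imports) is adopted:
// blobToDataURL(scriptBlob).then((url) => importScripts(url));
```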
I guess rather than encourage developers to use more blob URLs, we could somehow make … Just making …
Yeah, continuing to not allow …
Ahh yeah, that works around the deadlock issue with responses.
Oh, …
I think the issue is if …
I guess …
Yeah, let's just stick with …
I'm confused. Service workers being prevented from creating blob URLs seems orthogonal to whether … I'm not saying there is a strong use case there, but the creation of blob URLs and the loading of blob URLs are separate things.
See #1595 (comment) and w3c/FileAPI#153.
Yea, I missed that. I have to say I don't understand why we would introduce some new kind of isolation instead of just making blob URLs only readable by contexts with the same StorageKey. Left a comment on that issue to that effect.
Also, one of my goals with this issue was to minimize how many special exceptions we have to our general rules. Saying "local URLs work like X" seemed like a simple rule. But saying "local URLs, except blob URLs, work like X" seems less principled to me, and I don't really understand the motivation.
But even if we do w3c/FileAPI#153, why would we need something in the service worker importScripts algorithm to treat blob URLs specially? Couldn't the importScripts algorithm just operate on all local URLs equally and let the blob URL loading algorithm handle whether to fail or not?
Do we have consensus on this? To use the normal non-SW …
Yeah, ok, my objection to blob URLs isn't particularly strong. A page and a service worker would be in different agent clusters, yeah?
Yes. A service worker is always in a separate agent cluster, I believe. The only way that would change is if we allowed nested dedicated workers in a service worker, which we want but have not done yet.
Ok, that sounds good to me. I'm happy with your proposal.
I'm fine with this. I think this does introduce somewhat of an edge case in terms of potential breakage via changes to web_accessible_resources (which are local but not same-origin, as I understand it) mentioned in #1595 (comment). However, I think for Firefox our plan will be to unregister (or update) any ServiceWorker registration touched by a web extension in any way when the extension is upgraded or uninstalled, which side-steps the issue. I don't think anyone from Safari/WebKit has weighed in?
@youennf any thoughts on this? The proposal is to allow …
Seems ok to me, though I am not a big fan of blob URLs in general.
My worry is allowing blob URLs to work temporarily and then later not being able to prevent them from working.
Don't you already have that problem with …? It seems to me the best approach for your concern would be to get to consensus in w3c/FileAPI#153 on whether to attempt agent cluster isolation or not. And if you want to attempt it, then you can get consensus there not to implement blob URL loading in service workers for now. But that seems orthogonal to whether to support late loading of local URLs in service workers. If you can at least get consensus to try, then implementations could have a check for blobs in the service worker code with a TODO comment saying that this should be converted to an agent cluster check in the blob loader in the future. I think in the end, though, this kind of restriction should be part of the blob spec/impl and not buried in some exceptional if-statement in the middle of the service worker spec/impl.
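To make that stop-gap concrete, the implementation-side check might look roughly like the sketch below; it is conceptual only (not spec text or real engine code), and isLocalURL is a hypothetical helper standing in for Fetch's "local scheme" check:

```js
// Conceptual sketch of the suggested stop-gap, not spec text.
function mayImportLate(scriptURL) {
  if (!isLocalURL(scriptURL)) return false;   // hypothetical helper: data:, blob:, about:
  if (scriptURL.protocol === 'blob:') {
    // TODO: replace with an agent-cluster check in the blob URL loading
    // algorithm once w3c/FileAPI#153 is resolved.
    return false;
  }
  return true;
}
```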
Well, I do agree on what this should look like long term, and if you want to block resolving this on w3c/FileAPI#153, that would work for me as well.
I don't think we need to block on w3c/FileAPI#153. It looks like @arichiv is already investigating agent cluster isolation in Chrome. So I think an implementation check and a TODO comment pointing to that investigation would be reasonable to prevent usage volume from increasing for now.
@jakearchibald I am facing an issue with importing a local Firebase CDN file in the service worker (background.js) file. I can't seem to load other scripts or files locally from one file to another. Is there any way to achieve this? I have tried importScripts, import, injectScripts, and using URLs, but none of these approaches seem to work.

File structure: firebase-app.js

// Firebase configuration object
// Initialize Firebase
// Initialize Firebase Cloud Messaging and request permission to get the token
// Request for Notification Permission and Get Registration Token
// Call the function to get the registration token
#1595 (comment) seems to be about browser extension service workers specifically. @oliverdunk, anything to comment from your end?
Thanks Thomas!
Without seeing your project running and the errors you are getting, it's hard to say anything with certainty. That said, I don't think the problems here are related to this issue. As an example, there is a service-worker-specific version of the Firebase messaging SDK (link). We don't have any Chrome guidance on making this work currently, but there are some external blog posts you might find helpful, like this one. If you have any other questions or follow-up, please use the chromium-extensions mailing list; that way we can avoid adding too much noise to the spec discussion :)
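For what it's worth, with the namespaced Firebase builds an extension background service worker usually looks something like the sketch below; the file names and config values are placeholders, and in MV3 the Firebase scripts must be bundled inside the extension rather than imported from a CDN:

```js
// background.js — a sketch only. File names and config values are
// placeholders; MV3 extensions cannot import remotely hosted (CDN) code,
// so the Firebase files must ship inside the extension package.
importScripts('firebase-app.js', 'firebase-messaging.js');

firebase.initializeApp({
  apiKey: 'YOUR_API_KEY',
  projectId: 'YOUR_PROJECT_ID',
  messagingSenderId: 'YOUR_SENDER_ID',
  appId: 'YOUR_APP_ID',
});

const messaging = firebase.messaging();
```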
Hi @oliverdunk, thanks for your reply. I am unable to load script files from one file to another. Additionally, the Firebase Messaging CDN JS file contains live URL APIs, and Chrome is blocking the background.js file with an "unregistered" error, status code 3. My code setup includes HTML, CSS, JS, and manifest.json.
Could we allow late import() and importScripts() after installation if the script URL is considered "local"? https://fetch.spec.whatwg.org/#local-scheme

So for example, you could do a late import() or importScripts() to a data: or blob: URL.
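As an illustration, under the proposal a service worker could pull in helper code well after install, e.g. from a fetch handler (a sketch of the proposed behavior, not something that works today):

```js
// sw.js — sketch of the proposed behavior; this currently throws once the
// worker has finished installing, because new importScripts() targets are
// rejected after that point.
self.addEventListener('fetch', (event) => {
  // data: is a "local scheme" per Fetch, so under this proposal the late
  // import would be allowed.
  importScripts('data:application/javascript,self.lazyGreeting=()=>"hi";');
  event.respondWith(new Response(self.lazyGreeting()));
});
```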