
Not all layers rendering when loading ~100 layers #735

Open
m-albert opened this issue Feb 19, 2025 · 9 comments

Comments

@m-albert

First of all thanks a lot for this amazing tool.

I'm trying to load ~100-1000 layers of dimensions tcxy from OME-Zarr files into neuroglancer using a URL.

I found that loading up to 70-90 layers works perfectly. However, when I increase the number of layers beyond that, some layers are not rendered. While they do show up in the layer list, nothing is drawn for them (and for those layers that are not rendered, the 'c' doesn't appear in the tab-like listing at the top). See the attached screenshots.

Interestingly, it's not always the same layers that fail to render. I just wanted to report this here and would be very happy about any hints as to what could be happening.

PS: Is there a way to hide the tab list at the top (which is getting pretty full)?

Thanks again!

Loading 70 layers:

[screenshot]

Loading 150 layers:

[screenshot]
@m-albert m-albert changed the title Problems when loading ~100 layers Not all layers rendering when loading ~100 layers Feb 19, 2025
@m-albert
Author

m-albert commented Feb 19, 2025

Update: I just found that Chrome's JavaScript console reports several instances of `Failed to load resource: net::ERR_INSUFFICIENT_RESOURCES`.

Also, there are many instances of
`Failed to load resource: the server responded with a status of 404 (File not found)` with references to files like `...tile25.ome.zarr/2/.zattrs`. However, these 404s also occur when loading few tiles, where rendering works fine. In my OME-Zarr files, `...tile25.ome.zarr/0`, `...tile25.ome.zarr/1`, etc. are arrays that contain different image resolutions, and `.zattrs` doesn't need to be present in zarr v0.2.

@fcollman
Contributor

I'm not sure how performant it will be, but you can try 150 sources in one layer. Each source gets its own transform.

But for this application I would look at render, which will help you store tiles and transforms and then render them as one source in neuroglancer. It's really only intended for 2D tiles arranged into 3D volumes, so if you have volumes you will have to fake it, and that might involve too much reformatting.

https://github.com/saalfeldlab/render

I think OME-zarr will eventually support transforms and then neuroglancer can support viewing them and your tiles/chunks could all be sub arrays of one source.

@fcollman
Contributor

Do you have downsampling? Without it, rendering an image this large will not work.

@m-albert
Author

m-albert commented Feb 19, 2025

Hey @fcollman thanks for your suggestions!

Do you have downsampling? Without it, rendering an image this large will not work.

I do have downsampling, maybe actually too much, as I'm downsampling by a factor of 2 at each level. I found that ERR_INSUFFICIENT_RESOURCES might be related to the number of browser requests performed (see https://codereview.chromium.org/18541), and increasing the downsampling factor might reduce this.

I'm not sure how performant it will be, but you can try 150 sources in one layer. Each source gets its own transform.

This sounds like an interesting option. Would you have a hint on how to assign several sources (including transforms) to the same layer via the state JSON? Currently I'm using a separate layer for each source, each with single `url`/`transform` keys:

{'layers':
    [
        {'source':
            {
                'transform': ...,
                'url': ...,
            }
        }
    ],
    ...
}

I think OME-zarr will eventually support transforms and then neuroglancer can support viewing them and your tiles/chunks could all be sub arrays of one source.

That'd be amazing. Currently I'm already super happy that neuroglancer is reading the scale and translation transforms from the OME-Zarrs (0.4 spec) and placing the layers accordingly. I'm then passing further affine transforms using the state json.

But for this application I would look at render, which will help you store tiles and transforms and then render them as one source in neuroglancer. It's really only intended for 2D tiles arranged into 3D volumes, so if you have volumes you will have to fake it, and that might involve too much reformatting.

Thanks also for suggesting render, I'll have a look at it. I do have both 2D and 3D data though and am really liking the volume rendering option in neuroglancer as well! And the configurable no-install web app 😍

@fcollman
Contributor

In the source tab, press the plus button in the UI under the transform of your first source. When you add a second, you'll see that `source` can be a list.
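For reference, a minimal sketch of what that state might look like, assuming `source` accepts a list of source objects (the URLs and layer name are placeholders, and each entry could carry its own `transform`, elided as in the earlier example):

```javascript
// Sketch: one layer whose `source` is a list of sources instead of a
// single object. URLs below are made-up placeholders.
const state = {
  layers: [
    {
      type: "image",
      name: "tiles",
      // Each entry could also carry its own transform, e.g.
      // { url: "...", transform: {...} } (transform schema elided).
      source: [
        { url: "zarr://https://example.com/tile25.ome.zarr" },
        { url: "zarr://https://example.com/tile26.ome.zarr" },
      ],
    },
  ],
};

console.log(JSON.stringify(state, null, 2));
```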

@jbms
Collaborator

jbms commented Feb 19, 2025

I think you are right that the issue is too many concurrent downloads. Neuroglancer has an option to limit the number of concurrent downloads, with a default limit of 100, but currently this only applies to chunk downloads, not metadata requests. Possible solutions:

  1. Detect the ERR_INSUFFICIENT_RESOURCES error and retry after a random delay with backoff
  2. Explicitly limit the number of concurrent non-chunk requests as well.

(1) would be simpler, but I don't know if it is possible; in many cases fetch errors are "opaque" to JavaScript code for browser security reasons.
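Option (1) could be sketched as a generic retry wrapper with jittered exponential backoff. The function name and defaults below are made up for illustration, and since (as noted above) the browser may not let you distinguish ERR_INSUFFICIENT_RESOURCES from other network failures, this sketch simply retries on any rejection:

```javascript
// Retry an async operation with jittered exponential backoff.
// `fn` would typically be a closure over fetch(), e.g.
//   retryWithBackoff(() => fetch(metadataUrl))
async function retryWithBackoff(fn, { retries = 5, baseDelayMs = 100 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err; // give up after the last retry
      // Full jitter: random delay up to baseDelayMs * 2^attempt.
      const delay = Math.random() * baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```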

@m-albert
Author

@fcollman Amazing, thank you! While the rendering problem remains the same, loading a single layer with multiple sources can make the menus / shader controls more manageable in some cases (e.g. less space is occupied by the layer listing).

[screenshot]

@m-albert
Author

@jbms Thanks a lot!

I think you are right that the issue is too many concurrent downloads. Neuroglancer has an option to limit the number of concurrent downloads, with a default limit of 100, but currently this only applies to chunk downloads, not metadata requests.

Interesting. Indeed, my rendering problem seems to be unrelated to the concurrentDownloads setting (it stays the same with lower and higher values). Even when set to zero, image data is not loaded as expected: no more bounding boxes appear, despite fewer simultaneous chunk-related requests.

Possible solutions:

  1. Detect the ERR_INSUFFICIENT_RESOURCES error and retry after a random delay with backoff
  2. Explicitly limit the number of concurrent non-chunk requests as well.

(1) would be simpler, but I don't know if it is possible; in many cases fetch errors are "opaque" to JavaScript code for browser security reasons.

Thanks for these hints! I might look into deploying neuroglancer myself and seeing whether I can tweak the behaviour along these lines, although my JavaScript experience is limited 😅
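Option (2) from above could be sketched as a small counting semaphore wrapped around fetch, so that metadata requests cannot exhaust browser resources. Names and the limit of 32 are arbitrary choices for illustration, not anything from neuroglancer itself:

```javascript
// Tiny counting semaphore: at most `limit` holders at a time.
class Semaphore {
  constructor(limit) {
    this.limit = limit;
    this.active = 0;
    this.waiters = [];
  }
  async acquire() {
    if (this.active < this.limit) {
      this.active++;
      return;
    }
    // Wait until a releasing task hands its slot directly to us.
    await new Promise((resolve) => this.waiters.push(resolve));
  }
  release() {
    const next = this.waiters.shift();
    if (next) {
      next(); // transfer the slot without touching the counter
    } else {
      this.active--;
    }
  }
}

const requestSemaphore = new Semaphore(32); // limit chosen arbitrarily

// Wrap any request (including metadata requests) in the semaphore.
async function limitedFetch(url) {
  await requestSemaphore.acquire();
  try {
    return await fetch(url);
  } finally {
    requestSemaphore.release();
  }
}
```

The slot hand-off in `release` avoids a race where a new caller sneaks in between the counter decrement and the waiter waking up, which could otherwise push concurrency above the limit.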

@m-albert
Author

Update: I should have thought of this before: rendering all layers/sources seems to work perfectly with Firefox (instead of Chrome)!

[screenshot]


3 participants