Read Zarr Mesh Files #8682

Open: wants to merge 92 commits into base: master
Changes from 88 commits (92 commits total)
4dd9409
WIP: Read zarr agglomerate files
fm3 May 19, 2025
380bd69
zarr group path
fm3 May 19, 2025
7fb643f
test reading from zarr array
fm3 May 19, 2025
f987ebf
axisOrder: make y optional
fm3 May 19, 2025
935571a
Merge branch 'master' into agglomerates-zarr
fm3 May 22, 2025
7c1cc8b
undo attempt to make axisOrder.y optional
fm3 May 22, 2025
5af14fc
read multi array, ignoring underlying storage and axis order
fm3 May 22, 2025
f055c1e
apply agglomerate
fm3 May 27, 2025
13ff0e3
offset can be long; pass tokencontext
fm3 May 27, 2025
d853389
WIP read agglomerate skeleton
fm3 May 27, 2025
56ce08b
fix reading agglomerate skeleton
fm3 May 27, 2025
2855d2b
Change DatasetArray shape from Int to Long. Implement reading largest…
fm3 May 27, 2025
4ed5483
remove unused agglomeratesForAllSegments
fm3 May 27, 2025
7f8f662
Merge branch 'master' into agglomerates-zarr
fm3 May 27, 2025
291aab5
add shortcut for shape.product==0; implement segmentIdsForAgglomerateId
fm3 May 27, 2025
0439bce
remove unused test
fm3 May 27, 2025
11027d7
implement positionForSegmentId; agglomerateIdsForSegmentIds
fm3 May 28, 2025
0c5d647
select mapping by request
fm3 May 28, 2025
90d97cd
shortcut for single-dimension shape+offset
fm3 May 28, 2025
8551f99
handle uint32 agglomerate_to_segments arrays
fm3 May 28, 2025
d183c99
useZarr=false to test ci
fm3 May 28, 2025
cd04466
change chunkIndices back to list
fm3 May 28, 2025
9eac53d
use headOption instead of list deconstruction
fm3 May 28, 2025
0105b8e
Merge branch 'master' into agglomerates-zarr
fm3 Jun 3, 2025
fd7a281
WIP distinguish btw hdf5 and zarr according to registered layer attac…
fm3 Jun 3, 2025
19641af
pass datasource id + layer
fm3 Jun 3, 2025
716794d
list attached agglomerate files
fm3 Jun 3, 2025
feb8cb4
format
fm3 Jun 3, 2025
5c924dc
Merge branch 'master' into agglomerates-zarr
fm3 Jun 4, 2025
f040483
use agglomeratefilekey as cache key for proper cache clear support
fm3 Jun 4, 2025
e781baf
clear agglomerate caches on layer/ds reload
fm3 Jun 4, 2025
419490f
avoid injection
fm3 Jun 4, 2025
9c78167
prioritize WebknossosZarrExplorer
fm3 Jun 5, 2025
fdd7b4a
cleanup
fm3 Jun 5, 2025
09dfc0c
changelog
fm3 Jun 5, 2025
10dc7c3
make dummy datasource id more explicit
fm3 Jun 10, 2025
73b2f80
WIP Read Zarr Meshfiles
fm3 Jun 10, 2025
ca30ebe
read metadata from zarr group header
fm3 Jun 10, 2025
29073e8
wip read neuroglancer segment manifests
fm3 Jun 10, 2025
41de8c8
enrich
fm3 Jun 10, 2025
adb7d07
find local offset in bucket
fm3 Jun 12, 2025
e63fb2a
sort meshfile services, lookup with MeshFileKey
fm3 Jun 12, 2025
fd8dc30
move more code
fm3 Jun 12, 2025
0f9e5c8
Merge branch 'master' into meshfile-zarr
fm3 Jun 16, 2025
00e775c
iterate on meshfile services
fm3 Jun 16, 2025
7d51512
adapt frontend to simplified protocol
fm3 Jun 16, 2025
5d1b768
keys
fm3 Jun 16, 2025
ca64481
Merge branch 'master' into agglomerates-zarr
fm3 Jun 16, 2025
8cf2853
Merge branch 'agglomerates-zarr' into meshfile-zarr
fm3 Jun 16, 2025
16b38d6
explore + list meshfiles
fm3 Jun 17, 2025
7c94d31
fix frontend type
fm3 Jun 17, 2025
93d560c
adapt schema to neuroglancerPrecomputed dataformat for attachments
fm3 Jun 17, 2025
74a0bc3
Merge branch 'master' into agglomerates-zarr
fm3 Jun 17, 2025
12a6423
Merge branch 'agglomerates-zarr' into meshfile-zarr
fm3 Jun 17, 2025
1d492d1
adapt to new json format
fm3 Jun 17, 2025
9af3004
clear caches
fm3 Jun 17, 2025
331d178
some cleanup
fm3 Jun 17, 2025
43d9051
in list request, only return successes
fm3 Jun 17, 2025
d6a9817
add migration to guide
fm3 Jun 17, 2025
73a7ac2
unify spelling meshFile
fm3 Jun 17, 2025
16ae134
fix class injection
fm3 Jun 17, 2025
8077858
Adapt full mesh service; introduce credentials for attachments
fm3 Jun 18, 2025
7d676bf
fix updating job status for jobs with no credit transactions
fm3 Jun 18, 2025
76cd9d6
fix adhocMag selection in create animation modal
fm3 Jun 18, 2025
09567eb
make typechecker happy
fm3 Jun 18, 2025
f79b171
Merge branch 'master' into agglomerates-zarr
fm3 Jun 18, 2025
e170ebe
pr feedback; add services as singletons for proper cache use
fm3 Jun 18, 2025
b76d473
Merge branch 'agglomerates-zarr' into meshfile-zarr
fm3 Jun 18, 2025
1360184
address coderabbit review suggestions
fm3 Jun 18, 2025
72d5c5b
typo
fm3 Jun 18, 2025
0c961eb
Fix dtype bug, remove singleton instantiations again
fm3 Jun 19, 2025
365ec5d
rename lookup function as suggested in pr review
fm3 Jun 19, 2025
be77185
add ucar dependency resolver
fm3 Jun 19, 2025
5920905
remove sciJava resolver
fm3 Jun 19, 2025
753c25b
Revert "remove sciJava resolver"
fm3 Jun 19, 2025
11d2611
Revert "add ucar dependency resolver"
fm3 Jun 19, 2025
a9aed9e
Merge branch 'master' into agglomerates-zarr
fm3 Jun 19, 2025
bb07185
Merge branch 'master' into agglomerates-zarr
fm3 Jun 19, 2025
d7a667a
Merge branch 'master' into agglomerates-zarr
fm3 Jun 23, 2025
2c53ae6
Merge branch 'agglomerates-zarr' into meshfile-zarr
fm3 Jun 23, 2025
cdd3075
Merge branch 'master' into meshfile-zarr
fm3 Jun 23, 2025
0fddb3d
unify function names
fm3 Jun 23, 2025
6f8ed21
Merge branch 'master' into meshfile-zarr
fm3 Jun 23, 2025
0f10892
Merge branch 'master' into meshfile-zarr
fm3 Jun 25, 2025
0eb73d2
implement pr feedback
fm3 Jun 25, 2025
9217eb5
unused import
fm3 Jun 25, 2025
537a824
Merge branch 'master' into meshfile-zarr
fm3 Jun 25, 2025
fd932e6
same cleanup also when looking up agglomerates
fm3 Jun 25, 2025
0b98d57
Merge branch 'master' into meshfile-zarr
fm3 Jun 25, 2025
56f8385
Update webknossos-datastore/app/com/scalableminds/webknossos/datastor…
fm3 Jun 25, 2025
c3dc703
Merge branch 'master' into meshfile-zarr
fm3 Jun 26, 2025
75fbccf
Merge branch 'master' into meshfile-zarr
MichaelBuessemeyer Jun 26, 2025
8 changes: 5 additions & 3 deletions app/controllers/WKRemoteWorkerController.scala
@@ -79,16 +79,18 @@ class WKRemoteWorkerController @Inject()(jobDAO: JobDAO,
for {
_ <- workerDAO.findOneByKey(key) ?~> "job.worker.notFound"
jobBeforeChange <- jobDAO.findOne(id)(GlobalAccessContext)
_ <- jobDAO.updateStatus(id, request.body)
_ <- jobDAO.updateStatus(id, request.body) ?~> "job.updateStatus.failed"
jobAfterChange <- jobDAO.findOne(id)(GlobalAccessContext) ?~> "job.notFound"
_ = jobService.trackStatusChange(jobBeforeChange, jobAfterChange)
_ <- jobService.cleanUpIfFailed(jobAfterChange) ?~> "job.cleanup.failed"
_ <- Fox.runIf(request.body.state == JobState.SUCCESS) {
creditTransactionService.completeTransactionOfJob(jobAfterChange._id)(GlobalAccessContext)
creditTransactionService
.completeTransactionOfJob(jobAfterChange._id)(GlobalAccessContext) ?~> "job.creditTransaction.failed"
}
_ <- Fox.runIf(
jobAfterChange.state != request.body.state && (request.body.state == JobState.FAILURE || request.body.state == JobState.CANCELLED)) {
creditTransactionService.refundTransactionForJob(jobAfterChange._id)(GlobalAccessContext)
creditTransactionService
.refundTransactionForJob(jobAfterChange._id)(GlobalAccessContext) ?~> "job.creditTransaction.refund.failed"
}
} yield Ok
}
27 changes: 21 additions & 6 deletions app/models/organization/CreditTransactionService.scala
@@ -2,7 +2,7 @@ package models.organization

import com.scalableminds.util.accesscontext.DBAccessContext
import com.scalableminds.util.objectid.ObjectId
import com.scalableminds.util.tools.{Fox, FoxImplicits}
import com.scalableminds.util.tools.{Empty, Failure, Fox, FoxImplicits, Full}
import com.typesafe.scalalogging.LazyLogging
import play.api.libs.json.{JsObject, Json}

@@ -42,15 +42,30 @@ class CreditTransactionService @Inject()(creditTransactionDAO: CreditTransaction

def completeTransactionOfJob(jobId: ObjectId)(implicit ctx: DBAccessContext): Fox[Unit] =
for {
transaction <- creditTransactionDAO.findTransactionForJob(jobId)
_ <- organizationService.assertOrganizationHasPaidPlan(transaction._organization)
_ <- creditTransactionDAO.commitTransaction(transaction._id)
transactionBox <- creditTransactionDAO.findTransactionForJob(jobId).shiftBox
_ <- transactionBox match {
case Full(transaction) =>
for {
_ <- organizationService.assertOrganizationHasPaidPlan(transaction._organization)
_ <- creditTransactionDAO.commitTransaction(transaction._id)
} yield ()
case Empty => Fox.successful(()) // Assume transaction-less Job
case f: Failure => f.toFox
}

} yield ()

def refundTransactionForJob(jobId: ObjectId)(implicit ctx: DBAccessContext): Fox[Unit] =
for {
transaction <- creditTransactionDAO.findTransactionForJob(jobId)
_ <- refundTransaction(transaction)
transactionBox <- creditTransactionDAO.findTransactionForJob(jobId).shiftBox
_ <- transactionBox match {
case Full(transaction) =>
for {
_ <- refundTransaction(transaction)
} yield ()
case Empty => Fox.successful(()) // Assume transaction-less Job
case f: Failure => f.toFox
}
} yield ()

private def refundTransaction(creditTransaction: CreditTransaction)(implicit ctx: DBAccessContext): Fox[Unit] =
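The diff above changes the credit-transaction lookup so that a job without any credit transaction no longer fails its status update: a `Full` box commits or refunds, an `Empty` box is treated as success, and only a real `Failure` propagates. A minimal sketch of that three-way handling, with hypothetical TypeScript names standing in for the Scala `Box`/`Fox` types:

```typescript
// Sketch (illustrative names): mirrors the Full/Empty/Failure handling above,
// where a missing credit transaction (Empty) is treated as success so that
// transaction-less jobs no longer fail their status update.
type Box<T> =
  | { kind: "full"; value: T }
  | { kind: "empty" }
  | { kind: "failure"; message: string };

interface CreditTransaction {
  id: string;
  organizationId: string;
}

function completeTransactionOfJob(
  lookup: Box<CreditTransaction>,
  commit: (transactionId: string) => void,
): void {
  switch (lookup.kind) {
    case "full":
      // A transaction exists: commit it.
      commit(lookup.value.id);
      break;
    case "empty":
      // No transaction found: assume a transaction-less job and succeed.
      break;
    case "failure":
      // A real lookup error still propagates.
      throw new Error(lookup.message);
  }
}
```

The same shape applies to the refund path; only the action taken in the `full` branch differs.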
9 changes: 9 additions & 0 deletions conf/evolutions/135-neuroglancer-attachment.sql
@@ -0,0 +1,9 @@
START TRANSACTION;

do $$ begin ASSERT (select schemaVersion from webknossos.releaseInformation) = 134, 'Previous schema version mismatch'; end; $$ LANGUAGE plpgsql;

ALTER TYPE webknossos.LAYER_ATTACHMENT_DATAFORMAT ADD VALUE 'neuroglancerPrecomputed';

UPDATE webknossos.releaseInformation SET schemaVersion = 135;

COMMIT TRANSACTION;
1 change: 1 addition & 0 deletions conf/evolutions/reversions/135-neuroglancer-attachment.sql
@@ -0,0 +1 @@
-- Removing enum types directly is not possible so no reversion is available for this.
10 changes: 8 additions & 2 deletions conf/messages
@@ -265,6 +265,10 @@ mesh.file.listChunks.failed=Failed to load chunk list for segment {0} from mesh
mesh.file.loadChunk.failed=Failed to load mesh chunk for segment
mesh.file.open.failed=Failed to open mesh file for reading
mesh.file.readEncoding.failed=Failed to read encoding from mesh file
mesh.file.lookup.failed=Failed to look up mesh file “{0}”
mesh.file.readVersion.failed=Failed to read format version from file “{0}”
mesh.file.readMappingName.failed=Failed to read mapping name from mesh file “{0}”
mesh.meshFileName.required=Trying to load mesh from mesh file, but mesh file name was not supplied.

task.create.noTasks=Zero tasks were requested
task.create.failed=Failed to create Task
@@ -347,7 +351,9 @@ job.trainModel.notAllowed.organization = Training AI models is only allowed for
job.runInference.notAllowed.organization = Running inference is only allowed for datasets of your own organization.
job.paidJob.notAllowed.noPaidPlan = You are not allowed to run this job because your organization does not have a paid plan.
job.notEnoughCredits = Your organization does not have enough WEBKNOSSOS credits to run this job.
creditTransaction.notPaidPlan = Your organization does not have a paid plan.
job.updateStatus.failed = Failed to update long-running job’s status
job.creditTransaction.failed = Failed to perform credit transaction
job.creditTransaction.refund.failed = Failed to perform credit transaction refund

voxelytics.disabled = Voxelytics workflow reporting and logging are not enabled for this WEBKNOSSOS instance.
voxelytics.runNotFound = Workflow runs not found
@@ -368,10 +374,10 @@ folder.notFound=Could not find the requested folder
folder.delete.root=Cannot delete the organization’s root folder
folder.move.root=Cannot move the organization’s root folder
folder.update.notAllowed=No write access on this folder
folder.noWriteAccess=No write access in this folder
folder.update.name.failed=Failed to update the folder’s name
folder.update.teams.failed=Failed to update the folder’s allowed teams
folder.create.failed.teams.failed=Failed to create folder in this location
folder.noWriteAccess=No write access in this folder
folder.nameMustNotContainSlash=Folder names cannot contain forward slashes

segmentAnything.notEnabled=AI based quick select is not enabled for this WEBKNOSSOS instance.
6 changes: 3 additions & 3 deletions frontend/javascripts/admin/api/mesh.ts
@@ -22,7 +22,7 @@ type MeshSegmentInfo = {
};

type ListMeshChunksRequest = {
meshFile: APIMeshFileInfo;
meshFileName: string;
segmentId: number;
};

@@ -52,7 +52,7 @@ export function getMeshfileChunksForSegment(
params.append("editableMappingTracingId", editableMappingTracingId);
}
const payload: ListMeshChunksRequest = {
meshFile,
meshFileName: meshFile.name,
segmentId,
};
return Request.sendJSONReceiveJSON(
@@ -72,7 +72,7 @@ type MeshChunkDataRequest = {
};

type MeshChunkDataRequestList = {
meshFile: APIMeshFileInfo;
meshFileName: string;
requests: MeshChunkDataRequest[];
};

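The `mesh.ts` diff simplifies the request protocol: instead of serializing the full `APIMeshFileInfo` object, the client now sends only `meshFileName`, and the server resolves path and data format itself. A small sketch of the new payload construction (the type is reduced to the fields this example needs):

```typescript
// Sketch of the simplified request payload from the diff above. The backend
// now receives only the mesh file name; path and file type are resolved
// server-side. APIMeshFileInfo is trimmed to the fields used here.
type APIMeshFileInfo = {
  name: string;
  mappingName?: string | null;
  formatVersion: number;
};

type ListMeshChunksRequest = {
  meshFileName: string;
  segmentId: number;
};

function buildListMeshChunksRequest(
  meshFile: APIMeshFileInfo,
  segmentId: number,
): ListMeshChunksRequest {
  // Only the name crosses the wire; the full info object stays on the client.
  return { meshFileName: meshFile.name, segmentId };
}
```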
8 changes: 3 additions & 5 deletions frontend/javascripts/types/api_types.ts
@@ -1005,12 +1005,10 @@ export type ServerEditableMapping = {

export type APIMeshFileInfo = {
name: string;
path: string | null | undefined;
fileType: string | null | undefined;
mappingName?: string | null | undefined;
// 0 - is the first mesh file version
// 1-2 - the format should behave as v0 (refer to voxelytics for actual differences)
// 3 - is the newer version with draco encoding.
// 0 - unsupported (is the first mesh file version)
// 1-2 - unsupported (the format should behave as v0; refer to voxelytics for actual differences)
// 3+ - is the newer version with draco encoding.
formatVersion: number;
};
export type APIConnectomeFile = {
Expand Down
@@ -365,7 +365,7 @@ function* loadPrecomputedMeshesInChunksForLod(
dataset,
getBaseSegmentationName(segmentationLayer),
{
meshFile,
meshFileName: meshFile.name,
// Only extract the relevant properties
requests: chunks.map(({ byteOffset, byteSize }) => ({
byteOffset,
@@ -258,7 +258,7 @@ function CreateAnimationModal(props: Props) {
const adhocMagIndex = getMagInfo(layer.resolutions).getClosestExistingIndex(
preferredQualityForMeshAdHocComputation,
);
const adhocMag = layer.resolutions[adhocMagIndex];
const adhocMag = getMagInfo(layer.resolutions).getMagByIndexOrThrow(adhocMagIndex);

return Object.values(meshInfos)
.filter((meshInfo: MeshInformation) => meshInfo.isVisible)
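The `adhocMag` fix replaces direct array indexing with a lookup by mag index: the closest *existing* index names a mag level, not a position in `layer.resolutions`, so direct indexing returns `undefined` whenever the mag list is non-contiguous. An illustrative sketch under that assumption (class and method names mirror the diff but are simplified stand-ins for the real `MagInfo`):

```typescript
// Illustrative sketch of the bug fixed above: a mag index refers to the mag
// level (log2 of the mag), not an array position, so resolutions[index]
// breaks for sparse mag lists. These names are simplified assumptions.
type Vector3 = [number, number, number];

class MagInfo {
  // Map from mag index (log2 of the largest component) to the mag vector.
  private readonly byIndex = new Map<number, Vector3>();

  constructor(mags: Vector3[]) {
    for (const mag of mags) this.byIndex.set(Math.log2(Math.max(...mag)), mag);
  }

  getClosestExistingIndex(index: number): number {
    let best = -1;
    for (const i of this.byIndex.keys()) {
      if (best === -1 || Math.abs(i - index) < Math.abs(best - index)) best = i;
    }
    return best;
  }

  getMagByIndexOrThrow(index: number): Vector3 {
    const mag = this.byIndex.get(index);
    if (mag == null) throw new Error(`No mag at index ${index}`);
    return mag;
  }
}
```

With mags `[[1,1,1], [4,4,4]]` the closest existing index to 3 is 2, where `resolutions[2]` would be `undefined` while `getMagByIndexOrThrow(2)` correctly yields `[4,4,4]`.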
4 changes: 2 additions & 2 deletions tools/postgres/schema.sql
@@ -21,7 +21,7 @@ CREATE TABLE webknossos.releaseInformation (
schemaVersion BIGINT NOT NULL
);

INSERT INTO webknossos.releaseInformation(schemaVersion) values(134);
INSERT INTO webknossos.releaseInformation(schemaVersion) values(135);
COMMIT TRANSACTION;


@@ -163,7 +163,7 @@ CREATE TABLE webknossos.dataset_layer_additionalAxes(
);

CREATE TYPE webknossos.LAYER_ATTACHMENT_TYPE AS ENUM ('agglomerate', 'connectome', 'segmentIndex', 'mesh', 'cumsum');
CREATE TYPE webknossos.LAYER_ATTACHMENT_DATAFORMAT AS ENUM ('hdf5', 'zarr3', 'json');
CREATE TYPE webknossos.LAYER_ATTACHMENT_DATAFORMAT AS ENUM ('hdf5', 'zarr3', 'json', 'neuroglancerPrecomputed');
CREATE TABLE webknossos.dataset_layer_attachments(
_dataset TEXT CONSTRAINT _dataset_objectId CHECK (_dataset ~ '^[0-9a-f]{24}$') NOT NULL,
layerName TEXT NOT NULL,
5 changes: 5 additions & 0 deletions unreleased_changes/8682.md
@@ -0,0 +1,5 @@
### Added
- Precomputed Meshes can now also be read from the new zarr3-based format, and from remote object storage.

### Postgres Evolutions
- [135-neuroglancer-attachment.sql](conf/evolutions/135-neuroglancer-attachment.sql)
@@ -6,8 +6,10 @@ import com.google.inject.name.Names
import com.scalableminds.webknossos.datastore.services._
import com.scalableminds.webknossos.datastore.services.mesh.{
AdHocMeshServiceHolder,
Hdf5MeshFileService,
MeshFileService,
NeuroglancerPrecomputedMeshFileService
NeuroglancerPrecomputedMeshFileService,
ZarrMeshFileService
}
import com.scalableminds.webknossos.datastore.services.uploading.UploadService
import com.scalableminds.webknossos.datastore.storage.{DataVaultService, RemoteSourceDescriptorService}
@@ -31,8 +33,14 @@ class DataStoreModule extends AbstractModule {
bind(classOf[ApplicationHealthService]).asEagerSingleton()
bind(classOf[DSDatasetErrorLoggingService]).asEagerSingleton()
bind(classOf[MeshFileService]).asEagerSingleton()
bind(classOf[ZarrMeshFileService]).asEagerSingleton()
bind(classOf[Hdf5MeshFileService]).asEagerSingleton()
bind(classOf[AgglomerateService]).asEagerSingleton()
bind(classOf[ZarrAgglomerateService]).asEagerSingleton()
bind(classOf[Hdf5AgglomerateService]).asEagerSingleton()
bind(classOf[NeuroglancerPrecomputedMeshFileService]).asEagerSingleton()
bind(classOf[RemoteSourceDescriptorService]).asEagerSingleton()
bind(classOf[ChunkCacheService]).asEagerSingleton()
bind(classOf[DatasetCache]).asEagerSingleton()
}
}
@@ -1,7 +1,6 @@
package com.scalableminds.webknossos.datastore.controllers

import com.google.inject.Inject
import com.scalableminds.util.tools.Fox
import com.scalableminds.webknossos.datastore.models.datasource.DataSourceId
import com.scalableminds.webknossos.datastore.services._
import com.scalableminds.webknossos.datastore.services.mesh.{
@@ -10,8 +9,7 @@ import com.scalableminds.webknossos.datastore.services.mesh.{
ListMeshChunksRequest,
MeshChunkDataRequestList,
MeshFileService,
MeshMappingHelper,
NeuroglancerPrecomputedMeshFileService
MeshMappingHelper
}
import play.api.libs.json.Json
import play.api.mvc.{Action, AnyContent, PlayBodyParsers}
@@ -21,7 +19,6 @@ import scala.concurrent.ExecutionContext
class DSMeshController @Inject()(
accessTokenService: DataStoreAccessTokenService,
meshFileService: MeshFileService,
neuroglancerPrecomputedMeshService: NeuroglancerPrecomputedMeshFileService,
fullMeshService: DSFullMeshService,
dataSourceRepository: DataSourceRepository,
val dsRemoteWebknossosClient: DSRemoteWebknossosClient,
@@ -38,22 +35,21 @@
accessTokenService.validateAccessFromTokenContext(
UserAccessRequest.readDataSources(DataSourceId(datasetDirectoryName, organizationId))) {
for {
meshFiles <- meshFileService.exploreMeshFiles(organizationId, datasetDirectoryName, dataLayerName)
neuroglancerMeshFiles <- neuroglancerPrecomputedMeshService.exploreMeshFiles(organizationId,
datasetDirectoryName,
dataLayerName)
allMeshFiles = meshFiles ++ neuroglancerMeshFiles
} yield Ok(Json.toJson(allMeshFiles))
(dataSource, dataLayer) <- dataSourceRepository.getDataSourceAndDataLayer(organizationId,
datasetDirectoryName,
dataLayerName)
meshFileInfos <- meshFileService.listMeshFiles(dataSource.id, dataLayer)
} yield Ok(Json.toJson(meshFileInfos))
}
}

def listMeshChunksForSegment(organizationId: String,
datasetDirectoryName: String,
dataLayerName: String,
/* If targetMappingName is set, assume that meshfile contains meshes for
/* If targetMappingName is set, assume that meshFile contains meshes for
the oversegmentation. Collect mesh chunks of all *unmapped* segment ids
belonging to the supplied agglomerate id.
If it is not set, use meshfile as is, assume passed id is present in meshfile
If it is not set, use meshFile as is, assume passed id is present in meshFile
Note: in case of an editable mapping, targetMappingName is its baseMapping name.
*/
targetMappingName: Option[String],
@@ -62,14 +58,11 @@
accessTokenService.validateAccessFromTokenContext(
UserAccessRequest.readDataSources(DataSourceId(datasetDirectoryName, organizationId))) {
for {
_ <- Fox.successful(())
mappingNameForMeshFile = meshFileService.mappingNameForMeshFile(organizationId,
datasetDirectoryName,
dataLayerName,
request.body.meshFile.name)
(dataSource, dataLayer) <- dataSourceRepository.getDataSourceAndDataLayer(organizationId,
datasetDirectoryName,
dataLayerName)
meshFileKey <- meshFileService.lookUpMeshFileKey(dataSource.id, dataLayer, request.body.meshFileName)
mappingNameForMeshFile <- meshFileService.mappingNameForMeshFile(meshFileKey)
segmentIds: Seq[Long] <- segmentIdsForAgglomerateIdIfNeeded(
dataSource.id,
dataLayer,
@@ -79,15 +72,7 @@
mappingNameForMeshFile,
omitMissing = false
)
chunkInfos <- if (request.body.meshFile.isNeuroglancerPrecomputed) {
neuroglancerPrecomputedMeshService.listMeshChunksForMultipleSegments(request.body.meshFile.path, segmentIds)
} else {
meshFileService.listMeshChunksForSegmentsMerged(organizationId,
datasetDirectoryName,
dataLayerName,
request.body.meshFile.name,
segmentIds)
}
chunkInfos <- meshFileService.listMeshChunksForSegmentsMerged(meshFileKey, segmentIds)
} yield Ok(Json.toJson(chunkInfos))
}
}
@@ -99,13 +84,11 @@
accessTokenService.validateAccessFromTokenContext(
UserAccessRequest.readDataSources(DataSourceId(datasetDirectoryName, organizationId))) {
for {
(data, encoding) <- if (request.body.meshFile.isNeuroglancerPrecomputed) {
neuroglancerPrecomputedMeshService.readMeshChunk(request.body.meshFile.path, request.body.requests)
} else {
meshFileService
.readMeshChunk(organizationId, datasetDirectoryName, dataLayerName, request.body)
.toFox ?~> "mesh.file.loadChunk.failed"
}
(dataSource, dataLayer) <- dataSourceRepository.getDataSourceAndDataLayer(organizationId,
datasetDirectoryName,
dataLayerName)
meshFileKey <- meshFileService.lookUpMeshFileKey(dataSource.id, dataLayer, request.body.meshFileName)
(data, encoding) <- meshFileService.readMeshChunk(meshFileKey, request.body.requests) ?~> "mesh.file.loadChunk.failed"
} yield {
if (encoding.contains("gzip")) {
Ok(data).withHeaders("Content-Encoding" -> "gzip")
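The controller refactor above follows one pattern throughout: resolve a `MeshFileKey` once per `(dataSource, dataLayer, meshFileName)`, then let the mesh file service dispatch to the matching backend, instead of branching on `isNeuroglancerPrecomputed` at every call site. A hypothetical TypeScript sketch of that lookup-then-dispatch flow (all names are illustrative; the real services live in Scala):

```typescript
// Hypothetical sketch of the MeshFileKey lookup-then-dispatch flow.
// The key carries the data format, so every endpoint can route to the
// right backend (hdf5, zarr3, neuroglancerPrecomputed) without ad-hoc flags.
type MeshFileDataFormat = "hdf5" | "zarr3" | "neuroglancerPrecomputed";

interface MeshFileKey {
  dataSourceId: string;
  layerName: string;
  meshFileName: string;
  dataFormat: MeshFileDataFormat;
}

function lookUpMeshFileKey(
  dataSourceId: string,
  layerName: string,
  meshFileName: string,
  registeredAttachments: Record<string, MeshFileDataFormat>,
): MeshFileKey {
  const dataFormat = registeredAttachments[meshFileName];
  // Unknown mesh files fail the lookup, matching mesh.file.lookup.failed.
  if (dataFormat == null) throw new Error(`mesh.file.lookup.failed: ${meshFileName}`);
  return { dataSourceId, layerName, meshFileName, dataFormat };
}

function listMeshChunks(key: MeshFileKey): string {
  // Dispatch to the matching backend service by data format.
  switch (key.dataFormat) {
    case "hdf5":
      return "hdf5Service";
    case "zarr3":
      return "zarrService";
    case "neuroglancerPrecomputed":
      return "neuroglancerService";
  }
}
```

Because the key also identifies dataset and layer, it doubles as a cache key that can be cleared on layer or datasource reload, which is what the `clearCache` changes rely on.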
@@ -266,7 +266,7 @@ class DataSourceController @Inject()(
(dataSource, dataLayer) <- dataSourceRepository.getDataSourceAndDataLayer(organizationId,
datasetDirectoryName,
dataLayerName)
agglomerateList = agglomerateService.listAgglomerates(dataSource.id, dataLayer)
agglomerateList = agglomerateService.listAgglomeratesFiles(dataSource.id, dataLayer)
} yield Ok(Json.toJson(agglomerateList))
}
}
@@ -450,7 +450,8 @@ layerName: Option[String]): InboxDataSource = {
layerName: Option[String]): InboxDataSource = {
val (closedAgglomerateFileHandleCount, clearedBucketProviderCount, removedChunksCount) =
binaryDataServiceHolder.binaryDataService.clearCache(organizationId, datasetDirectoryName, layerName)
val closedMeshFileHandleCount = meshFileService.clearCache(organizationId, datasetDirectoryName, layerName)
val closedMeshFileHandleCount =
meshFileService.clearCache(DataSourceId(organizationId, datasetDirectoryName), layerName)
val reloadedDataSource: InboxDataSource = dataSourceService.dataSourceFromDir(
dataSourceService.dataBaseDir.resolve(organizationId).resolve(datasetDirectoryName),
organizationId)