I am building a 3D model viewer in CesiumJS. The GeoJSON is requested from a URL, and this works fine for small models. For complex multipolygon files, where the JSON returned by the URL request exceeds 16 MB, Cesium takes a long time parsing and rendering, and if I then make a call to load one more model the web browser crashes with an out-of-memory error. I checked Chrome alongside, and it eats around 900 MB of memory for a 16 MB GeoJSON URL. Is there a better way of managing memory, or any tips on how to tackle this problem? One solution is 3D mesh simplification before sending the GeoJSON model, but here I am asking about memory management on the client side.
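For context, this is roughly how I load and switch models (a trimmed-down sketch; the container id, URL handling, and clampToGround option are placeholders, the real app does more):

    // Placeholder viewer; the real application has more configuration.
    const viewer = new Cesium.Viewer("cesiumContainer");

    let currentDataSource = null;

    async function showModel(geojsonUrl) {
      // Drop the previous model first so its entities can be garbage
      // collected; otherwise each load adds to the memory footprint.
      if (currentDataSource) {
        viewer.dataSources.remove(currentDataSource, true); // true = destroy
        currentDataSource = null;
      }

      // GeoJsonDataSource.load fetches and parses the whole file in one go.
      currentDataSource = await Cesium.GeoJsonDataSource.load(geojsonUrl, {
        clampToGround: true, // assumption: models sit on the terrain
      });
      viewer.dataSources.add(currentDataSource);
      viewer.zoomTo(currentDataSource);
    }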
Keep in mind that part of that time the browser is running JSON.parse on a 16 megabyte file. That is a synchronous operation in browsers, and there's nothing you can do about it until browsers add asynchronous JSON parsing. The first thing I recommend is running the GeoJSON through TopoJSON on the server, which should provide significant size savings. That decreases both transmission time and JSON.parse time. Also make sure the server is compressing the data, which adds further savings. I have seen 16 MB files go down to under a meg with these techniques.
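As a rough sketch of what that could look like, assuming Node with the topojson-server and topojson-client packages (the object name "model" and the quantization value are placeholders to tune for your data):

    // --- Server (Node): convert GeoJSON to TopoJSON before sending ---
    const { topology } = require("topojson-server");

    function toTopoJson(geojsonFeatureCollection) {
      // Quantizing coordinates is where most of the size savings come from;
      // 1e5 is a common starting value, adjust for your precision needs.
      return topology({ model: geojsonFeatureCollection }, 1e5);
    }

    // Also enable gzip on the HTTP response, e.g. with Express:
    // const compression = require("compression");
    // app.use(compression());

    // --- Client: convert back to GeoJSON before handing it to Cesium ---
    const { feature } = require("topojson-client");

    async function loadTopoJson(url, viewer) {
      const topo = await (await fetch(url)).json();
      const geojson = feature(topo, topo.objects.model);
      const dataSource = await Cesium.GeoJsonDataSource.load(geojson);
      viewer.dataSources.add(dataSource);
    }

If I recall correctly, GeoJsonDataSource can also consume TopoJSON directly, in which case the client-side conversion step above isn't needed.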
That being said, that's only the first part of the problem. The real question is not how big the GeoJSON is but how many features it has. If it's a 16 MB file with ~15,000 features, the above suggestion will solve your problem. If it has tens of thousands of features, you are going to run into problems that are not so easily solved. I provided more details on our mailing list yesterday: https://groups.google.com/d/msg/cesium-dev/f6iky9aeg1i/8drkhlbnli4j
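To get a rough idea of which regime you are in, you can check how many entities a loaded data source produced (a quick diagnostic sketch, not a fix):

    const dataSource = await Cesium.GeoJsonDataSource.load(geojsonUrl);
    viewer.dataSources.add(dataSource);

    // Each GeoJSON feature becomes one or more entities; if this number is in
    // the tens of thousands, file size alone is not the bottleneck.
    console.log("entity count:", dataSource.entities.values.length);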
We're always looking for ways to optimize and further improve Cesium. Feel free to reach out on our mailing list and provide some of your large sample data if you can, so we can use your use case when we optimize things in the future.