Deep storage migration

If you have been running an evaluation Druid cluster using local deep storage and wish to migrate to a more production-capable deep storage system such as S3 or HDFS, this document describes the necessary steps.

Migration of deep storage involves the following steps at a high level:

- Copying segments from local deep storage to the new deep storage
- Exporting Druid's segments table from metadata
- Rewriting the load specs in the exported segment data to reflect the new deep storage location
- Reimporting the edited segments into metadata

Shut down running services

To ensure a clean migration, shut down the non-coordinator services so that metadata state will not change while you perform the migration. When migrating from Derby, the coordinator processes will still need to be up initially, as they host the Derby database.

Export segments with rewritten load specs

Druid provides an Export Metadata Tool for exporting metadata from Derby into CSV files. By setting the deep storage migration options, the export-metadata tool will export CSV files in which the segment load specs have been rewritten to load from your new deep storage location. Run the export-metadata tool on your existing cluster, using the migration options appropriate for your new deep storage location, and save the CSV files it generates. After a successful export, you can shut down the coordinator.

Import metadata

After generating the CSV exports with the modified segment data, you can reimport the contents of the Druid segments table from the generated CSVs. Please refer to the import commands for examples.
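A hedged sketch of the export step, run from the Druid installation directory on the host serving Derby. The connect URI, output directory, and bucket/key names are examples only and must be replaced with your own values; the S3 migration options shown assume an S3 destination.

```shell
# Sketch: export the segments table with load specs rewritten for S3.
# All paths and names below are illustrative.
java -classpath "lib/*" \
  -Ddruid.extensions.directory="extensions" \
  -Ddruid.extensions.loadList='[]' \
  org.apache.druid.cli.Main tools export-metadata \
  --connectURI "jdbc:derby://localhost:1527/var/druid/metadata.db;" \
  -o /tmp/csv \
  --s3bucket=my-druid-bucket \
  --s3baseKey=migrated-segments
```

The tool writes CSV files (including the segments table) under the output directory, with each segment's load spec pointing at the new deep storage location.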
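How the reimport looks depends on which metadata store you are migrating to. As one hedged example, assuming a MySQL metadata store, a bulk load of the exported segments CSV might look like the following; the database name, credentials, CSV path, and column list are illustrative and must match what your export actually produced.

```shell
# Sketch: reimport the exported segments CSV into a MySQL metadata store.
# Database, user, path, and columns are examples only.
mysql -u druid -p druid <<'SQL'
LOAD DATA LOCAL INFILE '/tmp/csv/druid_segments.csv'
INTO TABLE druid_segments
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(id, dataSource, created_date, start, end, partitioned, version, used, payload);
SQL
```

After the import completes, you can bring the cluster back up pointed at the new deep storage and verify that segments load from the new location.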
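The first step, copying segments out of local deep storage, can be sketched with ordinary filesystem tools. The paths below are hypothetical stand-ins created for illustration; for a real migration to S3 you would use `aws s3 sync` (or `hadoop fs -put` for HDFS) against your actual segment directory, preserving the directory layout.

```shell
# Hypothetical local segment store and new deep storage root (illustrative paths).
SEGMENT_SRC=/tmp/druid-migration-demo/var/druid/segments
NEW_DEEP_STORAGE=/tmp/druid-migration-demo/new-deep-storage

# Stand-in for an existing segment file in local deep storage.
mkdir -p "$SEGMENT_SRC/wikipedia/2024-01-01"
touch "$SEGMENT_SRC/wikipedia/2024-01-01/index.zip"

# Mirror the tree into the new location, keeping the directory structure intact.
# For S3, the equivalent would be: aws s3 sync "$SEGMENT_SRC" s3://your-bucket/segments/
mkdir -p "$NEW_DEEP_STORAGE"
cp -r "$SEGMENT_SRC/." "$NEW_DEEP_STORAGE/"
```

The key point is that the relative layout under the deep storage root is preserved, so the rewritten load specs can point at the same per-segment paths in the new location.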
Author: Kevin