.. _accumulo_tools:

Command Line Tools
==================

This chapter describes GeoMesa Tools, a set of command line tools for feature management, query planning and
explanation, ingest, and export.

Installing SFT and Converter Definitions
----------------------------------------

Starting with version 1.2.3, GeoMesa Tools ships with embedded SimpleFeatureType and GeoMesa Converter
definitions for common data types including Twitter, GeoNames, T-drive, and many more. See
:ref:`installing_sft_and_converter_definitions`. To see which SFT and converter configurations are
installed, try the ``geomesa env`` command described below.

Running the command line tools
------------------------------

Run ``geomesa`` without any arguments to produce the following usage text::

    $ geomesa
    Usage: geomesa [command] [command options]
      Commands:
        add-attribute-index    Run a Hadoop map reduce job to add an index for attributes
        add-index              Add or update indices for an existing GeoMesa feature type
        classpath              Display the GeoMesa classpath
        config-table           Perform table configuration operations
        configure              Configure the local environment for GeoMesa
        configure-age-off      List/set/remove age-off for a GeoMesa feature type
        convert                Convert files using GeoMesa's internal SFT converter framework
        create-schema          Create a GeoMesa feature type
        delete-catalog         Delete a GeoMesa catalog completely (and all features in it)
        delete-features        Delete features from a table in GeoMesa. Does not delete any tables or schema information.
        delete-raster          Delete a GeoMesa Raster table
        env                    Examine the current GeoMesa environment
        explain                Explain how a GeoMesa query will be executed
        export                 Export features from a GeoMesa data store
        export-bin             Export features from a GeoMesa data store in a binary format.
        gen-avro-schema        Generate an Avro schema from a SimpleFeatureType
        get-names              List GeoMesa feature types for a given catalog
        get-schema             Describe the attributes of a given GeoMesa feature type
        get-sft-config         Get the SimpleFeatureType of a feature
        help                   Show help
        ingest                 Ingest/convert various file formats into GeoMesa
        ingest-raster          Ingest raster files into GeoMesa
        keywords               Add/Remove/List keywords on an existing schema
        query-raster-stats     Export queries and statistics about the last X number of queries to a CSV file.
        remove-schema          Remove a schema and associated features from a GeoMesa catalog
        stats-analyze          Analyze statistics on a GeoMesa feature type
        stats-bounds           View or calculate bounds on attributes in a GeoMesa feature type
        stats-count            Estimate or calculate feature counts in a GeoMesa feature type
        stats-histogram        View or calculate counts of attribute in a GeoMesa feature type, grouped by sorted values
        stats-top-k            Enumerate the most frequent values in a GeoMesa feature type
        version                Display the installed GeoMesa version

This usage text lists the available commands.
To see help for an individual command, run ``geomesa help <command>``, which will give you something like
this::

    $ geomesa help get-type-names
    Usage: get-type-names [options]
      Options:
        --auths
           Accumulo authorizations
      * -c, --catalog
           Catalog table for GeoMesa datastore
        -i, --instance
           Accumulo instance name
        --keytab
           Path to Accumulo user Kerberos keytab file
        --mock
           Run everything with a mock accumulo instance instead of a real one
           Default: false
        -p, --password
           Accumulo password (will prompt if not supplied and not using Kerberos)
      * -u, --user
           Accumulo user name
        --visibilities
           Default feature visibilities
        -z, --zookeepers
           Zookeepers (host[:port], comma separated)

The Accumulo username and either a password or a path to a Kerberos keytab file are required for each
command. Specify the username and password using ``-u`` (or ``--user``) and ``-p`` (or ``--password``),
respectively. Alternatively, specify only the username on the command line and enter the password at the
prompt, which keeps the password out of the shell history. To use Kerberos authentication instead, use
``--keytab`` with the path to a Kerberos keytab file containing an entry for the specified user. Since a
keytab file allows authentication without any further credentials, it should be protected appropriately.

In nearly all of the commands below, the ``--instance``, ``--zookeepers``, ``--auths``, and
``--visibilities`` (or, in short form, ``-i``, ``-z``, ``-a``, and ``-v``) arguments can be added to
configure the Accumulo data store connector. The Accumulo instance name and Zookeepers string can usually be
determined automatically as long as Accumulo is configured correctly. The authorizations and visibilities
strings must be added as arguments to each command, if needed.
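For example, the connection options might be combined like this (the instance name, Zookeeper hosts, catalog
table, and keytab path below are placeholders for your own environment)::

    # username/password authentication (the password is prompted for if -p is omitted)
    $ geomesa get-names -u username -p password \
        -i myinstance -z zoo1,zoo2,zoo3 \
        --auths user,admin -c geomesa_catalog

    # Kerberos authentication using a keytab instead of a password
    $ geomesa get-names -u username --keytab /path/to/user.keytab \
        -i myinstance -z zoo1,zoo2,zoo3 -c geomesa_catalog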
Command overview
----------------

Tools Configuration
^^^^^^^^^^^^^^^^^^^

configure
~~~~~~~~~

Configures the current environment for using the command line tools. This is frequently run after the tools
are first installed to ensure the environment is configured correctly::

    $ geomesa configure

classpath
~~~~~~~~~

Prints out the current classpath configuration::

    $ geomesa classpath

Creating and deleting feature types
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

create-schema
~~~~~~~~~~~~~

Used to create a feature type (``SimpleFeatureType``) in a GeoMesa catalog::

    $ geomesa create-schema -u username -p password \
        -i instance -z zoo1,zoo2,zoo3 \
        -c test_create \
        -f testing \
        -s fid:String:index=true,dtg:Date,geom:Point:srid=4326 \
        --dtg dtg

get-schema
~~~~~~~~~~

Display details about the attributes of a specified feature type::

    $ geomesa get-schema -u username -p password -c test_delete -f testing

get-sft-config
~~~~~~~~~~~~~~

Get the specified feature type as a Typesafe config::

    $ geomesa get-sft-config -u username -p password -c test_catalog -f test_feature --format typesafe

Get the specified feature type as an encoded feature schema string::

    $ geomesa get-sft-config -u username -p password -c test_catalog -f test_feature --format spec

keywords
~~~~~~~~

Add or remove keywords on a specified schema. Repeat the ``-a`` or ``-r`` flags to add or remove multiple
keywords. The ``--removeAll`` option removes all keywords, and the ``-l`` option lists the schema's keywords
after all operations have completed. If a keyword contains whitespace, enclose it in quotes::

    $ geomesa keywords -u username -p password \
        -a keywordB -a keywordC -r keywordA -l \
        -i instance -z zoo1,zoo2,zoo3 \
        -c catalog -f featureTypeName

get-names
~~~~~~~~~

List all known feature types in a GeoMesa catalog::

    $ geomesa get-names -u username -p password -c test_catalog

remove-schema
~~~~~~~~~~~~~

Used to remove a feature type (``SimpleFeatureType``) from a GeoMesa catalog. This will also delete any
features of that type in the data store::

    $ geomesa remove-schema -u username -p password \
        -i instance -z zoo1,zoo2,zoo3 \
        -c test_catalog -f testfeature1

    $ geomesa remove-schema -u username -p password \
        -i instance -z zoo1,zoo2,zoo3 \
        -c test_catalog --pattern 'testfeatures\d+'

Manipulating data
^^^^^^^^^^^^^^^^^

convert
~~~~~~~

Convert files using the internal SFT (``SimpleFeatureType``) converter framework::

    $ geomesa convert --spec example --converter example-csv \
        -F json ./exampledata.csv

    $ geomesa convert -s example -C example-csv -F avro --gzip 4 \
        --max-features 10 -o exampleout.avro ./exampledata.csv

.. note::

    Output data has been converted by the internal SFT converters as defined by the provided converter
    config. This most likely means a new converter config will be required to ingest (or re-convert) the
    converted data.

Ingesting and exporting data
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. _export:

export
~~~~~~

Export GeoMesa features. The "attribute expressions" specified by the ``-a`` option are comma-separated
expressions in the format::

    attribute[=filter_function_expression]|derived-attribute=filter_function_expression

``filter_function_expression`` is a filter function applied to attributes, literals, and other filter
functions, i.e. it can be nested.
Example export commands::

    $ geomesa export -u username -p password \
        -c test_catalog -f test_feature \
        -a "geom,text,user_name" --format csv \
        -q "include" -m 100

    $ geomesa export -u username -p password \
        -c test_catalog -f test_feature \
        -a "geom,text,user_name" --format gml \
        -q "user_name='JohnSmith'"

    $ geomesa export -u username -p password \
        -c test_catalog -f test_feature \
        -a "user_name,buf=buffer(geom\, 2)" \
        --format csv -q "[[ user_name like 'John%' ] AND [ bbox(geom, 22.1371589, 44.386463, 40.228581, 52.379581, 'EPSG:4326') ]]"

For fine-grained control, query hints can be set using the ``--hints`` parameter, in the form
``key1=value1;key2=value2``. See :ref:`query_hints` and :ref:`analytic_queries` for more information.
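For example, a hint can be appended to a normal export. This sketch assumes the ``QUERY_INDEX`` hint
described in :ref:`query_hints`, which overrides the index used to run the query::

    # illustrative only: force the export query to run against the z3 index
    $ geomesa export -u username -p password \
        -c test_catalog -f test_feature \
        -q "include" --format csv \
        --hints "QUERY_INDEX=z3"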
.. _ingest:

ingest
~~~~~~

Used to convert and ingest data from various file formats as GeoMesa features. GeoMesa defines several
common converter factories for formats such as delimited text (TSV, CSV), fixed-width files, JSON, XML, and
Avro. New converter factories (e.g. for custom binary formats) can be registered on the classpath using Java
SPI. Shapefile ingest is also supported. Files can be either local or in HDFS; you cannot mix target files
(e.g. local and HDFS).

.. note::

    The header, if present, is not parsed by ``ingest`` for information. It is assumed that all lines are
    valid entries.

Converters and SFTs are specified in the HOCON format and loaded using the Typesafe configuration library.
They can be referenced by name using the ``-s`` and ``-C`` arguments. To define new converters, users can
package a ``reference.conf`` file inside a jar and drop it in the ``$GEOMESA_ACCUMULO_HOME/lib`` directory,
or add config definitions to the ``$GEOMESA_TOOLS/conf/application.conf`` file, which includes some
examples. SFT and converter specifications should use the path prefixes ``geomesa.converters.`` and
``geomesa.sfts.``

For example, here's a simple CSV file to ingest named ``example.csv``::

    FID,Name,Age,LastSeen,Friends,Lat,Lon
    23623,Harry,20,2015-05-06,"Will, Mark, Suzan",-100.236523,23
    26236,Hermione,25,2015-06-07,"Edward, Bill, Harry",40.232,-53.2356
    3233,Severus,30,2015-10-23,"Tom, Riddle, Voldemort",3,-62.23

.. note::

    ID is a reserved word; for a full list of reserved words see :ref:`reserved-words`.

To ingest this file, a SimpleFeatureType named ``renegades`` and a converter named ``renegades-csv`` can be
placed in the ``application.conf`` file::

    # cat $GEOMESA_ACCUMULO_HOME/conf/application.conf
    geomesa {
      sfts {
        renegades = {
          attributes = [
            { name = "fid",      type = "Integer",      index = false }
            { name = "name",     type = "String",       index = true  }
            { name = "age",      type = "Integer",      index = false }
            { name = "lastseen", type = "Date",         index = true  }
            { name = "friends",  type = "List[String]", index = true  }
            { name = "geom",     type = "Point",        index = true, srid = 4326, default = true }
          ]
        }
      }
      converters {
        renegades-csv = {
          type = "delimited-text"
          format = "CSV"
          options {
            skip-lines = 1 // skip the header
          }
          id-field = "toString($fid)"
          fields = [
            { name = "fid",      transform = "$1::int"                 }
            { name = "name",     transform = "$2::string"              }
            { name = "age",      transform = "$3::int"                 }
            { name = "lastseen", transform = "date('YYYY-MM-dd', $4)"  }
            { name = "friends",  transform = "parseList('string', $5)" }
            { name = "lon",      transform = "$6::double"              }
            { name = "lat",      transform = "$7::double"              }
            { name = "geom",     transform = "point($lon, $lat)"       }
          ]
        }
      }
    }

The SFT and converter can then be referenced by name, and the following commands will ingest the file::

    $ geomesa ingest -u username -p password \
        -c geomesa_catalog -i instance --threads 1 \
        -s renegades -C renegades-csv example.csv

    # use the Hadoop file system instead
    $ geomesa ingest -u username -p password \
        -c geomesa_catalog -i instance \
        -s renegades -C renegades-csv hdfs:///some/hdfs/path/to/example.csv

SFT and converter configs can also be provided as strings or filenames to the ``-s`` and ``-C`` arguments.
The syntax is very similar to the ``application.conf`` and ``reference.conf`` format. Config specifications
must be nested using the paths ``geomesa.converters.`` and ``geomesa.sfts.`` as shown below::

    # A nested SFT config provided as a string or file to the -s argument specifying
    # a type named "renegades"
    #
    # cat /tmp/renegades.sft
    geomesa.sfts.renegades = {
      attributes = [
        { name = "fid",      type = "Integer",      index = false }
        { name = "name",     type = "String",       index = true  }
        { name = "age",      type = "Integer",      index = false }
        { name = "lastseen", type = "Date",         index = true  }
        { name = "friends",  type = "List[String]", index = true  }
        { name = "geom",     type = "Point",        index = true, srid = 4326, default = true }
      ]
    }

Similarly, converter configurations must be nested when passing them directly to the ``-C`` argument::

    # a nested converter definition
    # cat /tmp/renegades.convert
    geomesa.converters.renegades-csv = {
      type = "delimited-text"
      format = "CSV"
      options {
        skip-lines = 0 // don't skip lines in distributed ingest
      }
      id-field = "toString($fid)"
      fields = [
        { name = "fid",      transform = "$1::int"                 }
        { name = "name",     transform = "$2::string"              }
        { name = "age",      transform = "$3::int"                 }
        { name = "lastseen", transform = "date('YYYY-MM-dd', $4)"  }
        { name = "friends",  transform = "parseList('string', $5)" }
        { name = "lon",      transform = "$6::double"              }
        { name = "lat",      transform = "$7::double"              }
        { name = "geom",     transform = "point($lon, $lat)"       }
      ]
    }

Using the SFT and converter config files, we can then ingest our CSV file with this command::

    # ingest command
    $ geomesa ingest -u username -p password \
        -c geomesa_catalog -i instance \
        -s /tmp/renegades.sft \
        -C /tmp/renegades.convert hdfs:///some/hdfs/path/to/example.csv
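Since the ``-s`` and ``-C`` arguments also accept literal config strings, the same ingest can be sketched by
expanding the file contents on the command line; this is plain shell substitution of the nested config files
shown above::

    # pass the nested configs as strings rather than filenames
    $ geomesa ingest -u username -p password \
        -c geomesa_catalog -i instance \
        -s "$(cat /tmp/renegades.sft)" \
        -C "$(cat /tmp/renegades.convert)" example.csv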
For more documentation on converter configuration, see :doc:`/user/convert/index`.

Shapefiles may also be ingested::

    $ geomesa ingest -u username -p password \
        -c test_catalog -f shapeFileFeatureName /some/path/to/file.shp

Enabling S3 Ingest
^^^^^^^^^^^^^^^^^^

Hadoop ships with implementations of S3-based filesystems, which can be enabled in the Hadoop configuration
used with GeoMesa tools. Specifically, GeoMesa tools can perform ingests using both the second-generation
(``s3n``) and third-generation (``s3a``) filesystems. Edit the ``$HADOOP_CONF_DIR/core-site.xml`` file in
your Hadoop installation as shown below (these instructions apply to Hadoop 2.5.0 and higher). Note that you
must have the environment variable ``$HADOOP_MAPRED_HOME`` set properly in your environment. Some
configurations can substitute ``$HADOOP_PREFIX`` in the classpath values below.

.. warning::

    AWS credentials are valuable! They pay for services and control read and write protection for data. If
    you are running GeoMesa on AWS EC2 instances, it is recommended to use the ``s3a`` filesystem. With
    ``s3a``, you can omit the Access Key Id and Secret Access Key from ``core-site.xml`` and rely on IAM
    roles instead.

For ``s3a``:

.. code-block:: xml

    <property>
      <name>mapreduce.application.classpath</name>
      <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*:$HADOOP_MAPRED_HOME/share/hadoop/tools/lib/*</value>
      <description>The classpath specifically for map-reduce jobs. This override is needed so that s3 URLs work on Hadoop 2.6.0+</description>
    </property>

    <property>
      <name>fs.s3a.access.key</name>
      <value>XXXX YOURS HERE</value>
    </property>

    <property>
      <name>fs.s3a.secret.key</name>
      <value>XXXX YOURS HERE</value>
      <description>Valuable credential - do not commit to CM</description>
    </property>

After you have enabled S3 in your Hadoop configuration you can ingest with GeoMesa tools. Note that you can
still use the Kleene star (``*``) with S3:

.. code-block:: bash

    $ geomesa ingest -u username -p password -c geomesa_catalog -i instance -s yourspec -C convert s3a://bucket/path/file*

For ``s3n``:

.. code-block:: xml

    <property>
      <name>mapreduce.application.classpath</name>
      <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*:$HADOOP_MAPRED_HOME/share/hadoop/tools/lib/*</value>
      <description>The classpath specifically for map-reduce jobs. This override is needed so that s3 URLs work on Hadoop 2.6.0+</description>
    </property>

    <property>
      <name>fs.s3n.impl</name>
      <value>org.apache.hadoop.fs.s3native.NativeS3FileSystem</value>
      <description>Tell Hadoop which class to use to access s3 URLs. This change became necessary in Hadoop 2.6.0</description>
    </property>

    <property>
      <name>fs.s3n.awsAccessKeyId</name>
      <value>XXXX YOURS HERE</value>
    </property>

    <property>
      <name>fs.s3n.awsSecretAccessKey</name>
      <value>XXXX YOURS HERE</value>
    </property>

S3n paths are prefixed in Hadoop with ``s3n://`` as shown below::

    $ geomesa ingest -u username -p password \
        -c geomesa_catalog -i instance -s yourspec \
        -C convert s3n://bucket/path/file s3n://bucket/path/*

Enabling Azure Ingest
^^^^^^^^^^^^^^^^^^^^^

Hadoop ships with implementations of Azure-based filesystems, which can be enabled in the Hadoop
configuration used with GeoMesa tools. Specifically, GeoMesa tools can perform ingests using the ``wasb``
and ``wasbs`` filesystems. Edit the ``$HADOOP_CONF_DIR/core-site.xml`` file in your Hadoop installation as
shown below (these instructions apply to Hadoop 2.5.0 and higher). In addition, the hadoop-azure and
azure-storage jars need to be available.

.. warning::

    Azure credentials are valuable! They pay for services and control read and write protection for data.
    Be sure to keep your ``core-site.xml`` configuration file safe. It is recommended that you use Azure's
    SSL-enabled protocol variant ``wasbs`` where possible.

To enable Azure, place the following in your Hadoop installation's ``core-site.xml``:
.. code-block:: xml

    <property>
      <name>fs.azure.account.key.ACCOUNTNAME.blob.core.windows.net</name>
      <value>XXXX YOUR ACCOUNT KEY</value>
    </property>

After you have enabled Azure in your Hadoop configuration you can ingest with GeoMesa tools. Note that you
can still use the Kleene star (``*``) with Azure:

.. code-block:: bash

    $ geomesa ingest -u username -p password \
        -c geomesa_catalog -i instance -s yourspec \
        -C convert wasb://CONTAINER@ACCOUNTNAME.blob.core.windows.net/files/*

.. _accumulo_tools_raster:

Working with raster data
^^^^^^^^^^^^^^^^^^^^^^^^

delete-raster
~~~~~~~~~~~~~

Delete a given GeoMesa raster table::

    $ geomesa delete-raster -u username -p password -t somerastertable -f

ingest-raster
~~~~~~~~~~~~~

Ingest one or multiple raster image files into GeoMesa. Input files, GeoTIFF or DTED, should be located on
the local file system.

.. note::

    Make sure GDAL is installed when doing chunking, which depends on the GDAL utility ``gdal_translate``.

Input raster files are assumed to have their CRS set to EPSG:4326. Files in other projections need to be
converted to EPSG:4326 before ingestion, for example with the GDAL utility
``gdalwarp -t_srs EPSG:4326 input_file out_file``.

Example usage::

    $ geomesa ingest-raster -u username -p password \
        -t geomesa_raster -f /some/local/path/to/raster.tif

query-rasterstats
~~~~~~~~~~~~~~~~~

Export queries and statistics about the ``n`` most recent raster queries to a CSV file::

    $ geomesa query-rasterstats -u username -p password -t somerastertable -n 10

Performing system administration tasks
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. _add_index_command:

add-index
~~~~~~~~~

Add or update indices for an existing feature type. This can be used to upgrade in place, converting an
older index format into the latest. See :ref:`index_upgrades` for more information.

Example usage::

    $ geomesa add-index -u username -p password -i instance \
        -z zoo1,zoo2,zoo3 -c test_catalog -f test_feature --index xz3

delete-catalog
~~~~~~~~~~~~~~

Delete a GeoMesa catalog table completely, along with all features in it.

Example usage::

    $ geomesa delete-catalog -u username -p password \
        -i instance -z zoo1,zoo2,zoo3 -c test_catalog

delete-features
~~~~~~~~~~~~~~~

Delete features from a table in GeoMesa. Does not delete any tables or schema information.

Example usage::

    $ geomesa delete-features -u username -p password \
        -i instance -z zoo1,zoo2,zoo3 -c test_catalog \
        -q 'dtg DURING 2016-02-02T00:00:00.000Z/2016-02-03T00:00:00.000Z'

add-attribute-index
~~~~~~~~~~~~~~~~~~~

Add attribute indices for a specified list of attributes::

    $ geomesa add-attribute-index -u username -p password -i instance -z zoo1,zoo2,zoo3 -c test_catalog \
        -f test_feature -a attribute1,attribute2 --coverage full

This will launch a map-reduce job creating attribute indices for each attribute listed in ``-a``. This is
essentially a convenience wrapper for invoking the job described in :ref:`attribute_indexing_job`.

env
~~~

Examines the current GeoMesa tools environment and prints out the simple feature types and converters that
are available on the current classpath. The available types can be used for ingestion; see the :ref:`ingest`
command. Running this command without parameters behaves much like the ``help`` command. Parameters let you
specify what to print: you can list all simple feature types and converters, describe all of them, or
restrict the output to a subset. A few additional options control the format used when describing simple
feature types.

Example usage::

    $ geomesa env --list-sfts
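The option names beyond ``--list-sfts`` can vary between releases; as a sketch, a companion
``--list-converters`` flag is assumed here, and ``geomesa help env`` prints the authoritative option list::

    # assumed companion flag for listing the available converters
    $ geomesa env --list-converters

    # show all options supported by the env command
    $ geomesa help env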
explain
~~~~~~~

Explain how a given GeoMesa query will be executed::

    $ geomesa explain -u username -p password \
        -c test_catalog -f test_feature \
        -q "INTERSECTS(geom, POLYGON ((41 28, 42 28, 42 29, 41 29, 41 28)))"

.. _accumulo_tools_stats_analyze:

stats-analyze
~~~~~~~~~~~~~

Analyze statistics for your data set. This may improve query planning.

Example usage::

    $ geomesa stats-analyze -u username -p password -c geomesa.data -f twitter
    Running stat analysis for feature type twitter...
    Stats analyzed:
      Total features: 8852601
      Bounds for geom: [ -171.75, -45.5903996, 157.7302, 89.99997102 ] cardinality: 2119237
      Bounds for dtg: [ '2016-02-01T00:09:12.000Z' to '2016-03-01T00:21:02.000Z' ] cardinality: 2161132
      Bounds for user_id: [ '100000215' to '99999502' ] cardinality: 861283
    Use 'stats-histogram' or 'stats-count' commands for more details

stats-bounds
~~~~~~~~~~~~

Displays the bounds of your data for different attributes. You can use pre-calculated statistics for a quick
estimate, or get the definitive result by querying the data set with the ``--no-cache`` flag.

Example usage::

    $ geomesa stats-bounds -u username -p password -i instance -z zoo1,zoo2,zoo3 \
        -c geomesa.data -f twitter
    user_id [ 100000215 to 99999502 ] cardinality: 861283
    user_name [ unavailable ]
    text [ unavailable ]
    dtg [ 2016-02-01T00:09:12.000Z to 2016-03-01T00:21:02.000Z ] cardinality: 2161132
    geom [ -171.75, -45.5903996, 157.7302, 89.99997102 ] cardinality: 2119237

    $ geomesa stats-bounds -u username -p password -i instance -z zoo1,zoo2,zoo3 \
        -c geomesa.data -f twitter --no-cache \
        -q 'BBOX(geom,-70,45,-60,55) AND dtg DURING 2016-02-02T00:00:00.000Z/2016-02-03T00:00:00.000Z'
    Running stat query...
    user_id [ 1011811424 to 99124417 ] cardinality: 115
    user_name [ bar_user to foo_user ] cardinality: 113
    text [ bar to foo ] cardinality: 180
    dtg [ 2016-02-02T00:01:07.000Z to 2016-02-02T23:59:41.000Z ] cardinality: 178
    geom [ -69.87212338, 45.01259299, -60.08925, 53.8868369 ] cardinality: 155

stats-count
~~~~~~~~~~~

Counts the features in your data set. You can count total features, or only the features that match a CQL
filter. You can use pre-calculated statistics for a quick estimate, or get the definitive result by querying
the data set with the ``--no-cache`` flag.

Example usage::

    $ geomesa stats-count -u username -p password -i instance -z zoo1,zoo2,zoo3 \
        -c geomesa.data -f twitter
    Estimated count: 8852601

    $ geomesa stats-count -u username -p password -i instance -z zoo1,zoo2,zoo3 \
        -c geomesa.data -f twitter \
        -q 'BBOX(geom,-70,45,-60,55) AND dtg DURING 2016-02-02T00:00:00.000Z/2016-02-03T00:00:00.000Z'
    Estimated count: 2681

    $ geomesa stats-count -u username -p password -i instance -z zoo1,zoo2,zoo3 \
        -c geomesa.data -f twitter --no-cache \
        -q 'BBOX(geom,-70,45,-60,55) AND dtg DURING 2016-02-02T00:00:00.000Z/2016-02-03T00:00:00.000Z'
    Running stat query...
    Count: 182

stats-top-k
~~~~~~~~~~~

Enumerates the values for attributes in your data set. You can enumerate all values for all features, or
only values for features that match a CQL filter.
Example usage::

    $ geomesa stats-top-k -u username -p password -i instance -z zoo1,zoo2,zoo3 \
        -c geomesa.data -f twitter -a user_id -k 10
    Top values for 'user_id':
      3144822634 (26681)
      388009236 (20553)
      497145453 (19858)
      563319506 (15848)
      2841269945 (15763)
      2924224280 (15731)
      141302910 (15240)
      2587789764 (14811)
      56266341 (14487)
      889599440 (14330)

stats-histogram
~~~~~~~~~~~~~~~

Counts the features in your data set, grouped into sorted bins. You may specify the number of bins to group
attributes into. You can count total features, or only the features that match a CQL filter. You can use
pre-calculated statistics for a quick estimate, or get the definitive result by querying the data set with
the ``--no-cache`` flag. If you query a histogram for a geometry attribute, the result will be displayed as
an ASCII heatmap.

Example usage::

    $ geomesa stats-histogram -u username -p password -i instance -z zoo1,zoo2,zoo3 \
        -c geomesa.data -f twitter -a dtg --bins 10
    Binned histogram for 'dtg':
      [ 2016-02-01T00:09:12.000Z to 2016-02-03T21:46:23.000Z ] 798968
      [ 2016-02-03T21:46:23.000Z to 2016-02-06T19:23:34.000Z ] 868019
      [ 2016-02-06T19:23:34.000Z to 2016-02-09T17:00:45.000Z ] 861720
      [ 2016-02-09T17:00:45.000Z to 2016-02-12T14:37:56.000Z ] 833473
      [ 2016-02-12T14:37:56.000Z to 2016-02-15T12:15:07.000Z ] 990292
      [ 2016-02-15T12:15:07.000Z to 2016-02-18T09:52:18.000Z ] 842434
      [ 2016-02-18T09:52:18.000Z to 2016-02-21T07:29:29.000Z ] 968936
      [ 2016-02-21T07:29:29.000Z to 2016-02-24T05:06:40.000Z ] 862808
      [ 2016-02-24T05:06:40.000Z to 2016-02-27T02:43:51.000Z ] 869208
      [ 2016-02-27T02:43:51.000Z to 2016-03-01T00:21:02.000Z ] 956743

    $ geomesa stats-histogram -u username -p password -i instance -z zoo1,zoo2,zoo3 \
        -c geomesa.data -f twitter -a dtg --bins 10 --no-cache
    Running stat query...
    Binned histogram for 'dtg':
      [ 2016-02-01T00:09:12.000Z to 2016-02-03T21:46:23.000Z ] 805620
      [ 2016-02-03T21:46:23.000Z to 2016-02-06T19:23:34.000Z ] 869361
      [ 2016-02-06T19:23:34.000Z to 2016-02-09T17:00:45.000Z ] 859868
      [ 2016-02-09T17:00:45.000Z to 2016-02-12T14:37:56.000Z ] 832458
      [ 2016-02-12T14:37:56.000Z to 2016-02-15T12:15:07.000Z ] 986829
      [ 2016-02-15T12:15:07.000Z to 2016-02-18T09:52:18.000Z ] 841580
      [ 2016-02-18T09:52:18.000Z to 2016-02-21T07:29:29.000Z ] 970460
      [ 2016-02-21T07:29:29.000Z to 2016-02-24T05:06:40.000Z ] 863484
      [ 2016-02-24T05:06:40.000Z to 2016-02-27T02:43:51.000Z ] 871742
      [ 2016-02-27T02:43:51.000Z to 2016-03-01T00:21:02.000Z ] 951199

config-table
~~~~~~~~~~~~

Perform various table configuration tasks. There are three sub-commands:

* **list** - List the configuration options for a GeoMesa table
* **describe** - Describe a given configuration option for a table
* **update** - Update a given configuration option for a table

Example commands::

    $ geomesa config-table list -u username -p password \
        -c test_catalog -f test_feature -t st_idx

    $ geomesa config-table describe -u username -p password \
        -c test_catalog -f test_feature -t attr_idx \
        --param table.bloom.enabled

    $ geomesa config-table update -u username -p password \
        -c test_catalog -f test_feature -t records \
        --param table.bloom.enabled -n true

configure-age-off
~~~~~~~~~~~~~~~~~

List, add, or remove age-off on a given feature type. Age-off can be configured based on ingest time, or on
a date-type simple feature attribute with the ``--dtg`` option. See :ref:`ageoff_accumulo` for more
information.
Example commands::

    $ geomesa configure-age-off -c test_catalog -f test_feature --list

    $ geomesa configure-age-off -c test_catalog -f test_feature --set --expiry '1 day'

    $ geomesa configure-age-off -c test_catalog -f test_feature --set --expiry '1 day' --dtg dtg

    $ geomesa configure-age-off -c test_catalog -f test_feature --remove

version
~~~~~~~

Prints out the version, git branch, and commit ID that the tools were built with::

    $ geomesa version