When working with Hive, it is common to build external tables whose data lives on HDFS or S3. Below we show different ways to get detailed information about your Hive tables, including the column names, the data types, the create statement, the location of the data, and so on.
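As a reminder, an external table only registers metadata in the metastore and points at data that already exists in storage. A minimal sketch of such a table, matching the example used throughout this post (the bucket and prefix are placeholders):

CREATE EXTERNAL TABLE my_table (
  name string,
  surname string,
  salary float,
  expenses float,
  age int)
STORED AS PARQUET
LOCATION 's3://my-bucket/prefix/path';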
SHOW CREATE TABLE <TABLE>
This command returns the “create statement” of the table:
show create table my_table;
createtab_stmt
CREATE EXTERNAL TABLE `my_table`(
  `name` string,
  `surname` string,
  `salary` float,
  `expenses` float,
  `age` int)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS INPUTFORMAT
  'org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION
  's3://my-bucket/prefix/path'
TBLPROPERTIES (
  'bucketing_version'='2',
  'transient_lastDdlTime'='1638024262')
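This is handy when you want to recreate the table elsewhere. If you prefer to capture the DDL without opening the Hive shell, a simple sketch using the Hive CLI (the output file name is just an example) is:

hive -e 'SHOW CREATE TABLE my_table;' > my_table_ddl.sql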
Other commands that you can use to extract information about Hive tables are:
desc formatted <table>;
describe formatted <table>;
describe extended <table>;
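For instance, describe formatted lists the columns and then a detailed section with the database, owner, location, table type, SerDe and table parameters. An illustrative (not verbatim) sketch of the output for the table above:

describe formatted my_table;

col_name      data_type   comment
name          string
surname       string
salary        float
expenses      float
age           int

# Detailed Table Information
Database:     default
Location:     s3://my-bucket/prefix/path
Table Type:   EXTERNAL_TABLE
...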