I am working with a huge data set, and it grows every day. The fear of losing data gathered with so much effort can give you sleepless nights. One way to manage this risk is to take backups frequently; whenever you need to reload the data, you can simply restore from the backup.
Creating a backup file
pg_dump dbname > outfile
Loading from a backup file
psql dbname < infile
Please note that dbname must be created before the backup file is loaded (for example with createdb). Restoring the dump then recreates both the schema and the data of the original database.
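Since frequent backups are the whole point, the dump command is worth automating. A crontab entry like the one below is one way to do it; the database name, schedule, and backup directory are hypothetical, so adjust them to your setup (note that % must be escaped in crontab files):

```shell
# Hypothetical crontab entry: dump the "mydb" database every night
# at 02:00 into a date-stamped file under /var/backups/postgres.
0 2 * * * pg_dump mydb > /var/backups/postgres/mydb_$(date +\%F).sql
```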
Dealing with huge datasets
If you are dealing with huge databases, you can compress the dump with gzip:
pg_dump dbname | gzip > filename.gz
and restore it with either of the following equivalent commands:
gunzip -c filename.gz | psql dbname
cat filename.gz | gunzip | psql dbname
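Before trusting the compressed pipeline with a real dump, it is easy to sanity-check that the gzip round trip is lossless. The sketch below uses a small stand-in file (sample.sql is a hypothetical name); with a live server you would pipe real pg_dump output instead:

```shell
# Create a stand-in "dump" file to test the compression pipeline.
printf 'CREATE TABLE t (id int);\nINSERT INTO t VALUES (1);\n' > sample.sql

# Compress it the same way the pg_dump output is compressed above.
gzip -c sample.sql > sample.sql.gz

# Decompress with both variants shown above; they are equivalent.
gunzip -c sample.sql.gz > restored1.sql
cat sample.sql.gz | gunzip > restored2.sql

# Both restored copies match the original byte for byte.
cmp sample.sql restored1.sql
cmp sample.sql restored2.sql
```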
Alternatively, you can split the dump into pieces of a fixed size if you are not comfortable with compression or face a file-size limit. The following splits the output into 1 MB chunks:
pg_dump dbname | split -b 1m - filename
and reload the pieces with
cat filename* | psql dbname
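The split/reassemble cycle works because cat concatenates the pieces in lexical suffix order (aa, ab, ac, ...), which is the order split wrote them. The sketch below checks this with a stand-in file and a tiny 16-byte chunk size so that several pieces are actually produced; the names sample.sql and piece_ are hypothetical, and with a real dump you would use the 1m chunk size shown above:

```shell
# Create a stand-in "dump" file to test the split pipeline.
printf 'CREATE TABLE t (id int);\nINSERT INTO t VALUES (1);\n' > sample.sql

# Split into 16-byte pieces (piece_aa, piece_ab, ...) so the
# reassembly order is exercised; real dumps would use -b 1m.
split -b 16 sample.sql piece_

# Reassemble in lexical order and confirm the result is identical.
cat piece_* > rejoined.sql
cmp sample.sql rejoined.sql
```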