My company is building a database from a large publicly available data set. When we complete it, we will have something like 500GB of data, but the data will never grow beyond that. It takes advantage of Postgres's polygon manipulation features, and as such has to stay in Postgres.
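For context, here is a minimal sketch of the kind of Postgres-native polygon query we depend on, using the built-in geometric types and the `@>` containment operator. The table and column names (`regions`, `boundary`) are hypothetical, and the query is only built here rather than executed, since running it needs a live database:

```python
# Sketch of the kind of Postgres-native polygon query we rely on.
# "regions" and "boundary" are hypothetical names; @> is Postgres's
# built-in containment operator for its native polygon/point types.

def point_in_region_sql(x, y):
    # Parameterized query: which regions' polygons contain the point (x, y)?
    sql = "SELECT id FROM regions WHERE boundary @> point(%s, %s)"
    return sql, (x, y)

# With a driver like psycopg2 this would run as:
#   cur.execute(*point_in_region_sql(1.5, 2.0))
query, params = point_in_region_sql(1.5, 2.0)
```

Because these operators run inside the database engine, moving the data to a store without native geometric types would mean reimplementing this logic in the application layer.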
How can we host this database in the most cost-effective way possible?
EDIT: I should mention that we want to host this database in the cloud, since we don't have our own on-site servers.
EDIT 2: Sorry, let me elaborate. This database will be integrated into a SaaS web app, so potentially many users will be hitting it at the same time. Once it's in place, though, the data will rarely change, and when it does, rows will only ever be added, never deleted. Something like Linode, which we use to host the rest of the site, doesn't offer enough storage. We want to optimize for cost, but secondarily we'd prefer not to touch any hardware ourselves, so buying a large drive would be less than ideal.