You need to extend your database and processing to better deal with the problem.
The data provided by the remote service comes in different formats, as you've already noted. So you need to separate the concerns of fetching the data and parsing it, because the two are independent of each other. For example, the format for one TLD can change over time.
So first of all you fetch the plain text data per domain and store its meta-data (see the sketch after this list):
- domain
- whois server
- timestamp of fetch operation
- response
- status code (if the protocol has this)
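Here is a minimal sketch of that fetch step in Python, assuming classic WHOIS over TCP port 43 and an SQLite table for the raw records. The server name, table layout, and the "status" field (classic WHOIS has no status code, so we record our own) are illustrative assumptions, not a fixed standard:

    # Fetch the raw WHOIS text for one domain and keep it together with its metadata.
    import socket
    import sqlite3
    from datetime import datetime, timezone

    def fetch_whois(domain: str, server: str, timeout: float = 10.0) -> dict:
        """Query a WHOIS server on port 43 and return the raw response plus metadata."""
        raw = b""
        status = "ok"  # the WHOIS protocol itself has no status code, so we track our own
        try:
            with socket.create_connection((server, 43), timeout=timeout) as sock:
                sock.sendall(domain.encode("ascii") + b"\r\n")
                while chunk := sock.recv(4096):
                    raw += chunk
        except OSError as exc:
            status = f"error: {exc}"
        return {
            "domain": domain,
            "whois_server": server,
            "fetched_at": datetime.now(timezone.utc).isoformat(),
            "response": raw.decode("utf-8", errors="replace"),
            "status": status,
        }

    def store_record(db: sqlite3.Connection, record: dict) -> None:
        """Persist the unparsed response so parsing can happen later and be repeated."""
        db.execute(
            "CREATE TABLE IF NOT EXISTS whois_raw "
            "(domain TEXT, whois_server TEXT, fetched_at TEXT, response TEXT, status TEXT)"
        )
        db.execute(
            "INSERT INTO whois_raw VALUES (:domain, :whois_server, :fetched_at, :response, :status)",
            record,
        )
        db.commit()

    if __name__ == "__main__":
        con = sqlite3.connect("whois.db")
        store_record(con, fetch_whois("example.com", "whois.verisign-grs.com"))

Storing the raw response means you can re-parse everything later without fetching again, e.g. after a registry changes its output format.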
You can then do the parsing later on, in a second processing step. Use the metadata you already stored to decide which parsing algorithm you need. That also makes the application easier to maintain over time.
Once parsing succeeds, you have the data in the normalized format you are aiming for (see the sketch below).
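A sketch of that second step, assuming Python again: the stored WHOIS server name selects the parser, and the parser maps the raw text to a normalized record. The field names and the key/value pattern are assumptions; real responses vary per registry and change over time, which is exactly why the parsers are kept separate from the fetcher:

    # Choose a parser based on stored metadata and produce a normalized record.
    import re
    from typing import Callable

    def parse_verisign(text: str) -> dict:
        """Parse the key/value style used by e.g. the .com/.net registry (assumed field names)."""
        fields = {
            m.group(1).strip().lower(): m.group(2).strip()
            for m in re.finditer(r"^\s*([^:]+):\s*(.+)$", text, re.MULTILINE)
        }
        return {
            "registrar": fields.get("registrar"),
            "created": fields.get("creation date"),
            "expires": fields.get("registry expiry date"),
            "status": fields.get("domain status"),
        }

    # One parser per WHOIS server (or per TLD); extend this table as formats change.
    PARSERS: dict[str, Callable[[str], dict]] = {
        "whois.verisign-grs.com": parse_verisign,
    }

    def normalize(record: dict) -> dict:
        """Map a raw fetch record to the normalized format using its metadata."""
        parser = PARSERS.get(record["whois_server"])
        if parser is None:
            raise ValueError(f"no parser for {record['whois_server']}")
        return {"domain": record["domain"], **parser(record["response"])}

If a new TLD or a format change breaks parsing, you only add or adjust one entry in the parser table; the stored raw data stays untouched.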
Apart from these technical steps, you should pay attention to the usage conditions of the whois service(s). Not everything that is technically possible is legally or morally acceptable. Treat other people's personal records with the respect they deserve. Protect the data you collect, e.g. archive and scramble / lock away data you no longer need for your ongoing processing.