At my job we regularly need to grab data from external sources, whether via FTP, SFTP, e-mail scraping, web services, or web scraping. The formats range from screen-scraped text to CSV, XML, JSON, and XLS.
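To give a concrete picture, here is a minimal sketch of the kind of one-off downloader we write today, assuming a hypothetical CSV feed over HTTP (the URL and field names are placeholders, not our real sources):

```python
import csv
import io
from urllib.request import urlopen


def parse_csv(text: str) -> list[dict]:
    """Parse CSV text (header row first) into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))


def fetch_csv(url: str) -> list[dict]:
    """Download a CSV feed and return its rows as dictionaries."""
    with urlopen(url) as resp:  # hypothetical endpoint, e.g. a vendor feed
        text = resp.read().decode("utf-8")
    return parse_csv(text)


# Offline demonstration with sample data (no network needed):
sample = "id,name\n1,alpha\n2,beta\n"
rows = parse_csv(sample)
print(rows)  # [{'id': '1', 'name': 'alpha'}, {'id': '2', 'name': 'beta'}]
```

Each new source or format (SFTP, XML, XLS, etc.) gets its own small program along these lines.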
A new leader has now entered the picture and is appalled that we write a new program (in Java, C#, etc.) for each type of downloader. This leader has condemned the practice in favor of using "off the shelf software" and "pulling data with databases".
What are others' opinions of each approach? What reliable tools are out there that can eliminate coding downloaders? Are we really that "backwards" in our current ways?