I am integrating Apache Kafka with HDFS, and I want the build to automatically use the latest version of every dependency. How do I do that?
It's better to use a distribution such as Hortonworks or Cloudera; those companies are large enough to run more thorough test suites than you could yourself – OneCricketeer Nov 09 '18 at 15:14
1 Answer
There are ways to do this, but in general it's a very bad idea to always pull in the latest versions automatically. What happens when you release to prod and the build automation grabs a new release you haven't tested with, one that happens to break your system in production? Or a release that ships with a bug?
There's a reason why most build systems (Maven, Gradle, etc.) pin dependencies to a specific version by default.
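As an illustration, a Gradle build file can express both behaviors. This is only a sketch; the Kafka artifact coordinates are real, but the version numbers are arbitrary examples. `latest.release` is Gradle's dynamic-version syntax, i.e. exactly the "always latest" behavior the answer advises against:

```groovy
dependencies {
    // Pinned: builds are reproducible and use only the version you tested.
    implementation 'org.apache.kafka:kafka-clients:2.0.0'

    // Dynamic: Gradle resolves whatever the newest release is at build time,
    // so two builds of the same commit can produce different artifacts.
    // implementation 'org.apache.kafka:kafka-clients:latest.release'
}
```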
See these other questions (and top answers) for more details about this being a bad idea:
Gradle - getting the latest release version of a dependency
How do I tell Maven to use the latest version of a dependency?

mjuarez
What if we automate using the latest version available in our library or Helm chart, or something like that? The set of tested versions will be kept up to date, and we want to handle the version bumps automatically. The automation is to be done in Python. – Arvind Abraham Nov 09 '18 at 09:08
Kafka and Hadoop don't depend on each other, but individually, upgrading either one can have significant consequences for existing data and the other systems they interact with. In general, newer Kafka clients work well with older brokers; Hadoop has less backwards compatibility @Arvind – OneCricketeer Nov 09 '18 at 15:17
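For the Python automation mentioned in the comments, a minimal sketch of a safer middle ground is to keep an explicit allowlist of versions you have tested and only upgrade past it deliberately, rather than taking whatever is newest. Everything here is hypothetical (the dependency names, the pinned versions, and the `should_upgrade` helper); it is an illustration of the idea, not a real tool:

```python
# Hypothetical allowlist of versions that have passed your own test suite.
TESTED_VERSIONS = {
    "kafka-clients": "2.0.0",
    "hadoop-client": "3.1.1",
}

def parse_version(v):
    """Split a dotted version string like '2.0.0' into a comparable tuple (2, 0, 0)."""
    return tuple(int(part) for part in v.split("."))

def should_upgrade(name, candidate):
    """Return True only if the candidate is newer than the pinned, tested
    version. The actual upgrade should still go through CI before release."""
    current = TESTED_VERSIONS[name]
    return parse_version(candidate) > parse_version(current)

print(should_upgrade("kafka-clients", "2.1.0"))  # True  -- newer, worth testing
print(should_upgrade("kafka-clients", "1.1.0"))  # False -- older than the pin
```

The point of the design is that the automation proposes upgrades but never silently deploys them; the pinned map only changes after the new version has actually been tested.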