
I'm trying to run this example Sentence Encoder with Apache Flink v1.17.0, but when I start the app I get this error:

Exception in thread "main" org.apache.flink.runtime.rpc.exceptions.RpcLoaderException: Could not load RpcSystem.
    at org.apache.flink.runtime.rpc.RpcSystem.load(RpcSystem.java:106)
    at org.apache.flink.runtime.minicluster.MiniCluster.lambda$new$0(MiniCluster.java:253)
    at org.apache.flink.runtime.minicluster.MiniCluster.start(MiniCluster.java:339)
    at org.apache.flink.client.program.PerJobMiniClusterFactory.submitJob(PerJobMiniClusterFactory.java:77)

I don't understand what causes it, even though a simple text server started via ncat (on Windows) works.
My pom.xml is

    <dependencies>
        <dependency>
            <groupId>ai.djl</groupId>
            <artifactId>api</artifactId>
            <version>0.23.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java</artifactId>
            <version>1.17.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.17.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>1.17.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-runtime</artifactId>
            <version>1.17.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-asm-9</artifactId>
            <version>9.5-17.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-metrics-core</artifactId>
            <version>1.17.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-guava</artifactId>
            <version>30.1.1-jre-16.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients</artifactId>
            <version>1.17.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-rpc-core</artifactId>
            <version>1.17.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-optimizer</artifactId>
            <version>1.17.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-netty</artifactId>
            <version>4.1.91.Final-17.0</version>
        </dependency>
        <dependency>
            <groupId>com.esotericsoftware</groupId>
            <artifactId>kryo</artifactId>
            <version>5.5.0</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>2.0.7</version>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.12.0</version>
        </dependency>
    </dependencies>

This looks like something new, as I haven't found any topic about this exception.
In debug mode it fails when trying to execute `Iterator iterator = ServiceLoader.load(RpcSystemLoader.class).iterator();` in the `RpcSystem` class.
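For context, this is how `ServiceLoader` resolves providers in general: it scans `META-INF/services/<interface-name>` files on the classpath, and if no provider JAR is present the iterator is simply empty, which is what triggers Flink's `RpcLoaderException`. A minimal sketch (the `Loader` interface here is hypothetical, standing in for Flink's `RpcSystemLoader`):

```java
import java.util.ServiceLoader;

public class ServiceLoaderDemo {
    // Hypothetical service interface, standing in for Flink's RpcSystemLoader.
    public interface Loader {}

    public static void main(String[] args) {
        // ServiceLoader looks for META-INF/services/<fully-qualified-interface-name>
        // entries on the classpath. No JAR registers a provider for this
        // interface, so the iterator is empty.
        ServiceLoader<Loader> loaders = ServiceLoader.load(Loader.class);
        boolean found = loaders.iterator().hasNext();
        System.out.println("provider found: " + found); // prints "provider found: false"
    }
}
```

An empty iterator at that point suggests the JAR that registers the `RpcSystemLoader` service entry is missing from the runtime classpath.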

1 Answer

You have dependencies in here which don't make a lot of sense (like flink-shaded-asm-9, flink-shaded-guava, flink-rpc-core, etc.). https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/dev/configuration/overview/ explains what you need to have in there.

In order to run a DataStream API application, I would only expect:

        <!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>

These are also the dependencies if you use the quickstart example.
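Note that `${flink.version}` is a Maven property rather than a literal version; a minimal definition matching the versions in your POM, assuming 1.17.1, would be:

```xml
<properties>
    <flink.version>1.17.1</flink.version>
</properties>
```

Keeping the version in one property makes it easy to upgrade all Flink artifacts together, which avoids the mixed-version classpaths that often cause loader errors like this one.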

Martijn Visser
  • For the example that I linked, those dependencies are needed. If I delete them I get an error like `Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/shaded/asm9/org/objectweb/asm/ClassVisitor` – Taras Danylchenko Jul 24 '23 at 18:38
  • I'm not sure how you can need them: they are strictly there for Flink to work internally. We shade them so that if users really have a requirement on those dependencies, they can include them themselves without having to deal with version clashes. If you are trying to run it locally, are you making sure that your IDE is set to include dependencies marked as `provided` on the classpath? – Martijn Visser Jul 25 '23 at 12:27
  • Yes, I enabled the "add dependencies with provided scope to the classpath" option in the run configuration settings – Taras Danylchenko Jul 26 '23 at 19:23
  • Looking at the repo you've linked, that uses Flink 1.14. I'm suspecting that it relies on internal implementations that might have been broken in Flink 1.15 or newer versions. – Martijn Visser Jul 27 '23 at 09:04
  • So, is it better to use an old version rather than the new one (1.17)? – Taras Danylchenko Jul 27 '23 at 10:01
  • If you want to use that repository, there's no other choice except asking the repo maintainers to update it to support the latest version, or forking the repo, making it compatible with the newer version, and potentially contributing that back. – Martijn Visser Jul 28 '23 at 09:44