When run on a Spark cluster, i.e. in standalone mode, the following appears in the executor logs:
18/06/28 13:28:16 WARN NettyUtil: Found Netty's native epoll transport in the classpath, but epoll is not available. Using NIO instead.
java.lang.UnsatisfiedLinkError: Could not find prefix added to io.netty.util.internal.NativeLibraryLoader to get com.datastax.shaded.netty.util.internal.NativeLibraryLoader. When shading, only adding a package prefix is supported
at com.datastax.shaded.netty.util.internal.NativeLibraryLoader.calculatePackagePrefix(NativeLibraryLoader.java:108)
at com.datastax.shaded.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:120)
at com.datastax.shaded.netty.channel.epoll.Native.loadNativeLibrary(Native.java:207)
at com.datastax.shaded.netty.channel.epoll.Native.<clinit>(Native.java:65)
at com.datastax.shaded.netty.channel.epoll.Epoll.<clinit>(Epoll.java:33)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.datastax.driver.core.NettyUtil.<clinit>(NettyUtil.java:68)
at com.datastax.driver.core.NettyOptions.eventLoopGroup(NettyOptions.java:99)
at com.datastax.driver.core.Connection$Factory.<init>(Connection.java:769)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1410)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:399)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:161)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111)
at com.datastax.spark.connector.writer.TableWriter.writeInternal(TableWriter.scala:210)
at com.datastax.spark.connector.writer.TableWriter.insert(TableWriter.scala:197)
at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:183)
at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:36)
at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:36)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Suppressed: java.lang.UnsatisfiedLinkError: Could not find prefix added to io.netty.util.internal.NativeLibraryLoader to get com.datastax.shaded.netty.util.internal.NativeLibraryLoader. When shading, only adding a package prefix is supported
at com.datastax.shaded.netty.util.internal.NativeLibraryLoader.calculatePackagePrefix(NativeLibraryLoader.java:108)
at com.datastax.shaded.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:120)
at com.datastax.shaded.netty.channel.epoll.Native.loadNativeLibrary(Native.java:210)
... 28 more
18/06/28 13:28:16 INFO Cluster: New Cassandra host scylladb.default.svc.cluster.local/10.3.0.13:9042 added
Most probably the Gemini build needs to be updated: the shaded DataStax driver relocates Netty's classes, but Netty's native epoll library cannot be loaded under the new package prefix, so the driver falls back to NIO after throwing the `UnsatisfiedLinkError` above.
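Until the build is fixed, a possible workaround (assuming the executors use DataStax Java driver 3.x, which honors the `com.datastax.driver.FORCE_NIO` system property) is to skip the native epoll probe entirely so the warning and `UnsatisfiedLinkError` never occur. A sketch of the Spark configuration; the job arguments are placeholders:

```shell
# Hypothetical invocation: pass -Dcom.datastax.driver.FORCE_NIO=true to both
# the driver and executor JVMs so the shaded DataStax driver uses NIO directly
# instead of trying to load Netty's native epoll transport.
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Dcom.datastax.driver.FORCE_NIO=true" \
  --conf "spark.executor.extraJavaOptions=-Dcom.datastax.driver.FORCE_NIO=true" \
  gemini-assembly.jar   # placeholder for the actual Gemini jar and arguments
```

A proper fix in the Gemini build itself would be to also relocate (or exclude) the `netty-transport-native-epoll` artifact when shading, since Netty's `NativeLibraryLoader` only supports a plain package-prefix relocation, as the error message states.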