Apache Spark Cassandra connector error: Timed out waiting for server response

When I run a Spark job that saves data into Cassandra using spark-cassandra-connector_2.11-2.0.0-M3 and spark-cassandra-connector-java_2.10-1.6.0-M1, the following error appears in the log file.

I have a 3-node Cassandra (v3.9.0) cluster, each node on a separate machine. Spark is running on another machine (devhost1 / 10.0.0.14).

ERROR QueryExecutor:72 - Failed to execute: com.datastax.spark.connector.writer.RichBoundStatement@5598b553
com.datastax.driver.core.exceptions.OperationTimedOutException: [devhost2/10.0.0.15] Timed out waiting for server response
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onTimeout(RequestHandler.java:753)
    at com.datastax.driver.core.Connection$ResponseHandler$1.run(Connection.java:1267)
    at io.netty.util.HashedWheelTimer$HashedWheelTimeout.expire(HashedWheelTimer.java:581)
    at io.netty.util.HashedWheelTimer$HashedWheelBucket.expireTimeouts(HashedWheelTimer.java:655)
    at io.netty.util.HashedWheelTimer$Worker.run(HashedWheelTimer.java:367)
    at java.lang.Thread.run(Thread.java:748)
2017-07-26 23:54:18 ERROR QueryExecutor:72 - Failed to execute: com.datastax.spark.connector.writer.RichBoundStatement@4e9c5c03
com.datastax.driver.core.exceptions.OperationTimedOutException: [devhost3/10.0.0.16] Timed out waiting for server response
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onTimeout(RequestHandler.java:753)
    at com.datastax.driver.core.Connection$ResponseHandler$1.run(Connection.java:1267)
    at io.netty.util.HashedWheelTimer$HashedWheelTimeout.expire(HashedWheelTimer.java:581)
    at io.netty.util.HashedWheelTimer$HashedWheelBucket.expireTimeouts(HashedWheelTimer.java:655)
    at io.netty.util.HashedWheelTimer$Worker.run(HashedWheelTimer.java:367)
    at java.lang.Thread.run(Thread.java:748)

The connection is created on devhost1 / 10.0.0.14 using SparkConf and SparkSession, with "spark.cassandra.connection.host" set to "10.0.0.14,10.0.0.15,10.0.0.16" and the master set to "spark://10.0.0.14:7077".
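For reference, the setup described above looks roughly like this. This is only a sketch of my configuration; the keyspace/table names (`my_ks`, `my_table`) and the `Person` case class are placeholder examples, not the actual schema:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import com.datastax.spark.connector._

// Placeholder row type; the real table schema differs.
case class Person(id: Int, name: String)

object SaveToCassandra {
  def main(args: Array[String]): Unit = {
    // SparkConf created on devhost1 (10.0.0.14), as described above
    val conf = new SparkConf()
      .setAppName("cassandra-save")
      .setMaster("spark://10.0.0.14:7077")
      .set("spark.cassandra.connection.host", "10.0.0.14,10.0.0.15,10.0.0.16")

    val spark = SparkSession.builder().config(conf).getOrCreate()

    // The write that triggers the OperationTimedOutException in the log
    val rows = spark.sparkContext.parallelize(Seq(Person(1, "a"), Person(2, "b")))
    rows.saveToCassandra("my_ks", "my_table")

    spark.stop()
  }
}
```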

Spark version: 2.0.0, Java: 1.8.0_131

Please help. Thanks.