We get the following error when we try to reproduce the README example. We are using the Scala 2.11 jar downloaded from Maven and running on an AWS EMR cluster with the following version:
Spark 2.1.1 (git revision 173acdf) built for Hadoop 2.7.3-amzn-2
val clusters: Dataset[Assignment] =
MCL.train(graph).assignments
:37: error: type mismatch;
found : org.apache.spark.rdd.RDD[org.apache.spark.mllib.clustering.Assignment]
required: org.apache.spark.sql.Dataset[org.apache.spark.mllib.clustering.Assignment]
Error occurred in an application involving default arguments.
MCL.train(graph).assignments
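A possible workaround sketch, assuming the published jar's `assignments` member really returns an `RDD[Assignment]` (as the compiler error indicates) rather than the `Dataset[Assignment]` shown in the README, and assuming `Assignment` is a case class so Spark can derive an encoder for it:

```scala
import org.apache.spark.mllib.clustering.{Assignment, MCL}
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Dataset

// Option 1: accept the RDD type that the published jar actually exposes.
val clusterRDD: RDD[Assignment] = MCL.train(graph).assignments

// Option 2: convert the RDD to a Dataset to match the README's declared type.
// `spark` is the SparkSession provided by spark-shell; the implicits bring in
// the encoder needed for toDS() (assuming Assignment is a case class).
import spark.implicits._
val clusters: Dataset[Assignment] = MCL.train(graph).assignments.toDS()
```

Here `graph` is the input graph built as in the README example. This only sidesteps the type mismatch on our side; it doesn't explain why the README and the Maven artifact disagree on the return type.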