Python SVM BinaryClassificationMetrics areaUnderROC

Time:11-22

from pyspark.mllib.classification import SVMWithSGD
from pyspark.mllib.evaluation import BinaryClassificationMetrics

model = SVMWithSGD.train(trainData, 3, 50, 1)
score = model.predict(validationData.map(lambda p: p.features))
scoreAndLabel = score.zip(validationData.map(lambda p: p.label))
metrics = BinaryClassificationMetrics(scoreAndLabel)
AUC = metrics.areaUnderPR
Error:
Py4JJavaError                             Traceback (most recent call last)
<ipython-input-...> in <module>()
      3 scoreAndLabel = score.zip(validationData.map(lambda p: p.label))
      4 metrics = BinaryClassificationMetrics(scoreAndLabel)
----> 5 AUC = metrics.areaUnderPR
      6 
      7 scoreAndLabel.take(5)

/usr/local/spark/python/pyspark/mllib/evaluation.pyc in areaUnderPR(self)
     70         Computes the area under the precision-recall curve.
     71         """
---> 72         return self.call("areaUnderPR")
     73 
     74     @since('1.4.0')

/usr/local/spark/python/pyspark/mllib/common.pyc in call(self, name, *a)
    144     def call(self, name, *a):
    145         """Call method of java_model"""
--> 146         return callJavaFunc(self._sc, getattr(self._java_model, name), *a)
    147 
    148 

/usr/local/spark/python/pyspark/mllib/common.pyc in callJavaFunc(sc, func, *args)
    121     """ Call Java Function """
    122     args = [_py2java(sc, a) for a in args]
--> 123     return _java2py(sc, func(*args))
    124 
    125 

/usr/local/spark/python/lib/py4j-0.10.1-src.zip/py4j/java_gateway.py in __call__(self, *args)
    931         answer = self.gateway_client.send_command(command)
    932         return_value = get_return_value(
--> 933             answer, self.gateway_client, self.target_id, self.name)
    934 
    935         for temp_arg in temp_args:

/usr/local/spark/python/pyspark/sql/utils.pyc in deco(*a, **kw)
     61     def deco(*a, **kw):
     62         try:
---> 63             return f(*a, **kw)
     64         except py4j.protocol.Py4JJavaError as e:
     65             s = e.java_exception.toString()

/usr/local/spark/python/lib/py4j-0.10.1-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    310                 raise Py4JJavaError(
    311                     "An error occurred while calling {0}{1}{2}.\n".
--> 312                     format(target_id, ".", name), value)
    313             else:
    314                 raise Py4JError(

Py4JJavaError: An error occurred while calling o6150.areaUnderPR.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 3720.0 failed 4 times, most recent failure: Lost task 1.3 in stage 3720.0 (TID 6629, data3): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/usr/local/spark/python/pyspark/worker.py", line 172, in main
    process()
  File "/usr/local/spark/python/pyspark/worker.py", line 167, in process
    serializer.dump_stream(func(split_index, iterator), outfile)
  File "/usr/local/spark/python/pyspark/serializers.py", line 263, in dump_stream
    vs = list(itertools.islice(iterator, batch))
  File "/usr/local/spark/python/pyspark/sql/session.py", line 505, in prepare
    _verify_type(obj, schema)
  File "/usr/local/spark/python/pyspark/sql/types.py", line 1349, in _verify_type
    _verify_type(v, f.dataType, f.nullable)
  File "/usr/local/spark/python/pyspark/sql/types.py", line 1321, in _verify_type
    raise TypeError("%s can not accept object %r in type %s" % (dataType, obj, type(obj)))
TypeError: DoubleType can not accept object 1 in type <type 'int'>

	at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:193)
	at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:234)
	at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:152)
	at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)