  1. Jul 08, 2015
  2. May 30, 2015
    • [SPARK-7918] [MLLIB] MLlib Python doc parity check for evaluation and feature · 1617363f
      Yanbo Liang authored
      Check the MLlib Python evaluation and feature docs and make them as complete as the Scala docs.
      
      Author: Yanbo Liang <ybliang8@gmail.com>
      
      Closes #6461 from yanboliang/spark-7918 and squashes the following commits:
      
      940e3f1 [Yanbo Liang] truncate too long line and remove extra sparse
      a80ae58 [Yanbo Liang] MLlib Python doc parity check for evaluation and feature
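      For context, a minimal usage sketch of the pyspark.mllib.evaluation and pyspark.mllib.feature classes whose Python docs this change brings up to parity with the Scala docs. The example is not part of the commit; the data and app name are invented, and a local SparkContext is assumed.
      
      ~~~
      # Hedged sketch: MulticlassMetrics and StandardScaler are real MLlib 1.x
      # classes; the input data here is made up for illustration only.
      from pyspark import SparkContext
      from pyspark.mllib.evaluation import MulticlassMetrics
      from pyspark.mllib.feature import StandardScaler
      from pyspark.mllib.linalg import Vectors
      
      sc = SparkContext(appName="mllib-doc-parity-sketch")
      
      # Evaluation: metrics computed from an RDD of (prediction, label) pairs.
      prediction_and_labels = sc.parallelize([(0.0, 0.0), (1.0, 1.0), (1.0, 0.0)])
      metrics = MulticlassMetrics(prediction_and_labels)
      print(metrics.precision(), metrics.recall())
      
      # Feature: fit a scaler on an RDD of vectors, then transform the same RDD.
      vectors = sc.parallelize([Vectors.dense([1.0, 2.0]), Vectors.dense([3.0, 4.0])])
      scaled = StandardScaler(withMean=False, withStd=True).fit(vectors).transform(vectors)
      
      sc.stop()
      ~~~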
  3. May 20, 2015
  4. May 18, 2015
    • [SPARK-6657] [PYSPARK] Fix doc warnings · 1ecfac6e
      Xiangrui Meng authored
      Fixed the following warnings in `make clean html` under `python/docs`:
      
      ~~~
      /Users/meng/src/spark/python/pyspark/mllib/evaluation.py:docstring of pyspark.mllib.evaluation.RankingMetrics.ndcgAt:3: ERROR: Unexpected indentation.
      /Users/meng/src/spark/python/pyspark/mllib/evaluation.py:docstring of pyspark.mllib.evaluation.RankingMetrics.ndcgAt:4: WARNING: Block quote ends without a blank line; unexpected unindent.
      /Users/meng/src/spark/python/pyspark/mllib/fpm.py:docstring of pyspark.mllib.fpm.FPGrowth.train:3: ERROR: Unexpected indentation.
      /Users/meng/src/spark/python/pyspark/mllib/fpm.py:docstring of pyspark.mllib.fpm.FPGrowth.train:4: WARNING: Block quote ends without a blank line; unexpected unindent.
      /Users/meng/src/spark/python/pyspark/sql/__init__.py:docstring of pyspark.sql.DataFrame.replace:16: WARNING: Field list ends without a blank line; unexpected unindent.
      /Users/meng/src/spark/python/pyspark/streaming/kafka.py:docstring of pyspark.streaming.kafka.KafkaUtils.createRDD:8: ERROR: Unexpected indentation.
      /Users/meng/src/spark/python/pyspark/streaming/kafka.py:docstring of pyspark.streaming.kafka.KafkaUtils.createRDD:9: WARNING: Block quote ends without a blank line; unexpected unindent.
      ~~~
      
      davies
      
      Author: Xiangrui Meng <meng@databricks.com>
      
      Closes #6221 from mengxr/SPARK-6657 and squashes the following commits:
      
      e3f83fe [Xiangrui Meng] fix sql and streaming doc warnings
      2b4371e [Xiangrui Meng] fix mllib python doc warnings
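      The warnings above are standard Sphinx/docutils complaints about indentation that changes without an intervening blank line. The docstrings below are hypothetical (not the actual Spark sources); they only illustrate the pattern that triggers these warnings and a formatting that avoids them.
      
      ~~~
      # Hypothetical docstrings, for illustration only; the real fix edited the
      # Spark files listed in the warnings above.
      
      def ndcg_at_broken(k):
          # Sphinx reports "Unexpected indentation" for the extra-indented second
          # line, and "Block quote ends without a blank line; unexpected unindent"
          # when the third line drops back without a blank line in between.
          """Compute the NDCG metric, truncated at
                  ranking position k.
          The result is averaged over all queries.
          """
      
      def ndcg_at_fixed(k):
          # Keeping continuation text at a single indentation level (and separating
          # blocks with blank lines) keeps docutils quiet.
          """Compute the NDCG metric, truncated at ranking position k.
      
          The result is averaged over all queries.
          """
      ~~~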
  5. May 11, 2015
  6. May 10, 2015
  7. May 07, 2015
  8. Mar 05, 2015
    • [SPARK-6090][MLLIB] add a basic BinaryClassificationMetrics to PySpark/MLlib · 0bfacd5c
      Xiangrui Meng authored
      A simple wrapper around the Scala implementation. `DataFrame` is used for serialization/deserialization. Methods that return `RDD`s are not supported in this PR.
      
      davies If we recognize Scala's `Product`s in Py4J, we can easily add wrappers for Scala methods that return `RDD[(Double, Double)]`. Is it easy to register a serializer for `Product` in PySpark?
      
      Author: Xiangrui Meng <meng@databricks.com>
      
      Closes #4863 from mengxr/SPARK-6090 and squashes the following commits:
      
      009a3a3 [Xiangrui Meng] provide schema
      dcddab5 [Xiangrui Meng] add a basic BinaryClassificationMetrics to PySpark/MLlib
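      For reference, a hedged usage sketch of the wrapper added here; the scores and labels are invented and a local SparkContext is assumed. The metrics object takes an RDD of (score, label) pairs, converts it to a DataFrame under the hood, and exposes only scalar summaries in this PR.
      
      ~~~
      # Hedged sketch of the new PySpark wrapper; the scores/labels are made up.
      from pyspark import SparkContext
      from pyspark.mllib.evaluation import BinaryClassificationMetrics
      
      sc = SparkContext(appName="binary-metrics-sketch")
      
      # RDD of (score, label) pairs; the wrapper ships it to the JVM as a DataFrame.
      score_and_labels = sc.parallelize(
          [(0.1, 0.0), (0.4, 0.0), (0.6, 1.0), (0.8, 1.0)])
      metrics = BinaryClassificationMetrics(score_and_labels)
      
      print(metrics.areaUnderROC)  # scalar: area under the ROC curve
      print(metrics.areaUnderPR)   # scalar: area under the precision-recall curve
      # RDD-returning methods (e.g. the ROC curve itself) are not exposed in this PR.
      
      sc.stop()
      ~~~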