Commit bf5987cb authored by Michael McCune's avatar Michael McCune Committed by Sean Owen

[SPARK-19769][DOCS] Update quickstart instructions

## What changes were proposed in this pull request?

This change addresses the renaming of the `simple.sbt` build file to
`build.sbt`. Newer versions of the sbt tool no longer find the file
under its old name and instead look for `build.sbt`. The quickstart
instructions for self-contained applications are updated with this
change.

## How was this patch tested?

As this is a relatively minor change of a few words, the markdown was checked for syntax and spelling. Site was built with `SKIP_API=1 jekyll serve` for testing purposes.

Author: Michael McCune <msm@redhat.com>

Closes #17101 from elmiko/spark-19769.
parent d743ea4c
@@ -260,7 +260,7 @@ object which contains information about our
 application.
 
 Our application depends on the Spark API, so we'll also include an sbt configuration file,
-`simple.sbt`, which explains that Spark is a dependency. This file also adds a repository that
+`build.sbt`, which explains that Spark is a dependency. This file also adds a repository that
 Spark depends on:
 
 {% highlight scala %}
@@ -273,7 +273,7 @@ scalaVersion := "{{site.SCALA_VERSION}}"
 libraryDependencies += "org.apache.spark" %% "spark-core" % "{{site.SPARK_VERSION}}"
 {% endhighlight %}
 
-For sbt to work correctly, we'll need to layout `SimpleApp.scala` and `simple.sbt`
+For sbt to work correctly, we'll need to layout `SimpleApp.scala` and `build.sbt`
 according to the typical directory structure. Once that is in place, we can create a JAR package
 containing the application's code, then use the `spark-submit` script to run our program.
@@ -281,7 +281,7 @@ containing the application's code, then use the `spark-submit` script to run our
 # Your directory layout should look like this
 $ find .
 .
-./simple.sbt
+./build.sbt
 ./src
 ./src/main
 ./src/main/scala
...
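Only the file name changes in this commit; its contents stay the same. For reference, a minimal `build.sbt` along the lines the quickstart describes might look like the sketch below. The version strings here are illustrative placeholders — the published docs substitute `{{site.SCALA_VERSION}}` and `{{site.SPARK_VERSION}}` at site-build time.

```scala
// build.sbt — minimal sbt definition for the quickstart application.
// Version numbers are illustrative, not taken from this commit.
name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.8"

// Pull in the Spark core API as a compile-time dependency.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
```

With this file at the project root (next to `src/`), `sbt package` builds the application JAR that `spark-submit` then runs.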