Project: cs525-sp18-g07/spark

Commit 39215357, authored 12 years ago by Ravi Pandya
Parent: f855e4fa

    Windows command scripts for sbt and run
Showing 4 changed files with 77 additions and 1 deletion:

  core/src/main/scala/spark/deploy/worker/ExecutorRunner.scala  (+2, -1)
  run.cmd       (+2, -0)
  run2.cmd      (+68, -0)
  sbt/sbt.cmd   (+5, -0)
core/src/main/scala/spark/deploy/worker/ExecutorRunner.scala  (+2, -1)

@@ -75,7 +75,8 @@ class ExecutorRunner(
   def buildCommandSeq(): Seq[String] = {
     val command = jobDesc.command
-    val runScript = new File(sparkHome, "run").getCanonicalPath
+    val script = if (System.getProperty("os.name").startsWith("Windows")) "run.cmd" else "run";
+    val runScript = new File(sparkHome, script).getCanonicalPath
     Seq(runScript, command.mainClass) ++ command.arguments.map(substituteVariables)
   }
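The change above selects the launcher script by inspecting the `os.name` system property. A minimal sketch of that check, as a hypothetical Java translation (the class and method names here are illustrative, not part of the commit):

```java
// Hypothetical sketch of the OS check in buildCommandSeq: pick "run.cmd"
// when os.name starts with "Windows", otherwise the plain "run" script.
public class ScriptChooser {
    static String scriptFor(String osName) {
        return osName.startsWith("Windows") ? "run.cmd" : "run";
    }

    public static void main(String[] args) {
        System.out.println(scriptFor("Windows 10")); // run.cmd
        System.out.println(scriptFor("Linux"));      // run
    }
}
```

In the real code the input comes from `System.getProperty("os.name")`, so the prefix test covers "Windows 7", "Windows 10", and similar values.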
run.cmd  (new file, mode 100644, +2, -0)

@echo off
cmd /V /E /C call %~dp0run2.cmd %*
(no newline at end of file)
run2.cmd  (new file, mode 100644, +68, -0)

@echo off

set SCALA_VERSION=2.9.1

rem Figure out where the Spark framework is installed
set FWDIR=%~dp0

rem Export this as SPARK_HOME
set SPARK_HOME=%FWDIR%

rem Load environment variables from conf\spark-env.cmd, if it exists
if exist "%FWDIR%conf\spark-env.cmd" call "%FWDIR%conf\spark-env.cmd"

rem Check that SCALA_HOME has been specified
if not "x%SCALA_HOME%" == "x" goto scala_exists
  echo "SCALA_HOME is not set"
  goto exit
:scala_exists

rem If the user specifies a Mesos JAR, put it before our included one on the classpath
set MESOS_CLASSPATH=
if not "x%MESOS_JAR%" == "x" set MESOS_CLASSPATH=%MESOS_JAR%

rem Figure out how much memory to use per executor and set it as an environment
rem variable so that our process sees it and can report it to Mesos
if "x%SPARK_MEM%" == "x" set SPARK_MEM=512m

rem Set JAVA_OPTS to be able to load native libraries and to set heap size
set JAVA_OPTS=%SPARK_JAVA_OPTS% -Djava.library.path=%SPARK_LIBRARY_PATH% -Xms%SPARK_MEM% -Xmx%SPARK_MEM%
rem Load extra JAVA_OPTS from conf/java-opts, if it exists
if exist "%FWDIR%conf\java-opts.cmd" call "%FWDIR%conf\java-opts.cmd"

set CORE_DIR=%FWDIR%core
set REPL_DIR=%FWDIR%repl
set EXAMPLES_DIR=%FWDIR%examples
set BAGEL_DIR=%FWDIR%bagel

rem Build up classpath
set CLASSPATH=%SPARK_CLASSPATH%;%MESOS_CLASSPATH%;%FWDIR%conf;%CORE_DIR%\target\scala-%SCALA_VERSION%\classes
set CLASSPATH=%CLASSPATH%;%CORE_DIR%\target\scala-%SCALA_VERSION%\test-classes;%CORE_DIR%\src\main\resources
set CLASSPATH=%CLASSPATH%;%REPL_DIR%\target\scala-%SCALA_VERSION%\classes;%EXAMPLES_DIR%\target\scala-%SCALA_VERSION%\classes
for /R "%CORE_DIR%\lib" %%j in (*.jar) do set CLASSPATH=!CLASSPATH!;%%j
for /R "%FWDIR%\lib_managed\jars" %%j in (*.jar) do set CLASSPATH=!CLASSPATH!;%%j
for /R "%FWDIR%\lib_managed\bundles" %%j in (*.jar) do set CLASSPATH=!CLASSPATH!;%%j
for /R "%REPL_DIR%\lib" %%j in (*.jar) do set CLASSPATH=!CLASSPATH!;%%j
set CLASSPATH=%CLASSPATH%;%BAGEL_DIR%\target\scala-%SCALA_VERSION%\classes

rem Figure out whether to run our class with java or with the scala launcher.
rem In most cases, we'd prefer to execute our process with java because scala
rem creates a shell script as the parent of its Java process, which makes it
rem hard to kill the child with stuff like Process.destroy(). However, for
rem the Spark shell, the wrapper is necessary to properly reset the terminal
rem when we exit, so we allow it to set a variable to launch with scala.
if "%SPARK_LAUNCH_WITH_SCALA%" NEQ 1 goto java_runner
  set RUNNER=%SCALA_HOME%\bin\scala
  rem Java options will be passed to scala as JAVA_OPTS
  set EXTRA_ARGS=
  goto run_spark
:java_runner
set CLASSPATH=%CLASSPATH%;%SCALA_HOME%\lib\scala-library.jar;%SCALA_HOME%\lib\scala-compiler.jar;%SCALA_HOME%\lib\jline.jar
set RUNNER=java
if not "x%JAVA_HOME%" == "x" set RUNNER=%JAVA_HOME%\bin\java
rem The JVM doesn't read JAVA_OPTS by default so we need to pass it in
set EXTRA_ARGS=%JAVA_OPTS%

:run_spark
%RUNNER% -cp "%CLASSPATH%" %EXTRA_ARGS% %*
:exit
(no newline at end of file)
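The `for /R ... do set CLASSPATH=!CLASSPATH!;%%j` loops above accumulate every `.jar` under a directory into a `;`-separated classpath (which is why run.cmd invokes this script through `cmd /V`, enabling delayed `!CLASSPATH!` expansion). The accumulation itself can be sketched in Java; the class and method names here are illustrative, not part of the commit:

```java
import java.util.List;

// Hypothetical sketch of what the delayed-expansion loops in run2.cmd
// compute: append each jar path to a ';'-separated Windows classpath.
public class ClasspathJoin {
    static String addJars(String classpath, List<String> entries) {
        StringBuilder sb = new StringBuilder(classpath);
        for (String entry : entries) {
            if (entry.endsWith(".jar")) {       // the for /R pattern matches *.jar
                sb.append(';').append(entry);   // Windows uses ';' as the separator
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(addJars("conf", List.of("a.jar", "b.jar")));
        // conf;a.jar;b.jar
    }
}
```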
sbt/sbt.cmd  (new file, mode 100644, +5, -0)

rem @echo off
set EXTRA_ARGS=
if not "%MESOS_HOME%x" == "x" set EXTRA_ARGS=-Djava.library.path=%MESOS_HOME%\lib\java
set SPARK_HOME=%~dp0..
java -Xmx1200M -XX:MaxPermSize=200m %EXTRA_ARGS% -jar %SPARK_HOME%\sbt\sbt-launch-*.jar "%*"
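sbt.cmd only adds `-Djava.library.path` when `MESOS_HOME` is set, so plain builds launch sbt with no extra JVM flags. A minimal sketch of that conditional, as a hypothetical Java translation (names are illustrative, not part of the commit):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of sbt.cmd's EXTRA_ARGS logic: emit the Mesos
// native-library flag only when MESOS_HOME is non-empty.
public class SbtArgs {
    static List<String> extraArgs(String mesosHome) {
        List<String> args = new ArrayList<>();
        if (mesosHome != null && !mesosHome.isEmpty()) {
            args.add("-Djava.library.path=" + mesosHome + "\\lib\\java");
        }
        return args;
    }

    public static void main(String[] args) {
        System.out.println(extraArgs("C:\\mesos")); // [-Djava.library.path=C:\mesos\lib\java]
        System.out.println(extraArgs(""));          // []
    }
}
```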