    [SPARK-5124][Core] A standard RPC interface and an Akka implementation · a8d53afb
    zsxwing authored
    This PR adds a standard internal RPC interface for Spark, along with an Akka-based implementation. See [the design document](https://issues.apache.org/jira/secure/attachment/12698710/Pluggable%20RPC%20-%20draft%202.pdf) for more details.
    
    I will split the whole work into multiple PRs to make code review easier. This is the first PR, and it avoids touching too many files.
    
    Author: zsxwing <zsxwing@gmail.com>
    
    Closes #4588 from zsxwing/rpc-part1 and squashes the following commits:
    
    fe3df4c [zsxwing] Move registerEndpoint and use actorSystem.dispatcher in asyncSetupEndpointRefByURI
    f6f3287 [zsxwing] Remove RpcEndpointRef.toURI
    8bd1097 [zsxwing] Fix docs and the code style
    f459380 [zsxwing] Add RpcAddress.fromURI and rename urls to uris
    b221398 [zsxwing] Move send methods above ask methods
    15cfd7b [zsxwing] Merge branch 'master' into rpc-part1
    9ffa997 [zsxwing] Fix MiMa tests
    78a1733 [zsxwing] Merge remote-tracking branch 'origin/master' into rpc-part1
    385b9c3 [zsxwing] Fix the code style and add docs
    2cc3f78 [zsxwing] Add an asynchronous version of setupEndpointRefByUrl
    e8dfec3 [zsxwing] Remove 'sendWithReply(message: Any, sender: RpcEndpointRef): Unit'
    08564ae [zsxwing] Add RpcEnvFactory to create RpcEnv
    e5df4ca [zsxwing] Handle AkkaFailure(e) in Actor
    ec7c5b0 [zsxwing] Fix docs
    7fc95e1 [zsxwing] Implement askWithReply in RpcEndpointRef
    9288406 [zsxwing] Document thread-safety for setupThreadSafeEndpoint
    3007c09 [zsxwing] Move setupDriverEndpointRef to RpcUtils and rename to makeDriverRef
    c425022 [zsxwing] Fix the code style
    5f87700 [zsxwing] Move the logical of processing message to a private function
    3e56123 [zsxwing] Use lazy to eliminate CountDownLatch
    07f128f [zsxwing] Remove ActionScheduler.scala
    4d34191 [zsxwing] Remove scheduler from RpcEnv
    7cdd95e [zsxwing] Add docs for RpcEnv
    51e6667 [zsxwing] Add 'sender' to RpcCallContext and rename the parameter of receiveAndReply to 'context'
    ffc1280 [zsxwing] Rename 'fail' to 'sendFailure' and other minor code style changes
    28e6d0f [zsxwing] Add onXXX for network events and remove the companion objects of network events
    3751c97 [zsxwing] Rename RpcResponse to RpcCallContext
    fe7d1ff [zsxwing] Add explicit reply in rpc
    7b9e0c9 [zsxwing] Fix the indentation
    04a106e [zsxwing] Remove NopCancellable and add a const NOP in object SettableCancellable
    2a579f4 [zsxwing] Remove RpcEnv.systemName
    155b987 [zsxwing] Change newURI to uriOf and add some comments
    45b2317 [zsxwing] A standard RPC interface and An Akka implementation
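
A minimal, self-contained Scala sketch of the abstractions this PR introduces, reconstructed from the commit subjects above and the linked design document. The type and method names (RpcEnv, RpcEndpoint, RpcEndpointRef, RpcCallContext, RpcEnvFactory, send/ask, receiveAndReply, sendFailure, uriOf, the onXXX network-event callbacks) all appear in the commit log, but the exact signatures shown here are assumptions, not the merged API:

// Sketch only: signatures inferred from the commit log, not copied from the PR.
import scala.concurrent.Future

// An endpoint address; RpcAddress.fromURI was added in f459380.
case class RpcAddress(host: String, port: Int)

// A reference to a (possibly remote) endpoint: fire-and-forget 'send', or
// 'ask' for a request/reply round trip (askWithReply, 7fc95e1).
trait RpcEndpointRef {
  def send(message: Any): Unit
  def ask[T](message: Any): Future[T]
}

// Handed to receiveAndReply so an endpoint can reply explicitly (fe7d1ff);
// 'sendFailure' was renamed from 'fail' (ffc1280) and 'sender' added (51e6667).
trait RpcCallContext {
  def reply(response: Any): Unit
  def sendFailure(e: Throwable): Unit
  def sender: RpcEndpointRef
}

// An endpoint registered in an RpcEnv; the onXXX callbacks replace the
// network-event companion objects (28e6d0f).
trait RpcEndpoint {
  def receive: PartialFunction[Any, Unit]
  def receiveAndReply(context: RpcCallContext): PartialFunction[Any, Unit]
  def onConnected(remoteAddress: RpcAddress): Unit = {}
  def onDisconnected(remoteAddress: RpcAddress): Unit = {}
}

// The environment hosting endpoints; 'uriOf' was renamed from newURI (155b987),
// and refs can be looked up synchronously or asynchronously by URI (2cc3f78).
trait RpcEnv {
  def setupEndpoint(name: String, endpoint: RpcEndpoint): RpcEndpointRef
  def setupEndpointRefByURI(uri: String): RpcEndpointRef
  def asyncSetupEndpointRefByURI(uri: String): Future[RpcEndpointRef]
  def uriOf(systemName: String, address: RpcAddress, endpointName: String): String
}

// Concrete environments (e.g. the Akka one in this PR) come from a factory (08564ae).
trait RpcEnvFactory {
  def create(): RpcEnv
}

Under this design a caller registers an endpoint with setupEndpoint and talks to it only through an RpcEndpointRef, so the Akka transport behind RpcEnv can later be swapped for another implementation without touching callers.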
MimaExcludes.scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import com.typesafe.tools.mima.core._
import com.typesafe.tools.mima.core.ProblemFilters._

/**
 * Additional excludes for checking of Spark's binary compatibility.
 *
 * The MiMa build will automatically exclude @DeveloperApi and @Experimental classes. This acts
 * as an official audit of cases where we excluded other classes. Please use the narrowest
 * possible exclude here. MiMa will usually tell you what exclude to use, e.g.:
 *
 * ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.rdd.RDD.take")
 *
 * It is also possible to exclude Spark classes and packages. This should be used sparingly:
 *
 * MimaBuild.excludeSparkClass("graphx.util.collection.GraphXPrimitiveKeyOpenHashMap")
 */
object MimaExcludes {
    def excludes(version: String) =
      version match {
        case v if v.startsWith("1.4") =>
          Seq(
            MimaBuild.excludeSparkPackage("deploy"),
            MimaBuild.excludeSparkPackage("ml"),
            // SPARK-5922 Adding a generalized diff(other: RDD[(VertexId, VD)]) to VertexRDD
            ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.graphx.VertexRDD.diff"),
            // These are needed if checking against the sbt build, since they are part of
            // the maven-generated artifacts in 1.3.
            excludePackage("org.spark-project.jetty"),
            MimaBuild.excludeSparkPackage("unused"),
            ProblemFilters.exclude[MissingClassProblem]("com.google.common.base.Optional"),
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.rdd.JdbcRDD.compute"),
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.broadcast.HttpBroadcastFactory.newBroadcast"),
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast"),
            // Presumably removed when this commit migrated OutputCommitCoordinator
            // from its internal Akka actor to the new RPC interface (cf. 9ffa997,
            // "Fix MiMa tests")
            ProblemFilters.exclude[MissingClassProblem](
              "org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorActor")
          ) ++ Seq(
            // SPARK-6510 Add a Graph#minus method acting as Set#difference
            ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.graphx.VertexRDD.minus")
          )

        case v if v.startsWith("1.3") =>
          Seq(
            MimaBuild.excludeSparkPackage("deploy"),
            MimaBuild.excludeSparkPackage("ml"),
            // These are needed if checking against the sbt build, since they are part of
            // the maven-generated artifacts in the 1.2 build.
            MimaBuild.excludeSparkPackage("unused"),
            ProblemFilters.exclude[MissingClassProblem]("com.google.common.base.Optional")
          ) ++ Seq(
            // SPARK-2321