How to implement a generic function that receives different types in Scala
NickName:Diego Sanz Ask DateTime:2017-06-08T20:57:52


    import org.apache.spark.{ SparkConf, SparkContext }
    import org.apache.spark.rdd.RDD

    class BaseType(val a: String) extends Serializable {
      override def toString = "(" + a + ")"
    }
    class TypeA(a: String, val b: String) extends BaseType(a) {
      override def toString = "(" + a + "," + b + ")"
    }
    class TypeB(a: String, val b: String) extends BaseType(a) {
      override def toString = "(" + a + "," + b + ")"
    }

    object EntityInheritance {
      def main(args: Array[String]) = {

        val sparkConf = new SparkConf()
          .setMaster("local[*]")
          .setAppName("EntityInheritance Sample")

        val sc = new SparkContext(sparkConf)

        val text_file = sc.textFile("/dqa/sample_logs/tipologies/entityInheritance.txt")
        val items = text_file.flatMap(_.split("\n"))

        val itemsRDDa = items.map(newInstanceA(_))
        itemsRDDa.foreach { rdd => println(rdd) }
        val countAa = countersAttributeA[TypeA](itemsRDDa)

        val itemsRDDb = items.map(newInstanceB(_))
        itemsRDDb.foreach { rdd => println(rdd) }
        val countBa = countersAttributeA[TypeB](itemsRDDb)

        sc.stop()
      }

      def newInstanceA(str: String): TypeA = {
        val parts = str.split(" ")
        new TypeA(parts(0), parts(1))
      }

      def newInstanceB(str: String): TypeB = {
        val parts = str.split(" ")
        new TypeB(parts(0), parts(1))
      }

      // I want to implement a generic function that receives RDD[TypeA] or RDD[TypeB]
      // it's a simple example
      def countersAttributeA[A](rdd: RDD[A]) = {
        rdd
          .map(s => (s.a, 1))
          .reduceByKey(_ + _)
      }
    }

Hello, I have a problem, though it's possible this whole idea isn't a good one.

I'm trying to implement a generic function that receives different types. When I create different objects, for example TypeA and TypeB, I want to send them to countersAttributeA -> count the number of appearances of attribute 'a', but the compiler reports this error:

[error] /src/main/scala/org/sparklambda/testing/EntityInheritance.scala:53: value a is not a member of type parameter A
[error]       .map(s => (s.a, 1))
[error]                    ^
[error] one error found

Can anyone help me? Thanks for everything.
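The error occurs because inside `countersAttributeA[A]` the compiler knows nothing about `A`, so `s.a` does not resolve. One way to make it compile (a sketch, not part of the original question) is to add an upper bound `A <: BaseType`, which tells the compiler every `A` has the `a` member. The snippet below illustrates the same fix with a plain `Seq` instead of an `RDD`, so it runs without a Spark cluster:

```scala
// Sketch, assuming the Spark-free analogue: the bound A <: BaseType
// is what exposes the `a` member to the compiler.
class BaseType(val a: String) extends Serializable
class TypeA(a: String, val b: String) extends BaseType(a)
class TypeB(a: String, val b: String) extends BaseType(a)

def countersAttributeA[A <: BaseType](items: Seq[A]): Map[String, Int] =
  items
    .map(s => (s.a, 1))                  // s.a now resolves via the upper bound
    .groupBy(_._1)                       // group pairs by the value of `a`
    .map { case (k, vs) => (k, vs.size) } // count occurrences of each `a`

val as = Seq(new TypeA("x", "1"), new TypeA("x", "2"), new TypeA("y", "3"))
val counts = countersAttributeA(as)
println(counts)
```

In the original Spark code, the same bound applies directly: `def countersAttributeA[A <: BaseType](rdd: RDD[A]) = rdd.map(s => (s.a, 1)).reduceByKey(_ + _)` should then compile for both `RDD[TypeA]` and `RDD[TypeB]`.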

Copyright Notice:Content Author:「Diego Sanz」,Reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/44436433/how-to-implement-generic-function-that-receives-differents-types-scala
