Parser Alternative Operator | Fails
| parses ONE side only (patterns OR filter, but not both). Try a new rule like the following (change the types as needed; I used _):

    def patternOrFilterAll: Parser[_] = rep1(patternOrFilter)

You may need to change patternRepetition and patternOrFilter to get the return value you want. Edit: Your parser does not work because of the usage of ~!. If you use ~ instead, then your example will work (e.g. http:/
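A minimal sketch of the difference (the rule and token names here are invented, not taken from the question): ~! commits to its branch once its left side succeeds, so | can no longer backtrack into the alternative, while plain ~ allows backtracking.

    import scala.util.parsing.combinator.RegexParsers

    object AltDemo extends RegexParsers {
      def word: Parser[String] = "pattern"
      def filter: Parser[String] = "filter"

      // On input "pattern": ~! commits once `word` succeeds, so the failure
      // of `filter` aborts the whole alternation instead of retrying.
      def committed: Parser[Any] = word ~! filter | word

      // Plain ~ produces an ordinary Failure, which | recovers from by
      // trying the second alternative, so this parses "pattern" fine.
      def backtracking: Parser[Any] = word ~ filter | word
    }

    // AltDemo.parse(AltDemo.committed, "pattern")    -> Error (no backtracking past ~!)
    // AltDemo.parse(AltDemo.backtracking, "pattern") -> parsed: pattern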

Categories : Scala

ScalaTest assertion mismatch due to Physical Address
The problem is the values in your SortedMap:

    scala> Array(42) == Array(42)
    res0: Boolean = false

Array does not provide a friendly equals implementation. Edit: also, Array is a mutable structure; it is usually not recommended to use them when passing messages between actors.
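For reference, a short sketch of the usual workarounds: compare arrays element-wise, or use an immutable collection with structural equality.

    Array(42) == Array(42)             // false: Array inherits reference equality from AnyRef
    Array(42).sameElements(Array(42))  // true: element-wise comparison
    List(42) == List(42)               // true: immutable collections compare structurally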

Categories : Scala

Scala implicit parameter and japanese smiley 'foldLeft'
Two answers: _ + _ is a placeholder for the function that takes two arguments and adds them; the underscore marks the position of an argument in such a syntax. You can read this for all the uses of underscore in Scala. The implicit keyword denotes an implicit argument: it means that in places where you require an Account, one may be provided using the constructor without explicitly giving
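A brief sketch of both points (the Account class below is hypothetical, invented to mirror the question):

    // _ + _ expands to (x, y) => x + y, e.g. when folding:
    List(1, 2, 3).foldLeft(0)(_ + _)  // 6

    // An implicit parameter is filled in by the compiler from implicit scope:
    case class Account(owner: String)
    def describe(implicit acc: Account): String = s"account of ${acc.owner}"

    implicit val current: Account = Account("alice")
    describe  // the compiler passes `current` implicitly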

Categories : Scala

Is it possible to use reduceByKey((x, y, z) => ...)?
If you look at the reduceByKey function on PairRDDFunctions, it looks like:

    def reduceByKey(func: (V, V) => V): RDD[(K, V)]

Hence, it's not possible to have it work on a 3-tuple. However, you can wrap your 3-tuple into a model and still keep your first string as the key, making your RDD an RDD[(String, YourModel)], and now you can aggregate the model in whatever way you want. Hope this helps.
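A minimal sketch of that wrapping, assuming a hypothetical input of (String, Int, Int) triples and summing both numeric fields:

    import org.apache.spark.rdd.RDD

    case class Model(a: Int, b: Int)

    val triples: RDD[(String, Int, Int)] = ???  // your source RDD

    val reduced: RDD[(String, Model)] =
      triples
        .map { case (k, a, b) => (k, Model(a, b)) }              // key plus wrapped value
        .reduceByKey((m1, m2) => Model(m1.a + m2.a, m1.b + m2.b))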

Categories : Scala

How to implement security Authorization using scala and play?
If you build an application intended for production: don't do it. Use one of the many frameworks out there:

Deadbolt2: https://github.com/schaloner/deadbolt-2
SecureSocial: http://www.securesocial.ws/
Silhouette: http://silhouette.mohiva.com/

They are also a great starting point to look for best practices. If you want to do it mainly for learning and there are no real security concerns, go

Categories : Scala

SSO login using scala script
I'm not sure I understand your problem correctly, but to log in several users you should put your token into a correlation parameter in the script, because it cannot be the same for different users. P.S. In my usage of Gatling I do not use opentoken for login at all. But as far as I know, you can get this token from the response of a previous request -> save it into a correlation variable -> use it as a parameter.
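A hedged Gatling sketch of that capture-and-reuse flow (the URLs, the opentoken field name, and the CSS selector are assumptions, not taken from the question):

    import io.gatling.core.Predef._
    import io.gatling.http.Predef._

    val scn = scenario("SSO login")
      .exec(
        http("fetch login page")
          .get("/login")
          // capture the per-user token from the response
          .check(css("input[name='opentoken']", "value").saveAs("opentoken"))
      )
      .exec(
        http("submit login")
          .post("/login")
          .formParam("opentoken", "${opentoken}")  // replay each user's own token
      )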

Categories : Scala

Sum elements based on a predicate
There are a number of ways; one would be with fold, as you said:

    scala> List(Some(1), None, Some(3), None, Some(8))
    res0: List[Option[Int]] = List(Some(1), None, Some(3), None, Some(8))

    scala> res0.foldLeft(0)(_ + _.map(_ => 1).getOrElse(0))
    res1: Int = 3

Using fold allows one iteration; mapping and then using sum takes two iterations, one for map and the other for sum, since the latter
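For this particular predicate (the element is defined), a single-pass count is an equivalent alternative:

    List(Some(1), None, Some(3), None, Some(8)).count(_.isDefined)  // 3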

Categories : Scala

Keep track of completed Futures
I would suggest using the standard Java AtomicInteger. You can increment it using the incrementAndGet() method and obtain the current value via its get() method.

    import java.util.concurrent.atomic.AtomicInteger
    ...
    val completed = new AtomicInteger()
    val futures = for (i <- 0 until nSteps) yield future {
      ...
      val content = blocking { ... http request ... }
      process(content)
      completed.incrementAndGet()
    }

Categories : Scala

API Observable with dynamic caching
It is possible to use Scheduler workers to reschedule a call to getContracts:

    Observable[Set[EveContract]](observer ⇒ {
      val worker = Schedulers.newThread().createWorker()
      def scheduleContracts(delay: Long) {
        worker.schedule(new Action0 {
          override def call() {
            if (!observer.isUnsubscribed) {
              val delay = getContracts(observer)

Categories : Scala

java.io.IOException: Remotely closed in gatling
That's probably not a Gatling issue, but a real error with your server forcefully closing the connection while Gatling is writing to it. It could be that your server cannot withstand the load and aggressively closes random connections, or that some network component is misconfigured.

Categories : Scala

Scala permutations using two lists
Trickier than I thought! The first step is to calculate the n-times Cartesian product of l2, which can be done using a combination of List.fill, combinations, and permutations (I have a hard time believing that there is no easier way to do this, but I haven't found any):

    def prod[T](lst: List[T], n: Int) =
      List.fill(n)(lst).flatten.combinations(n).flatMap(_.permutations)

The value of n is deter
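A small usage check of prod, comparing as a set since the iteration order is an implementation detail:

    prod(List(1, 2), 2).toSet
    // Set(List(1, 1), List(1, 2), List(2, 1), List(2, 2))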

Categories : Scala

Is Scala Either really a Monad
Yes, it really is - otherwise it would be in scalaz-outlaws. Either's bind is defined something like:

    trait Either[A, B] {
      def bind[C](f: B => Either[A, C]) = this match {
        case Right(b) => f(b)
        case Left(a) => Left(a)
      }
    }

(in practice it's defined via a typeclass, but the above definition would work) I guess it's more proper to say that for a fixed A, the type ({type L[B] = Either[A, B]})#L is a monad.

Categories : Scala

Spark: Use of distinct
You need to introduce some concept of "first": an RDD is a (distributed) set, not an ordered list. So, given a function like:

    def first(t1, t2):
        return ...  # your logic here to choose between e.g. (aaa,1,2) and (sss,3,4)

you can simply:

    theRdd.reduceByKey(first)
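The same idea in Scala, as a hedged sketch (the value type and the tie-break rule are assumptions):

    import org.apache.spark.rdd.RDD

    val theRdd: RDD[(String, (String, Int, Int))] = ???  // keyed records

    // "first" must be an associative, commutative choice, e.g. keep the
    // record whose string field sorts lower:
    val firstPerKey = theRdd.reduceByKey((t1, t2) => if (t1._1 <= t2._1) t1 else t2)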

Categories : Scala

Identifying two type wildcards as identical
You are not using a raw type here, you are using a wildcard type. This is similar to Java's:

    List<?> list = null;
    list.add(list.get(0));

It does not compile, as the compiler works in several steps: the value n.value is returned, and the compiler marks the value as representing a wildcard; then the method n.perform is executed, where the compiler only knows that the argument is another wildcard. Two

Categories : Scala

how to package spark scala application
I use sbt as my packaging and compiling tool. With sbt, I use the "assembly" plugin to package dependencies into the final jar file; I recommend it. If you use Maven, I believe the "shade" plugin does a similar job. See the usage page of shade and you will be inspired: http://maven.apache.org/plugins/maven-shade-plugin/usage.html Basically, you need to change the "build" part in your pom.xml file like this: <
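For the sbt route, a minimal sketch of wiring in sbt-assembly (the plugin version is an assumption; check the plugin's README for a current one):

    // project/plugins.sbt
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

After that, running sbt assembly produces a single fat jar containing your classes and their dependencies.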

Categories : Scala

Pattern Match on Scala `class`
The argument of unapply should be an instance of the type you want to match:

    class Foo(val x: Int)

    object Foo {
      def unapply(f: Foo): Option[Int] = Some(f.x)
    }
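With that extractor in scope, a plain (non-case) class can be pattern matched:

    new Foo(42) match {
      case Foo(x) => println(x)  // prints 42
    }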

Categories : Scala

Type mismatch when using higher-kinded types
Type lambdas should help:

    class Bar[T[_]] {
      def bar[A]: Option[T[A]] = None
    }

    def foo[A] = {
      new Bar[({ type M[B] = Map[A, B] })#M]
    }

    val f: Option[Map[String, Int]] = foo[String].bar[Int]

However, I can't answer why type T doesn't work in this case.

Categories : Scala

Scala List match last element
You need to switch the order: the x :: xs pattern will eagerly match anything in that position, so matching x :: Nil first makes that case reachable again. Also, you'll still need to match Nil by itself to account for empty lists.

    def stringify(list: List[String]): String = list match {
      case x :: Nil => x
      case x :: xs => x + ":" + stringify(xs)
      case Nil => ""
    }

Thoug
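A quick check of the behaviour:

    stringify(List("a", "b", "c"))  // "a:b:c"
    stringify(List("a"))            // "a"
    stringify(Nil)                  // ""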

Categories : Scala

Composing Futures with For Comprehension
It looks like you're trying to compose Futures with Vector. All generators in a Scala for comprehension have to be of the same higher-kinded type, which in your case is Future. When you unroll the 'sugar' of the for comprehension, it's just calling flatMap on everything:

    for {
      categories <- categoriesFuture
      // I'm not sure what the return type is here, but I'm guessing it's a future as well
      categoryIdsWith
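A hedged sketch of the usual fix, with invented names: lift any plain value into a Future with Future.successful so every generator has the same type.

    import scala.concurrent.{ExecutionContext, Future}
    import ExecutionContext.Implicits.global

    val categoriesFuture: Future[Vector[String]] = Future.successful(Vector("a", "b"))

    val result: Future[Vector[String]] = for {
      categories <- categoriesFuture
      upper      <- Future.successful(categories.map(_.toUpperCase))  // lift the pure step
    } yield upper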

Categories : Scala

Converting a column from a schemaRDD to a string array
I may not understand correctly what you are looking for, but I'll try. You need to read a line of String and split the String into an array by spaces:

    val converted = jsonFiles.map(line => line(7).split(" "))
    converted.collect

The problem here is that (depending on what you are doing) "Exception in task 1.0 in stage 2.0 (TID 5)" should stay as one String, not be split apart. To do this we need: split

Categories : Scala

Apache Spark map and reduce with passing values
Maybe you could use a class and work with it through the flow. I mean, define a RevenueHour case class:

    case class RevenueHour(date: java.util.Date, revenue: Long, id: String)

Then build an intermediate RevenueHour in the map phase, and then another one in the reduce phase:

    val map: RDD[(Date, RevenueHour)] = orders.map(row => (
      getDateAs("hour", row.getDate("date")),
      RevenueHour(

Categories : Scala

Getting the Correct Iterator
outer.iterator will always give you a new iterator. You need to create one and stash it somewhere, then use that single stashed one rather than creating a new one every time:

    new Iterator[A] {
      val outerIterator = outer.iterator
      override def hasNext = ...
    }

Categories : Scala

DI Binding to a subclass
It is possible to achieve this with scaldi, but the problem is with the way Environment is defined and used. I assume (at least the error tells me this) that Environment is invariant in both of its type arguments. This is the problem, because you want to treat Environment[User, Authenticator] as a superclass of Environment[User, SessionAuthenticator]. That is not the case if both type arguments are def
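A variance sketch of the point (all class names below are stand-ins, not scaldi's or Silhouette's API):

    trait Authenticator
    class SessionAuthenticator extends Authenticator
    class User

    class InvariantEnv[I, A]   // invariant in A: no subtyping between instantiations
    class CovariantEnv[I, +A]  // covariant in A

    val ok: CovariantEnv[User, Authenticator] =
      new CovariantEnv[User, SessionAuthenticator]  // compiles only thanks to +A
    // The equivalent assignment with InvariantEnv would be a type error.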

Categories : Scala

Akka - Test that function from test executed
How about this:

    trait MyActor extends Actor { self: MyTrait =>
      def receive = {
        case SayHello => printHelloWorld
      }
    }

    class MyMainActor extends MyActor with MyTrait

    "My Actor" should {
      "println hello msg if SayHello sent" in {
        class MockActor extends MyActor with SomeMyTrait
        val x = new MockActor
        val myActor = system.actorOf(Props(x))
        myActor ! SayHello
        Thr

Categories : Scala

type mismatch; found : play.api.data.Form[controllers.Application.Userdata] required: play.api.data.Form[(String, String)]
A few points to note: put your case class Userdata outside the Application controller, that is, move it into the controllers package. Also, you can add @import at the beginning of your template to import any arbitrary package or class. Try this. Application.scala: only one Userdata case class definition (you had two of these classes defined):

    import play.api._
    import play.api.mvc._
    import play.

Categories : Scala

Random as instance of scalaz.Monad
Here's a simple example. Suppose I want to pick a random size for a range, and then pick a random index inside that range, and then return both the range and the index. The second computation of a random value clearly depends on the first—I need to know the size of the range in order to pick a value in the range. This kind of thing is specifically what monadic binding is for—it allows you to
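As a plain-Scala sketch of that dependency (no scalaz here, and the function names are invented):

    import scala.util.Random

    def pickSize(rng: Random): Int             = 1 + rng.nextInt(100)  // size of the range
    def pickIndex(rng: Random, size: Int): Int = rng.nextInt(size)     // needs the size!

    val rng   = new Random()
    val size  = pickSize(rng)
    val index = pickIndex(rng, size)
    // A Monad instance for a random-value type sequences exactly this shape:
    // pickSize.flatMap(size => pickIndex(size).map(index => (size, index)))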

Categories : Scala

why yield can not work with while loop in scala
Because a while loop is the equivalent of a Java while loop, whereas the for loop is translated into a function call: <IndexedSeq>.map (if you use yield) or <IndexedSeq>.foreach (if you don't care about the result). Example Scala code:

    class ForVsWhileLoop {
      val dummy = for (i <- 1 to 10) yield i

      var dummy2 = Seq.empty[Int]
      var i = 0
      while (i <= 10) {
        dummy2 :+= i
        i += 1  // without this the loop would never terminate
      }
    }

Compiles to (sca

Categories : Scala

SBT sub-sub projects don't work?
How do you use sbt: through an IDE or the command line? I had problems with my multi-module project in IntelliJ, but recent updates resolved my problem. The updates included IntelliJ itself (from 13.3 to 14.0) and the Scala plugin to version 1.1. When I ran my project outside the IDE everything worked smoothly, but when I switched to IntelliJ it was auto-creating modules and Scala facets, and wh

Categories : Scala

Let build.sbt define dependency on another local library
I tend to prefer a build definition in project/Build.scala instead of build.sbt, but the following code within the Build object should also work for a standard build.sbt:

    //Build.scala
    import sbt._

    object Build extends Build {
      lazy val projectA = project.in(file("a"))
      lazy val projectB = project.in(file("b")).dependsOn(projectA)
    }

or:

    //your root build.sbt
    name := "test"

    version := "1.0

Categories : Scala

How to define a trait with methods accepting any subtype of a particular trait
Try this:

    trait AppModel {}

    trait ModelOperations {
      def get[T <: AppModel](model: T): Option[T]
      def create[T <: AppModel](model: T): Boolean
    }

    class User extends AppModel {
      val id = "xyz"
      val name = "abc"
    }

    class UserOperations extends ModelOperations {
      override def get[User](user: User): Option[User] = {
        // get a new user object by filtering existing parameters
        return None

Categories : Scala

Json4s: Trouble while trying to convert Json attribute to java.sql.Date
Something along these lines (you might want to be more specific with formats), and then mix this trait into the classes which need access to this custom serializer:

    import org.json4s._

    trait JsonFormats {
      case object DateSerializer extends CustomSerializer[java.sql.Date](format => (
        {
          case JString(s) => Date.valueOf(s)
          case JNull => null
        },

Categories : Scala

How to extract result from org.elasticsearch.search.suggest.Suggest object on scala
I got the working solution from this link: https://groups.google.com/forum/#!topic/elasticsearch/FwRv0D5qIi8 The solution is:

    import org.elasticsearch.node.NodeBuilder._
    import org.elasticsearch.search.suggest.completion.CompletionSuggestionBuilder
    import org.elasticsearch.search.suggest.completion.CompletionSuggestion

    // Build query for searching the client.
    var suggeste

Categories : Scala

Mock a method from implicit class in specs2
When you write project.searchFile, searchFile is not a method which belongs to the mocked object but to the RichProject class, so Mockito cannot mock it and will try to execute it. I don't think there is a fix for this other than mocking the RichProject class itself.
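A sketch of why the mock is bypassed, using stand-in types inside a specs2 specification:

    import org.specs2.mutable.Specification
    import org.specs2.mock.Mockito

    class SearchSpec extends Specification with Mockito {
      class Project
      implicit class RichProject(val p: Project) {
        def searchFile(name: String): Option[String] = Some(name)  // the real logic lives here
      }

      "an extension method" should {
        "not be intercepted by the mock" in {
          val project = mock[Project]
          // project.searchFile(...) desugars to new RichProject(project).searchFile(...),
          // so the real RichProject body runs; the mock is only a constructor argument.
          project.searchFile("build.sbt") must beSome("build.sbt")
        }
      }
    }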

Categories : Scala

Sequencing Scala Futures with bounded parallelism (without messing around with ExecutorContexts)
I have an example of how to do it with scalaz-stream. It's quite a lot of code because you need to convert a Scala Future to a scalaz Task (an abstraction for deferred computation); however, you only need to add that to the project once. Another option is to use Task for defining doWork. I personally prefer Task for building async programs.

    import scala.concurrent.{Future => SFuture}
    import scala.util.

Categories : Scala

Array of elements of compound type
What is happening exactly behind the scenes here? Are the compound types actually instances of the first type? Looking at the newArray method as defined for ClassTag:

    override def newArray(len: Int): Array[T] = runtimeClass match {
      case java.lang.Byte.TYPE => new Array[Byte](len).asInstanceOf[Array[T]]
      case java.lang.Integer.TYPE => new Array[Int](len).asInstanceOf[Array

Categories : Scala

Can I tell scala.xml to match any of two tags?
If you take a look at the source for \ in NodeSeq.scala, you can see that it's really just a bit of sugar for a filter operation over descendant_or_self, which is a List[Node], using the node's label. So you could do the same thing yourself, matching against a set of labels, like this:

    val searchedLabels = Set("p", "div")
    val results = body.descendant_or_self.filter(node => searchedLabels.contains(node.label))

Categories : Scala

dsl for capturing field name
The field method takes a T => Unit function as a parameter. Hence the lambda _.age, which is equivalent to x => x.age, is typechecked as returning Unit. The compiler warns that you are using a pure expression (x.age) in statement position (expected type Unit), which basically means that the expression is useless and might as well be removed. There is a very simple symptomatic solution to yo

Categories : Scala

In pattern matching, can I use the matched pattern as is?
If I understand your question correctly, you want to use the heading itself in place of ???; this can be done using the @ pattern:

    case first :: rest => first match {
      case head @ Heading(_, _) => buildPairsAcc(rest, acc, head)
      case Paragraph(_) => // ... other cases

Note that this can be used on everything that will pattern match, including lists:

    case lst @ head :: tail => // do s

Categories : Scala

Typecheck when mapping some items in a sequence to None
Use collect instead of map to keep only the elements you want:

    val elements = parent.child.collect {
      case n if n.label == "p" => Paragraph(n)
      case n if n.label.matches("hd") => Heading(n, n.child.head)
    }

Anything that isn't defined within the PartialFunction you pass to collect is discarded. There's no reason to map to None if you're just going to discard the elements anyway.

Categories : Scala

