Scala in Business | Knoldus Newsletter – November 2014


Hello Folks

We are back again with the November 2014 newsletter: Scala in Business | Knoldus Newsletter – November 2014.

In this newsletter, you will read about the growing popularity of Spark in big data. Spark is attracting attention all over the world for fast data processing, and teams are reporting significantly faster processing speeds with it. You will also find some Scala best practices and much more.

So, if you haven’t subscribed to the newsletter yet, hurry up and click on Subscribe Monthly Scala News Letter.


Posted in Agile, Akka, Amazon EC2, Cassandra, Clojure, Cloud, Future, Java, LiftWeb, MongoDB, News, Node.js, Non-Blocking, NoSql, Play Framework, Reactive, Scala, Spark, Tutorial, Web

Building Reactive applications with Akka


Knoldus organized a meetup on Wednesday, 29 Oct 2014 at 5:00 PM. Mr. Nilanjan Raychaudhuri from Typesafe presented a session on “Building Reactive applications with Akka”.
He presented remotely from Germany, and the session was very well received by the audience at the meetup.

Nilanjan is a consultant/trainer and a member of the Play framework team at Typesafe. He has more than 14 years of experience managing and developing software solutions in Java, Ruby, Groovy and Scala. He has been zealous about programming in Scala ever since he was introduced to this beautiful language. He enjoys sharing his experience through talks at various conferences, and he is also the author of the book “Scala in Action”.

We would like to thank Nilanjan for his time and his valuable insights.

Check out this presentation to find out how Akka helps you build Reactive applications.

Posted in Akka, Reactive, Scala, Tutorial

Easiest Way To Map Optional Nested Case Class with Slick in Scala


A few days ago, I had a scenario in which I was supposed to map an optional nested case class with Slick in Scala.

case class Employee(emdId: String, name: String, record: Option[Record])
case class Record(subject: String, mark: Int)

At first, I tried to define the mapping as shown below.

class EmployeeSlickMapping(tag: Tag) extends Table[Employee](tag, "Employee") {
  def emdId = column[String]("emdId")
  def name = column[String]("name")
  def subject = column[String]("subject", O.Nullable)
  def mark = column[Int]("mark", O.Nullable)
  def record = (subject, mark) <> (Record.tupled, Record.unapply)
  def * = (emdId, name, record) <> (Employee.tupled, Employee.unapply)
}

But I got the following compilation error:

Multiple markers at this line
    - No matching Shape found. Slick does not know how to map the given types. Possible causes: T in Table[T] does not match your * projection. Or you use an unsupported
     type in a Query (e.g. scala List). Required level: scala.slick.lifted.FlatShapeLevel Source type: (scala.slick.lifted.Column[String], scala.slick.lifted.Column[String],
     scala.slick.lifted.MappedProjection[platform3.models.mail.Record,(String, Int)]) Unpacked type: (String, String, Option[platform3.models.mail.Record]) Packed type: Any
    - not enough arguments for method <>: (implicit evidence$2: scala.reflect.ClassTag[platform3.models.mail.Employee], implicit shape: scala.slick.lifted.Shape[_ <:
     scala.slick.lifted.FlatShapeLevel, (scala.slick.lifted.Column[String], scala.slick.lifted.Column[String], scala.slick.lifted.MappedProjection[platform3.models.mail.Record,
     (String, Int)]), (String, String, Option[platform3.models.mail.Record]), _])scala.slick.lifted.MappedProjection[platform3.models.mail.Employee,(String, String,
     Option[platform3.models.mail.Record])]. Unspecified value parameter shape.
    - No matching Shape found. Slick does not know how to map the given types. Possible causes: T in Table[T] does not match your * projection. Or you use an unsupported
     type in a Query (e.g. scala List). Required level: scala.slick.lifted.FlatShapeLevel Source type: (scala.slick.lifted.Column[String], scala.slick.lifted.Column[String],
     scala.slick.lifted.MappedProjection[platform3.models.mail.Record,(String, Int)]) Unpacked type: (String, String, Option[platform3.models.mail.Record]) Packed type: Any
    - Implicit conversions found: => anyToToShapedValue()
    - not enough arguments for method <>: (implicit evidence$2: scala.reflect.ClassTag[platform3.models.mail.Employee], implicit shape: scala.slick.lifted.Shape[_ <:
     scala.slick.lifted.FlatShapeLevel, (scala.slick.lifted.Column[String], scala.slick.lifted.Column[String], scala.slick.lifted.MappedProjection[platform3.models.mail.Record,
     (String, Int)]), (String, String, Option[platform3.models.mail.Record]), _])scala.slick.lifted.MappedProjection[platform3.models.mail.Employee,(String, String,
     Option[platform3.models.mail.Record])]. Unspecified value parameter shape.

After beating my head against it for two days, I found a solution by adding a custom mapping.

class EmployeeSlickMapping(tag: Tag) extends Table[Employee](tag, "Employee") {
  def emdId = column[String]("emdId")
  def name = column[String]("name")
  def subject = column[String]("subject")
  def mark = column[Int]("mark")
  def record = (subject.?, mark.?).<>[Option[Record], (Option[String], Option[Int])](
    {
      case (Some(subject), Some(mark)) => Some(Record(subject, mark))
      case _                           => None
    },
    rec => Some((rec.map(_.subject), rec.map(_.mark))))
  def * = (emdId, name, record) <> (Employee.tupled, Employee.unapply)
}

It worked, and my code now compiles and runs.
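To see why this mapping works, it helps to look at the two conversion functions in isolation: `subject.?` and `mark.?` lift the columns to options, and the mapping converts between an `(Option[String], Option[Int])` row and an `Option[Record]`. Here is a minimal plain-Scala sketch of that logic, outside of Slick (the standalone function names `toRecord` and `fromRecord` are mine, for illustration only):

```scala
case class Record(subject: String, mark: Int)

// Database tuple -> optional case class: build a Record only when
// both columns are non-null.
def toRecord(row: (Option[String], Option[Int])): Option[Record] = row match {
  case (Some(subject), Some(mark)) => Some(Record(subject, mark))
  case _                           => None
}

// Optional case class -> database tuple: a missing Record becomes
// two NULL columns.
def fromRecord(rec: Option[Record]): (Option[String], Option[Int]) =
  (rec.map(_.subject), rec.map(_.mark))

println(toRecord((Some("Maths"), Some(90)))) // Some(Record(Maths,90))
println(toRecord((Some("Maths"), None)))     // None
println(fromRecord(None))                    // (None,None)
```

In other words, a Record is materialized only when both columns hold a value, and writing back a None produces two NULL columns.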

Posted in Java, Scala

Introduction to Play Framework


In this presentation, I have discussed some important features of Play Framework.

Posted in Scala

Scala in Business | Knoldus Newsletter – October 2014


Hello Folks

This time I am a bit late due to the festival week in India, but I have some interesting stuff for you.

We are back again with the October 2014 newsletter: Scala in Business | Knoldus Newsletter – October 2014.

In this newsletter, you will learn how organizations are benefiting from the Typesafe Reactive Platform, how Akka helps build scalable and fault-tolerant applications, and how Spark processes data faster than Hadoop.

So, if you haven’t subscribed to the newsletter yet, hurry up and click on Subscribe Monthly Scala News Letter.


Posted in Java, News, Cloud, LiftWeb, Akka, Node.js, MongoDB, JavaScript, Non-Blocking, Future, Cassandra, NoSql

Knolx Session : Design Principles for Mobiles


In this presentation, I have explained the principles and elements we need to take care of when designing a website, especially for mobile view.

Posted in Scala

SBT-dependency tree


In this blog, I am going to describe how to view an sbt dependency tree. Last week I had a problem related to a cross version of a dependency. I knew the cause of the problem, but I spent a day figuring out which dependency had brought in that cross version. After some study and browsing, I came across an sbt plugin as a potential solution.
In a project, there is a chance that multiple dependencies pull in the same library in different versions; I too was a victim of such a dependency version conflict. A good way to diagnose it is to draw the sbt dependency tree, and the sbt-dependency-graph plugin is available for exactly that.

Following are the steps to install and use sbt-dependency-graph:
a) Add the plugin to project/plugins.sbt:

addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

b) Add the sbt setting to build.sbt:

net.virtualvoid.sbt.graph.Plugin.graphSettings

If the project is multi-module, then add the settings in Parent.scala:

object Parent extends Build {
  lazy val parent = Project(id = "parent", base = file("."))
    .settings(net.virtualvoid.sbt.graph.Plugin.graphSettings: _*)
}

Now run the sbt command:

$ sbt dependency-tree
+-org.jsoup:jsoup:0.2.2
[info]   | +-commons-lang:commons-lang:2.4
[info]   |
[info]   +-org.scala-lang:scala-library:2.11.0 (evicted by: 2.11.2)
[info]   +-org.scala-lang:scala-library:2.11.1 (evicted by: 2.11.2)
[info]   +-org.scalaz:scalaz-core_2.11:7.0.6 [S]
[info]     +-org.scala-lang.modules:scala-parser-combinators_2.11:1.0.1 [S]
[info]     | +-org.scala-lang:scala-library:2.11.0 (evicted by: 2.11.2)
[info]     | +-org.scala-lang:scala-library:2.11.1 (evicted by: 2.11.2)
[info]     |
[info]     +-org.scala-lang.modules:scala-xml_2.11:1.0.1 (evicted by: 1.0.2)
[info]     +-org.scala-lang.modules:scala-xml_2.11:1.0.2 [S]
[info]     | +-org.scala-lang:scala-library:2.11.0 (evicted by: 2.11.2)
[info]     | +-org.scala-lang:scala-library:2.11.1 (evicted by: 2.11.2)
[info]     |
[info]     +-org.scala-lang:scala-library:2.11.0 (evicted by: 2.11.2)

or 

$ sbt dependency-graph

The plugin provides many more options; for more information, see sbt-dependency-graph.

Enjoy your sbt projects!

Posted in Scala

Solution for Riak 500 Internal Server Error


I am new to Riak and still learning it. A few days ago, I got a 500 Internal Server Error while inserting data into a Riak bucket. It was weird, because I was able to insert the same data into a different bucket.
I tried to find the root cause, but didn’t succeed. After beating my head against it for a whole day, I posted the issue on the Riak forum and Stack Overflow. I got a response that the issue was related to a precommit hook, but I still didn’t have a solution.

I had not made any change to the Riak settings, so I could not understand how this precommit hook had been defined on the bucket.

I tried to overwrite the precommit property of my bucket, like this:

curl http://127.0.0.1:8098/riak/abc-client -X PUT -H "Content-Type: application/json" -d '{"props":{"precommit":[]}}'

I tried to insert data again and got the same error:

[error] ! step error
[error]   RiakRetryFailedException: com.basho.riak.client.http.response.RiakResponseRuntimeException: 500 Internal Server Error<h1>Internal Server Error</h1>The server encountered an error while processing this request:<br><pre>{error,
[error]     {error,badarg,
[error]         [{erlang,iolist_to_binary,
[error]              [{hook_crashed,{riak_search_kv_hook,precommit,error,badarg}}],
[error]              []},
[error]          {wrq,append_to_response_body,2,[{file,"src/wrq.erl"},{line,215}]},
[error]          {riak_kv_wm_object,handle_common_error,3,
[error]              [{file,"src/riak_kv_wm_object.erl"},{line,1144}]},
[error]          {webmachine_resource,resource_call,3,
[error]              [{file,"src/webmachine_resource.erl"},{line,186}]},
[error]          {webmachine_resource,do,3,
[error]              [{file,"src/webmachine_resource.erl"},{line,142}]},
[error]          {webmachine_decision_core,resource_call,1,
[error]              [{file,"src/webmachine_decision_core.erl"},{line,48}]},
[error]          {webmachine_decision_core,accept_helper,1,
[error]              [{file,"src/webmachine_decision_core.erl"},{line,612}]},
[error]          {webmachine_decision_core,decision,1,
[error]              [{file,"src/webmachine_decision_core.erl"},{line,580}]}]}}</pre><P><HR><ADDRESS>mochiweb+webmachine web server</ADDRESS> (DefaultRetrier.java:81)

Then I found one more solution: I added a file advanced.config, in the same directory where riak.conf is located, with the below property:

[
    {riak_search, [{enabled, true}]}
].

I restarted Riak, and this time my problem was resolved. Now I am able to add data to the bucket successfully.

Posted in MongoDB, NoSql

Knolx Session : Introduction To AGILE


In this presentation, I am going to elaborate on the basics of the Agile process and one of its methodologies, Scrum.

Posted in Agile

Tutorial: AJAX calling in Play Framework 2.3.4


In this tutorial, we will discuss the following topics of AJAX calling in Play Framework 2.3.4:

  1. Generating a Javascript router
    1. Embedded router
    2. Router resource
  2. Use of Javascript router
    1. jQuery Ajax
    2. Success/Error handler for each router
    3. Single Success/Error handler for all routers


Posted in AJAX, JavaScript, jQuery, Play Framework, Scala, Web