Easiest Way To Map Optional Nested Case Class with Slick in Scala


A few days ago, I had a scenario in which I needed to map an optional nested case class in Slick using Scala.

case class Employee(emdId: String, name: String, record: Option[Record])
case class Record(subject: String, mark: Int)

I first tried to do the mapping in the way explained below:

class EmployeeSlickMapping(tag: Tag) extends Table[Employee](tag, "Employee") {
  def emdId = column[String]("emdId")
  def name = column[String]("name")
  def subject = column[String]("subject", O.Nullable)
  def mark = column[Int]("mark", O.Nullable)
  def record = (subject, mark) <> (Record.tupled, Record.unapply)
  def * = (emdId, name, record) <> (Employee.tupled, Employee.unapply)
}

But I was getting the compilation error below:

Multiple markers at this line
    - No matching Shape found. Slick does not know how to map the given types. Possible causes: T in Table[T] does not match your * projection. Or you use an unsupported
     type in a Query (e.g. scala List). Required level: scala.slick.lifted.FlatShapeLevel Source type: (scala.slick.lifted.Column[String], scala.slick.lifted.Column[String],
     scala.slick.lifted.MappedProjection[platform3.models.mail.Record,(String, Int)]) Unpacked type: (String, String, Option[platform3.models.mail.Record]) Packed type: Any
    - not enough arguments for method <>: (implicit evidence$2: scala.reflect.ClassTag[platform3.models.mail.Employee], implicit shape: scala.slick.lifted.Shape[_ <:
     scala.slick.lifted.FlatShapeLevel, (scala.slick.lifted.Column[String], scala.slick.lifted.Column[String], scala.slick.lifted.MappedProjection[platform3.models.mail.Record,
     (String, Int)]), (String, String, Option[platform3.models.mail.Record]), _])scala.slick.lifted.MappedProjection[platform3.models.mail.Employee,(String, String,
     Option[platform3.models.mail.Record])]. Unspecified value parameter shape.
    - No matching Shape found. Slick does not know how to map the given types. Possible causes: T in Table[T] does not match your * projection. Or you use an unsupported
     type in a Query (e.g. scala List). Required level: scala.slick.lifted.FlatShapeLevel Source type: (scala.slick.lifted.Column[String], scala.slick.lifted.Column[String],
     scala.slick.lifted.MappedProjection[platform3.models.mail.Record,(String, Int)]) Unpacked type: (String, String, Option[platform3.models.mail.Record]) Packed type: Any
    - Implicit conversions found: => anyToToShapedValue()
    - not enough arguments for method <>: (implicit evidence$2: scala.reflect.ClassTag[platform3.models.mail.Employee], implicit shape: scala.slick.lifted.Shape[_ <:
     scala.slick.lifted.FlatShapeLevel, (scala.slick.lifted.Column[String], scala.slick.lifted.Column[String], scala.slick.lifted.MappedProjection[platform3.models.mail.Record,
     (String, Int)]), (String, String, Option[platform3.models.mail.Record]), _])scala.slick.lifted.MappedProjection[platform3.models.mail.Employee,(String, String,
     Option[platform3.models.mail.Record])]. Unspecified value parameter shape.

After beating my head against it for two days, I found the cause and a solution. The record projection maps to a plain Record, while the Employee case class expects an Option[Record], so Slick cannot find a matching Shape; the fix is to lift the two columns to options with .? and add a custom mapping:

class EmployeeSlickMapping(tag: Tag) extends Table[Employee](tag, "Employee") {
  def emdId = column[String]("emdId")
  def name = column[String]("name")
  def subject = column[String]("subject")
  def mark = column[Int]("mark")
  // Lift both columns to options: map (Some, Some) to Some(Record), anything else to None.
  def record = (subject.?, mark.?).<>[Option[Record], (Option[String], Option[Int])](
    {
      case (Some(subject), Some(mark)) => Some(Record(subject, mark))
      case _                           => None
    },
    rec => Some((rec.map(_.subject), rec.map(_.mark)))
  )
  def * = (emdId, name, record) <> (Employee.tupled, Employee.unapply)
}

It worked, and now I am able to compile and run my code.
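
For completeness, here is a minimal usage sketch, assuming Slick 2.x (as in the error messages above) and an existing "Employee" table whose subject and mark columns are nullable; the connection values are placeholders:

import scala.slick.driver.MySQLDriver.simple._

val employees = TableQuery[EmployeeSlickMapping]

Database.forURL("jdbc:mysql://localhost/test", driver = "com.mysql.jdbc.Driver") withSession {
  implicit session =>
    employees += Employee("emp1", "John", Some(Record("Maths", 95)))
    employees += Employee("emp2", "Jane", None) // subject and mark are stored as NULLs
    val all: List[Employee] = employees.list    // record comes back as None when either column is NULL
}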


Introduction to Play Framework


In this presentation, I have discussed some important features of Play Framework.


Scala in Business | Knoldus Newsletter – October 2014


Hello Folks

This time I got a bit late due to the festival week in India, but I have some interesting stuff for you.

We are back again with the October 2014 newsletter: Scala in Business | Knoldus Newsletter – October 2014.

In this newsletter you will get to know how organizations are benefiting from the Typesafe Reactive Platform, how Akka helps build scalable and fault-tolerant applications, and how Spark processes data faster than Hadoop.

So, if you haven’t subscribed to the newsletter yet, then hurry up and click Subscribe Monthly Scala News Letter.



Knolx Session : Design Principles for Mobiles


In this presentation I have explained the principles and elements we have to take care of during website design, mostly for the mobile view.


SBT-dependency tree


In this blog, I am going to describe how to view an sbt dependency tree. Last week I had a problem related to a wrong cross version of a dependency. I knew the cause of the problem, but I spent a day finding out which dependency had brought in that cross version. After some study and browsing about the problem, I came across an sbt plugin as a potential solution.
In a project, there is a chance that multiple dependencies use the same library in different versions. I too was a victim of a dependency version conflict. A good way out is to draw an sbt dependency tree, and the sbt plugin sbt-dependency-graph is available for exactly that.

Following are the steps to install and use sbt-dependency-graph:
a) Add the plugin to project/plugins.sbt:

addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

b) Add the sbt setting to build.sbt:

net.virtualvoid.sbt.graph.Plugin.graphSettings

If the project is multi-module, then add the settings to Parent.scala:

object Parent extends Build {
  lazy val parent = Project(id = "parent", base = file("."))
    .settings(net.virtualvoid.sbt.graph.Plugin.graphSettings: _*)
}

Now run the sbt command:

$ sbt dependency-tree
+-org.jsoup:jsoup:0.2.2
[info]   | +-commons-lang:commons-lang:2.4
[info]   |
[info]   +-org.scala-lang:scala-library:2.11.0 (evicted by: 2.11.2)
[info]   +-org.scala-lang:scala-library:2.11.1 (evicted by: 2.11.2)
[info]   +-org.scalaz:scalaz-core_2.11:7.0.6 [S]
[info]     +-org.scala-lang.modules:scala-parser-combinators_2.11:1.0.1 [S]
[info]     | +-org.scala-lang:scala-library:2.11.0 (evicted by: 2.11.2)
[info]     | +-org.scala-lang:scala-library:2.11.1 (evicted by: 2.11.2)
[info]     |
[info]     +-org.scala-lang.modules:scala-xml_2.11:1.0.1 (evicted by: 1.0.2)
[info]     +-org.scala-lang.modules:scala-xml_2.11:1.0.2 [S]
[info]     | +-org.scala-lang:scala-library:2.11.0 (evicted by: 2.11.2)
[info]     | +-org.scala-lang:scala-library:2.11.1 (evicted by: 2.11.2)
[info]     |
[info]     +-org.scala-lang:scala-library:2.11.0 (evicted by: 2.11.2)

or 

$ sbt dependency-graph

The plugin also provides many other options; for more info, see the sbt-dependency-graph documentation.
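
Once the tree shows which version of a library wins and which ones are evicted, a common way to resolve the conflict is to pin the version explicitly in build.sbt (a minimal sketch, assuming sbt 0.13; the coordinates are just an example taken from the tree above):

// Force every transitive pull of scala-library to resolve to 2.11.2
dependencyOverrides += "org.scala-lang" % "scala-library" % "2.11.2"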

Enjoy your sbt projects!


Solution for Riak 500 Internal Server Error


I am new to Riak and learning it. A few days ago, I got a 500 Internal Server Error exception while inserting data into a Riak bucket. It was weird because I was able to insert the same data into a different bucket.
I tried to find the root cause but didn’t succeed. After beating my head against it for a whole day, I posted the issue on the Riak forum and Stack Overflow. I got a response that the issue was related to a precommit hook, but I still didn’t have a solution.

I hadn’t made any change to the Riak settings, so I was not able to understand how this precommit hook had come to be defined on the bucket.

I tried to overwrite the precommit property of my bucket, like this:

curl http://127.0.0.1:8098/riak/abc-client -X PUT -H "Content-Type: application/json" -d '{"props":{"precommit":[]}}'

I tried to insert the data again and got the same error:

[error] ! step error
[error]   RiakRetryFailedException: com.basho.riak.client.http.response.RiakResponseRuntimeException: 500 Internal Server Error<h1>Internal Server Error</h1>The server encountered an error while processing this request:<br><pre>{error,
[error]     {error,badarg,
[error]         [{erlang,iolist_to_binary,
[error]              [{hook_crashed,{riak_search_kv_hook,precommit,error,badarg}}],
[error]              []},
[error]          {wrq,append_to_response_body,2,[{file,"src/wrq.erl"},{line,215}]},
[error]          {riak_kv_wm_object,handle_common_error,3,
[error]              [{file,"src/riak_kv_wm_object.erl"},{line,1144}]},
[error]          {webmachine_resource,resource_call,3,
[error]              [{file,"src/webmachine_resource.erl"},{line,186}]},
[error]          {webmachine_resource,do,3,
[error]              [{file,"src/webmachine_resource.erl"},{line,142}]},
[error]          {webmachine_decision_core,resource_call,1,
[error]              [{file,"src/webmachine_decision_core.erl"},{line,48}]},
[error]          {webmachine_decision_core,accept_helper,1,
[error]              [{file,"src/webmachine_decision_core.erl"},{line,612}]},
[error]          {webmachine_decision_core,decision,1,
[error]              [{file,"src/webmachine_decision_core.erl"},{line,580}]}]}}</pre><P><HR><ADDRESS>mochiweb+webmachine web server</ADDRESS> (DefaultRetrier.java:81)

I found one more solution. I added a file named advanced.config in the directory where riak.conf is located, with the property below:

[
    {riak_search, [{enabled, true}]}
].

I restarted Riak, and this time my problem was resolved. Now I am able to add data to the bucket successfully.
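
For completeness, a minimal sketch of the insert that now succeeds, assuming the riak-java-client 1.x HTTP API (the same client that raised the RiakRetryFailedException above); the bucket name, key, and value are placeholders:

import com.basho.riak.client.RiakFactory

object RiakInsertCheck extends App {
  // Connect over the same HTTP interface used by the curl command above.
  val client = RiakFactory.httpClient("http://127.0.0.1:8098/riak")
  val bucket = client.fetchBucket("abc-client").execute()
  bucket.store("some-key", """{"name": "test"}""").execute() // no more 500 error
  client.shutdown()
}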


Knolx Session : Introduction To AGILE


In this presentation I am going to elaborate on the basics of the Agile process and one of its methodologies, Scrum.


Tutorial: AJAX calling in Play Framework 2.3.4


In this tutorial we will discuss the following topics of AJAX calling in Play Framework 2.3.4:

  1. Generating a Javascript router
    1. Embedded router (see the sketch after this list)
    2. Router resource
  2. Use of Javascript router
    1. jQuery Ajax
    2. Success/Error handler for each router
    3. Single Success/Error handler for all routers
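
As a taste of the embedded-router approach, here is a minimal sketch, assuming a Play 2.3 Scala application; the Application controller and its getUser action are hypothetical names:

package controllers

import play.api.Routes
import play.api.mvc.{Action, Controller}

object Application extends Controller {

  // Serves a Javascript router that exposes selected server-side routes to the client.
  // Wire it up in conf/routes as: GET /javascriptRoutes controllers.Application.javascriptRoutes
  def javascriptRoutes = Action { implicit request =>
    Ok(
      Routes.javascriptRouter("jsRoutes")(
        routes.javascript.Application.getUser // the hypothetical action to expose
      )
    ).as("text/javascript")
  }
}

On the client side, jQuery can then call jsRoutes.controllers.Application.getUser(id).ajax({ success: ..., error: ... }), which is where the per-router and shared success/error handlers from the list above come in.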

Continue reading


Tutorial: Post Update on LinkedIn via Scribe using Scala


To post an update on LinkedIn via your application built with Play 2.3.x, follow these steps (this post summarizes the work done step by step).

1) Create a LinkedIn app (if you do not have one already)

Go to LinkedIn’s Developer Quick Start Guide and create an app. Enter all the details, including the Site URL. The Site URL could be something like http://www.example.com (but it should be a valid site URL). Also, select the rw_nus setting under the OAuth User Agreement section (otherwise you won’t be able to post updates on LinkedIn).

Once you have saved the app, you will get an API Key & Secret Key. Take note of these; we will use them in our code.

2) If you are using Play, add the API Key, Secret Key & Context URL (Callback URL) to the application.conf file:

linkedin.key=<your_key>
linkedin.secret=<your_secret_key>
contextURL="localhost:9000"

3) Download a LinkedIn login image (linkedin.png) and save it in the “public/images” folder of the app.

4) Add the following dependency to build.sbt (or Build.scala for Play 2.1.x or older versions):

"org.scribe" % "scribe" % "1.3.5"

5) Add the following to the routes file in the conf folder:

GET  /linkedin/login     controllers.LinkedInAPIController.linkedinLogin
GET  /linkedin/callback  controllers.LinkedInAPIController.linkedinCallback
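
Before moving on, here is a minimal sketch of the controller behind these routes, assuming Scribe 1.3.5 and Play 2.3; the single shared request-token variable is a simplification (a real app would keep it per user, e.g. in the session):

package controllers

import org.scribe.builder.ServiceBuilder
import org.scribe.builder.api.LinkedInApi
import org.scribe.model.{Token, Verifier}
import play.api.Play.current
import play.api.mvc.{Action, Controller}

object LinkedInAPIController extends Controller {

  // Build the Scribe OAuth service from the keys added in step 2.
  private lazy val config = current.configuration
  private lazy val service = new ServiceBuilder()
    .provider(classOf[LinkedInApi])
    .apiKey(config.getString("linkedin.key").get)
    .apiSecret(config.getString("linkedin.secret").get)
    .callback("http://" + config.getString("contextURL").get + "/linkedin/callback")
    .build()

  private var requestToken: Token = _ // simplification: one token for the whole app

  def linkedinLogin = Action {
    requestToken = service.getRequestToken
    Redirect(service.getAuthorizationUrl(requestToken))
  }

  def linkedinCallback = Action { request =>
    val verifier = new Verifier(request.getQueryString("oauth_verifier").get)
    val accessToken = service.getAccessToken(requestToken, verifier)
    Ok("LinkedIn access token acquired; ready to post updates.")
  }
}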

Continue reading


SCALA : Overriding the Default Numeric Type


In Scala, there are mainly seven numeric types: Byte, Char, Double, Float, Int, Long, and Short. All numeric types are objects.

The data ranges of Scala’s built-in numeric types are as follows:

Type    Range
Byte    8-bit signed value: -128 to 127
Char    16-bit unsigned Unicode character: 0 to 65,535
Short   16-bit signed value: -32,768 to 32,767
Int     32-bit signed value: -2,147,483,648 to 2,147,483,647
Long    64-bit signed value: -2^63 to 2^63 - 1
Float   32-bit IEEE 754 single-precision float
Double  64-bit IEEE 754 double-precision float

That was a brief introduction to the numeric data types.
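
As a quick taste of the topic in the title: integer literals default to Int and floating-point literals default to Double, and the default can be overridden with an explicit type annotation or a literal suffix:

val a = 1          // Int: the default type for integer literals
val b = 1.0        // Double: the default type for floating-point literals
val c: Byte = 1    // default overridden by a type annotation
val d: Short = 100
val e = 1L         // Long, via the literal suffix L
val f = 1.5F       // Float, via the literal suffix F
val g: Char = 'A'  // a 16-bit unsigned Unicode character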

Continue reading
